Interpretable and Robust Ensemble Deep Learning Framework for Tea Leaf Disease Classification


Öztürk O., Şeker D. Z., Sarıca B.

Horticulturae, vol. 11, no. 4, pp. 437-461, 2025 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 11 Issue: 4
  • Publication Date: 2025
  • DOI: 10.3390/horticulturae11040437
  • Journal Name: Horticulturae
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Agricultural & Environmental Science Database, BIOSIS, CAB Abstracts, Food Science & Technology Abstracts, Directory of Open Access Journals
  • Page Numbers: pp. 437-461
  • Affiliated with Recep Tayyip Erdoğan University: Yes

Abstract

Tea leaf diseases are among the most critical factors affecting the yield and quality of tea harvests. Due to climate change and widespread pesticide use in tea cultivation, these diseases have become more prevalent. As the demand for high-quality tea continues to rise, tea has assumed an increasingly prominent role in the global economy, rendering the continuous monitoring of leaf diseases essential for maintaining crop quality and ensuring sustainable production. In this context, developing innovative and sustainable agricultural policies is vital, and integrating artificial intelligence (AI)-based techniques with sustainable agricultural practices offers promising solutions. Ensuring that the outputs of these techniques are interpretable also provides significant value for decision-makers, enhancing their applicability in sustainable agricultural practices. In this study, advanced deep learning architectures, namely ResNet50, MobileNet, EfficientNetB0, and DenseNet121, were utilized to classify tea leaf diseases. Because low-resolution images and complex backgrounds posed significant challenges, an ensemble learning approach was proposed to combine the strengths of these models. The generalization performance of the ensemble model was comprehensively evaluated through statistical cross-validation. The ensemble model achieved high predictive performance, with precision, recall, and F1-score values of 95%, 94%, and 94% across folds; the overall classification accuracy reached 96%, with a maximum standard deviation of 2% across all dataset folds. Additionally, Grad-CAM visualizations demonstrated a clear correspondence between diseased regions and specific disease types on tea leaves, confirming that the models detect diseases accurately under varying conditions and highlighting their robustness.
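
To illustrate the ensemble idea summarized in the abstract, the sketch below combines the four named backbones by averaging their softmax outputs (soft voting). The abstract does not state the fusion strategy, fine-tuning setup, number of disease classes, or input resolution, so NUM_CLASSES, IMG_SIZE, the classification head, and the averaging rule are assumptions made here purely for illustration.

```python
# Minimal soft-voting ensemble sketch in TensorFlow/Keras (an assumed fusion
# strategy, not necessarily the one used in the paper).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models, applications

NUM_CLASSES = 8        # hypothetical number of tea leaf disease classes
IMG_SIZE = (224, 224)  # common input resolution for these backbones

def build_classifier(backbone_fn):
    """Wrap an ImageNet-pretrained backbone with a small classification head."""
    base = backbone_fn(include_top=False, weights="imagenet",
                       input_shape=IMG_SIZE + (3,), pooling="avg")
    inputs = layers.Input(shape=IMG_SIZE + (3,))
    x = base(inputs, training=False)
    outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
    return models.Model(inputs, outputs)

# The four architectures named in the abstract; each family also has its own
# preprocess_input routine, omitted here for brevity.
members = [
    build_classifier(applications.ResNet50),
    build_classifier(applications.MobileNet),
    build_classifier(applications.EfficientNetB0),
    build_classifier(applications.DenseNet121),
]

def ensemble_predict(batch):
    """Average the members' softmax outputs (soft voting) and pick the argmax."""
    probs = np.mean([m.predict(batch, verbose=0) for m in members], axis=0)
    return probs.argmax(axis=1)
```

In practice each member would first be fine-tuned on the tea leaf dataset before its predictions are averaged.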
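Grad-CAM itself is a standard interpretability technique rather than a contribution of this paper; the compact sketch below shows how such heatmaps are typically computed for a Keras classifier. It assumes the model's final convolutional layer is reachable by name (`last_conv_name` is a placeholder); for nested backbones like those above, the gradients would be taken inside the backbone sub-model.

```python
# Minimal Grad-CAM sketch for a Keras model (illustrative only).
import tensorflow as tf

def grad_cam(model, image, last_conv_name, class_index=None):
    """Return a heatmap over the regions that drive the (predicted) class score."""
    grad_model = tf.keras.models.Model(
        model.inputs, [model.get_layer(last_conv_name).output, model.output])
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(image[None, ...])   # add batch dimension
        if class_index is None:
            class_index = int(tf.argmax(preds[0]))       # predicted class
        class_score = preds[:, class_index]
    grads = tape.gradient(class_score, conv_out)          # d score / d feature maps
    weights = tf.reduce_mean(grads, axis=(1, 2))          # global-average-pool grads
    cam = tf.reduce_sum(weights[:, None, None, :] * conv_out, axis=-1)[0]
    cam = tf.nn.relu(cam)                                 # keep positive evidence only
    cam = cam / (tf.reduce_max(cam) + 1e-8)               # normalize to [0, 1]
    return cam.numpy()
```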