Comparative Analysis of Deep Learning Architectures for Indonesian Spice Image Classification

Fatma Meylinda Putri, Muhammad Imam Ghozali, Wibowo Harry Sugiharto

Abstract


Spices are an important commodity in Indonesia; however, visual identification remains challenging because many spice types look alike. This study develops an image classification system for Indonesian spices using transfer learning, comparing four Convolutional Neural Network (CNN) architectures: VGG16, ResNet50, EfficientNetB0, and MobileNetV2. The Indonesian Spices dataset comprises 31 classes with 6,510 images in total, split into training, validation, and test sets using stratified sampling. Training proceeded in two stages, head-layer training followed by fine-tuning, with dropout, batch normalization, and L2 regularization applied. ResNet50 achieved the best test accuracy at 95.80%, followed by VGG16 at 95.70%. EfficientNetB0 offered the best balance between accuracy (94.17%) and inference time (5.51 ms, the fastest), while MobileNetV2 reached 92.63% accuracy with a 6.07 ms inference time, making it well suited to mobile devices. These results demonstrate the effectiveness of transfer learning for Indonesian spice image classification.
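The two-stage procedure described above (training a new classification head on a frozen pretrained backbone, then unfreezing for fine-tuning, with dropout, batch normalization, and L2 regularization) can be sketched in Keras as follows. This is a minimal illustration, not the authors' exact pipeline: the layer sizes, dropout rate, L2 coefficient, and learning rates are assumptions; only the class count (31) and the ResNet50 backbone come from the abstract.

```python
# Sketch of two-stage transfer learning for a 31-class spice classifier.
# Hyperparameters are illustrative, not taken from the paper.
import tensorflow as tf
from tensorflow.keras import layers, regularizers

NUM_CLASSES = 31  # Indonesian Spices dataset


def build_model(weights="imagenet"):
    # Stage 1: frozen pretrained backbone + trainable classification head.
    base = tf.keras.applications.ResNet50(
        include_top=False, weights=weights, input_shape=(224, 224, 3))
    base.trainable = False

    inputs = tf.keras.Input(shape=(224, 224, 3))
    x = base(inputs, training=False)  # keep BN layers in inference mode
    x = layers.GlobalAveragePooling2D()(x)
    x = layers.BatchNormalization()(x)
    x = layers.Dropout(0.5)(x)
    outputs = layers.Dense(
        NUM_CLASSES, activation="softmax",
        kernel_regularizer=regularizers.l2(1e-4))(x)

    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
                  loss="categorical_crossentropy", metrics=["accuracy"])
    return model, base


def unfreeze_for_fine_tuning(model, base, lr=1e-5):
    # Stage 2: unfreeze the backbone and recompile with a much smaller
    # learning rate so the pretrained features are only gently adjusted.
    base.trainable = True
    model.compile(optimizer=tf.keras.optimizers.Adam(lr),
                  loss="categorical_crossentropy", metrics=["accuracy"])
    return model
```

In practice one would call `model.fit(...)` after `build_model`, then again after `unfreeze_for_fine_tuning`; keeping the fine-tuning learning rate small helps avoid erasing the transferred features.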

Keywords


image classification; Indonesian spices; transfer learning; convolutional neural network; deep learning






DOI: https://doi.org/10.32520/stmsi.v15i2.5946



This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.