Identification of Barangan Banana Ripeness Treatment Types using k-Nearest Neighbor

Abdullah Abdullah, Rendi Azrian


Bananas are popular because they are rich in nutrients the body needs, and the Barangan banana is among the varieties consumers favor most. Bananas sold in markets show varying degrees of ripeness depending on how they were treated. This study aims to identify the ripeness-treatment type of Barangan bananas by analyzing their images using color and texture features. The k-Nearest Neighbor (k-NN) method, which compares the similarity between unknown data and labeled sample data, is used for identification with k = 1, k = 3, and k = 5; Euclidean distance measures the distance between two feature vectors. Classification is evaluated with the holdout method, splitting the data into 66.67% training data and 33.33% test data. The resulting accuracy is 86.67% at k = 1, 76.67% at k = 3, and 80% at k = 5, so the best accuracy for identifying banana ripeness-treatment types with k-NN, 86.67%, is obtained at k = 1.
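The classification step described above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: the feature vectors and the treatment labels ("natural", "carbide") are hypothetical stand-ins for the color/texture features the study extracts, and majority voting over the k nearest neighbors under Euclidean distance is the standard k-NN rule the abstract describes.

```python
import math
from collections import Counter

def euclidean(a, b):
    # Euclidean distance between two feature vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train, query, k):
    # train: list of (feature_vector, label) pairs;
    # pick the k closest samples and return the majority label
    neighbors = sorted(train, key=lambda item: euclidean(item[0], query))[:k]
    labels = [label for _, label in neighbors]
    return Counter(labels).most_common(1)[0][0]

# Toy 2-D feature vectors with hypothetical treatment labels
train = [
    ([0.9, 0.1], "natural"),
    ([0.8, 0.2], "natural"),
    ([0.2, 0.9], "carbide"),
    ([0.3, 0.8], "carbide"),
]

print(knn_predict(train, [0.85, 0.15], 1))  # -> natural
```

With a holdout split as in the study, accuracy would simply be the fraction of test samples for which `knn_predict` returns the true label.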





