Comparison of XGBoost, Extra Trees, and LightGBM with SMOTE for Fetal Health Classification

Kartika Handayani, Badariatul Lailiah

Abstract


Cardiotocography (CTG) is widely used by obstetricians to assess the condition of the fetus during pregnancy. It provides the obstetrician with measurements of the fetal heart rate and uterine contractions, which help determine whether the fetus is pathological or not. This classification can be performed using machine learning methods. The dataset used in this research suffers from class imbalance. To overcome the imbalance, SMOTE is applied, and three classifiers are then compared: XGBoost, Extra Trees, and LightGBM. With SMOTE, the best accuracy (91.52%), recall (90.49%), and F1-score (89.12%) were produced by LightGBM, while the best precision (89.07%) and AUC (0.9800) were produced by Extra Trees.
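The pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the CTG dataset is replaced with a synthetic imbalanced stand-in, SMOTE is written out in plain NumPy (the paper would typically use the imbalanced-learn package), and only Extra Trees is trained here; XGBoost and LightGBM would slot into the same spot.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors

def smote(X_min, n_new, k=5, seed=0):
    # Minimal SMOTE: each synthetic point lies on the segment between a
    # random minority sample and one of its k nearest minority neighbors.
    rng = np.random.default_rng(seed)
    idx = NearestNeighbors(n_neighbors=k + 1).fit(X_min).kneighbors(X_min)[1]
    base = rng.integers(0, len(X_min), n_new)
    neigh = idx[base, rng.integers(1, k + 1, n_new)]  # column 0 is the point itself
    gap = rng.random((n_new, 1))
    return X_min[base] + gap * (X_min[neigh] - X_min[base])

# Synthetic imbalanced stand-in for the CTG features (normal vs. pathological).
X, y = make_classification(n_samples=1500, n_features=20, n_informative=10,
                           weights=[0.9, 0.1], random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=42)

# Oversample the training split only, so evaluation stays on real data.
n_maj, n_min = np.bincount(y_tr)
X_syn = smote(X_tr[y_tr == 1], n_maj - n_min)
X_bal = np.vstack([X_tr, X_syn])
y_bal = np.concatenate([y_tr, np.ones(len(X_syn), dtype=int)])

clf = ExtraTreesClassifier(random_state=42).fit(X_bal, y_bal)
pred = clf.predict(X_te)
print(f"accuracy={accuracy_score(y_te, pred):.4f} "
      f"f1={f1_score(y_te, pred):.4f}")
```

Note that SMOTE is fit on the training split only; resampling before the split would leak synthetic copies of test-set neighbors into training and inflate the reported metrics.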



DOI: https://doi.org/10.32520/stmsi.v13i3.3646



Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.