Analisis Perbandingan Seleksi Fitur dalam Memprediksi Kelulusan Mahasiswa dengan Menggunakan Artificial Neural Network

Authors

  • M. Khoirul Risqi, Universitas Nahdlatul Ulama Sunan Giri, Bojonegoro
  • Ifnu Wisma Dwi Prastya, Universitas Nahdlatul Ulama Sunan Giri, Bojonegoro
  • Mula Agung Barata, Universitas Nahdlatul Ulama Sunan Giri, Bojonegoro

DOI:

https://doi.org/10.30865/jurikom.v13i1.9420

Keywords:

ANOVA, Artificial Neural Network, Chi-Square, Dropout, Information Gain

Abstract

Student attrition presents a major challenge in higher education due to its direct impact on academic quality and institutional graduation rates. Detecting students who are likely to withdraw at an early stage is therefore essential to ensure that timely interventions can be made. This study investigates how three distinct feature selection techniques—Chi-Square, Information Gain, and ANOVA—affect the performance of Artificial Neural Networks (ANN) in classifying student outcomes. The data used in the experiment were drawn from academic and administrative records, which had been standardized through Min-Max normalization. The results demonstrate that each method contributes positively, with classification accuracies ranging from 88.71% to 91.37%. Information Gain emerged as the most effective approach, yielding the highest accuracy at 91.37% and a recall score of 97.29%, largely due to its capability to reduce entropy and isolate the most informative variables. ANOVA also performed consistently well with 90.82% accuracy, while Chi-Square was comparatively less effective, potentially due to its reliance on categorical variables that may not capture predictive nuances. These findings emphasize the strategic importance of applying robust feature selection to improve ANN-based prediction models. Ultimately, this research supports the design of data-driven systems aimed at reducing student dropout rates and strengthening academic retention strategies across higher education institutions.
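The pipeline summarized above (Min-Max normalization, filter-based feature selection with Chi-Square, Information Gain, and ANOVA, then ANN classification) can be sketched with scikit-learn. This is a minimal illustration only: the synthetic data, the choice of k=10 features, and the ANN hyperparameters are assumptions for demonstration, not the paper's actual dataset or settings. Information Gain is approximated here by mutual information, its standard scikit-learn counterpart.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, chi2, mutual_info_classif, f_classif
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import MinMaxScaler

# Synthetic stand-in for the academic/administrative records
X, y = make_classification(n_samples=600, n_features=20, n_informative=8,
                           random_state=42)

# Min-Max normalization; this also makes every feature non-negative,
# which the Chi-Square test requires
X = MinMaxScaler().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=42)

results = {}
for name, score_fn in [("Chi-Square", chi2),
                       ("Information Gain", mutual_info_classif),
                       ("ANOVA", f_classif)]:
    # Keep the k highest-scoring features under each criterion
    selector = SelectKBest(score_func=score_fn, k=10).fit(X_tr, y_tr)
    # Small ANN; layer size and iteration budget are illustrative
    ann = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=42)
    ann.fit(selector.transform(X_tr), y_tr)
    results[name] = ann.score(selector.transform(X_te), y_te)

for name, acc in results.items():
    print(f"{name}: accuracy = {acc:.4f}")
```

Because all three selectors are filters, they score features independently of the ANN, so swapping the classifier leaves the selection step unchanged.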

References

[1] A. Behr, M. Giese, H. D. Teguim Kamdjou, and K. Theune, “Motives for dropping out from higher education—An analysis of bachelor’s degree students in Germany,” Eur. J. Educ., vol. 56, no. 2, pp. 325–343, 2021, doi: 10.1111/ejed.12433.

[2] V. Realinho, J. Machado, L. Baptista, and M. V. Martins, “Predicting Student Dropout and Academic Success,” Data, vol. 7, no. 11, 2022, doi: 10.3390/data7110146.

[3] L. Hakim, A. Sobri, L. Sunardi, and D. Nurdiansyah, “Prediksi Penyakit Jantung Berbasis Mesin Learning Dengan Menggunakan Metode K-NN,” J. Digit. Teknol. Inf., vol. 07, no. 02, pp. 14–20, 2024.

[4] E. Haryatmi and S. Pramita Hervianti, “Penerapan Algoritma Support Vector Machine Untuk Model Prediksi Kelulusan Mahasiswa Tepat Waktu,” J. RESTI (Rekayasa Sist. dan Teknol. Informasi), vol. 5, no. 2, pp. 386–392, 2021, doi: 10.29207/resti.v5i2.3007.

[5] T. A. Yoga Siswa, “Komparasi Optimasi Chi-Square, CFS, Information Gain dan ANOVA dalam Evaluasi Peningkatan Akurasi Algoritma Klasifikasi Data Performa Akademik Mahasiswa,” Inform. Mulawarman J. Ilm. Ilmu Komput., vol. 18, no. 1, p. 62, 2023, doi: 10.30872/jim.v18i1.11330.

[6] S. Situju, N. Nur, and N. Halal, “Analisis Penerapan Mutual Information pada Klasifikasi Status Studi Mahasiswa Menggunakan Naïve Bayes,” J. Appl. Comput. Sci. Technol., vol. 6, no. 1, pp. 23–28, 2025, doi: 10.52158/jacost.v6i1.1106.

[7] M. O. Abdullah, Y. Altun, and R. M. Ahmed, “Leveraging Artificial Neural Networks and Support Vector Machines for Accurate Classification of Breast Tumors in Ultrasound Images,” Cureus, vol. 16, no. 11, 2024, doi: 10.7759/cureus.73067.

[8] M. A. Jassim and S. N. Abdulwahid, “Data Mining preparation: Process, Techniques and Major Issues in Data Analysis,” IOP Conf. Ser. Mater. Sci. Eng., vol. 1090, no. 1, p. 012053, 2021, doi: 10.1088/1757-899x/1090/1/012053.

[9] U. M. Khaire and R. Dhanalakshmi, “Stability of feature selection algorithm: A review,” J. King Saud Univ. - Comput. Inf. Sci., vol. 34, no. 4, pp. 1060–1073, 2022, doi: 10.1016/j.jksuci.2019.06.012.

[10] V. Realinho, M. Vieira Martins, J. Machado, and L. Baptista, “Predict Students’ Dropout and Academic Success [Dataset],” UCI Machine Learning Repository. [Online]. Available: https://archive.ics.uci.edu/dataset/697/predict+students+dropout+and+academic+success

[11] M. F. Naufal and S. F. Kusuma, “Comparison Analysis of Machine Learning and Deep Learning Algorithms for Image Classification of Indonesian Language Signing Systems (Sibi),” Jtiik, vol. 10, no. 4, pp. 873–882, 2023, doi: 10.25126/jtiik.2023106828.

[12] A. Nurkholis, D. Alita, and A. Munandar, “Comparison of Kernel Support Vector Machine Multi-Class in PPKM Sentiment Analysis on Twitter,” J. RESTI (Rekayasa Sist. dan Teknol. Informasi), vol. 6, no. 2, pp. 227–233, 2022, doi: 10.29207/resti.v6i2.3906.

[13] S. D. Amalia, M. A. Barata, and P. E. Yuwita, “Optimization of Random Forest Algorithm with Backward Elimination Method in Classification of Academic Stress Levels,” vol. 9, no. 3, 2025.

[14] M. H. ur Rehman, C. S. Liew, A. Abbas, P. P. Jayaraman, T. Y. Wah, and S. U. Khan, “Big Data Reduction Methods: A Survey,” Data Sci. Eng., vol. 1, no. 4, pp. 265–284, 2016, doi: 10.1007/s41019-016-0022-0.

[15] G. A. B. Suryanegara, Adiwijaya, and M. D. Purbolaksono, “Peningkatan Hasil Klasifikasi pada Algoritma Random Forest untuk Deteksi,” J. RESTI (Rekayasa Sist. dan Teknol. Informasi), vol. 1, no. 10, pp. 114–122, 2021.

[16] A. Faqih, M. J. Vikri, and I. Aristia, “Optimizing Gated Recurrent Unit (GRU) for Gold Price Prediction: Hyperparameter Tuning and Model Evaluation on Historical XAU/USD Data,” vol. 14, pp. 141–147, 2025.

[17] I. Saputra and D. A. Kristiyanti, Machine Learning untuk Pemula. Informatika Bandung, 2022.

[18] P. P. Allorerung, A. Erna, and M. Bagussahrir, “Analisis Performa Normalisasi Data untuk Klasifikasi K-Nearest Neighbor pada Dataset Penyakit,” vol. 9, no. 3, pp. 178–191, 2024.

[19] E. F. Laili et al., “KOMPARASI ALGORITMA DECISION TREE DAN SUPPORT VECTOR MACHINE (SVM) DALAM,” vol. 8, no. 1, pp. 67–76, 2025.

[20] S. R. Azizah, R. Herteno, A. Farmadi, D. Kartini, and I. Budiman, “Kombinasi Seleksi Fitur Berbasis Filter dan Wrapper Menggunakan Naive Bayes pada Klasifikasi Penyakit Jantung,” J. Teknol. Inf. dan Ilmu Komput., vol. 10, no. 6, pp. 1361–1368, 2023, doi: 10.25126/jtiik.2023107467.

[21] M. A. Barata, Edi Noersasongko, Purwanto, and Moch Arief Soeleman, “Improving the Accuracy of C4.5 Algorithm with Chi-Square Method on Pure Tea Classification Using Electronic Nose,” J. RESTI (Rekayasa Sist. dan Teknol. Informasi), vol. 7, no. 2, pp. 226–235, 2023, doi: 10.29207/resti.v7i2.4687.

[22] M. I. Prasetiyowati, N. U. Maulidevi, and K. Surendro, “Determining threshold value on information gain feature selection to increase speed and prediction accuracy of random forest,” J. Big Data, vol. 8, no. 1, 2021, doi: 10.1186/s40537-021-00472-4.

[23] M. M. Ali, M. S. Islam, M. N. Uddin, and M. A. Uddin, “A conceptual IoT framework based on Anova-F feature selection for chronic kidney disease detection using deep learning approach,” Intell. Med., vol. 10, no. October, p. 100170, 2024, doi: 10.1016/j.ibmed.2024.100170.

[24] A. I. Sakti et al., “Implementasi Artificial Neural Network (ANN) dalam Memprediksi Nilai Tukar Rupiah terhadap Dolar Amerika,” vol. 12, no. 2, pp. 124–130, 2024.

[25] E. Sutoyo and A. Almaarif, “Educational Data Mining untuk Prediksi Kelulusan Mahasiswa Menggunakan Algoritme Naïve Bayes Classifier,” J. RESTI (Rekayasa Sist. dan Teknol. Informasi), vol. 4, no. 1, pp. 95–101, 2020, doi: 10.29207/RESTI.V4I1.1502.

[26] A. A. Yaqin, M. A. Barata, and N. Mahmudah, “Implementation of the Random Forest Algorithm with Optuna Optimization in Lung Cancer Classification,” vol. 14, pp. 561–569, 2025.

Published

2026-02-28

How to Cite

Risqi, M. K., Dwi Prastya, I. W., & Barata, M. A. (2026). Analisis Perbandingan Seleksi Fitur dalam Memprediksi Kelulusan Mahasiswa dengan Menggunakan Artificial Neural Network. JURNAL RISET KOMPUTER (JURIKOM), 13(1), 210–220. https://doi.org/10.30865/jurikom.v13i1.9420