Facial Expression Recognition Using Transfer Learning with MobileNetV2 and EfficientNet-B0 to Predict Fights

 (*) Ni Made Kirei Kharisma Handayani (Universitas Dian Nuswantoro, Semarang, Indonesia)
 Erwin Yudi Hidayat (Universitas Dian Nuswantoro, Semarang, Indonesia)
 Muhammad Naufal (Universitas Dian Nuswantoro, Semarang, Indonesia)
 Permana Langgeng Wicaksono Ellwid Putra (Universitas Dian Nuswantoro, Semarang, Indonesia)

(*) Corresponding Author

Submitted: November 27, 2023; Published: January 9, 2024

Abstract

Facial expressions play an important role in recognizing a person's emotions. Recognizing emotions helps in understanding someone's condition and can signal their likely actions. Fighting is a form of violence triggered by negative emotions that needs to be prevented and handled immediately. In this study, expression recognition is used to predict the likelihood of a fight based on the expression a person displays. The dataset used is FER-2013, modified into two labels: "Yes" and "No". The data undergoes preprocessing consisting of resizing and normalization. The models are built with transfer learning from the MobileNetV2 and EfficientNet-B0 architectures, modified through hyperparameter tuning and fine-tuning: the first 25% of the layers of each model are frozen, and several layers such as flatten and dense layers are added. Training uses 30 epochs, a batch size of 32, and the Adam optimizer with a learning rate of 0.0001. Model performance is evaluated with a confusion matrix and the results are compared; the model with the best accuracy is EfficientNet-B0, at 82%. In terms of training time and model size, MobileNetV2 is 1 hour 1 minute 43 seconds faster and 21.57 MB smaller than EfficientNet-B0.
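The abstract describes the pipeline in prose; the following is a minimal sketch (not the authors' published code) of how that transfer-learning setup could look in TensorFlow/Keras. The backbone choice, the freezing of the first 25% of layers, the flatten and dense head, and the Adam settings follow the abstract, while the 224x224 input size, the dense width of 128, and the data-loading details are assumptions.

import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (224, 224)   # assumed input resolution after resizing
NUM_CLASSES = 2         # binary labels: "Yes" (fight) / "No"

def build_model(backbone_name="MobileNetV2"):
    # Load an ImageNet-pretrained backbone without its classification head.
    if backbone_name == "MobileNetV2":
        base = tf.keras.applications.MobileNetV2(
            include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))
    else:
        base = tf.keras.applications.EfficientNetB0(
            include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))

    # Freeze the first 25% of the backbone layers; the rest are fine-tuned.
    cutoff = int(0.25 * len(base.layers))
    for layer in base.layers[:cutoff]:
        layer.trainable = False

    # Added head: flatten and dense layers (the hidden width of 128 is an assumption).
    x = layers.Flatten()(base.output)
    x = layers.Dense(128, activation="relu")(x)
    outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
    model = models.Model(inputs=base.input, outputs=outputs)

    # Training configuration from the abstract: Adam with learning rate 0.0001.
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Hypothetical usage with preprocessed (resized, normalized) FER-2013 arrays:
# model = build_model("EfficientNetB0")
# model.fit(x_train, y_train, validation_data=(x_val, y_val),
#           epochs=30, batch_size=32)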

Keywords


Facial Expressions; Fighting; Transfer Learning; MobileNetV2; EfficientNet-B0



Copyright (c) 2024 JURNAL MEDIA INFORMATIKA BUDIDARMA

This work is licensed under a Creative Commons Attribution 4.0 International License.