Integrating Heart Rate, Gyro, and Accelero Data for Enhanced Emotion Detection Post-Movie, Post-Music, and While-Music

Authors

DOI:

https://doi.org/10.30865/mib.v8i2.7634

Keywords:

Emotion, Smartwatch, Heartrate, Neural Network (NN), Human-Computer Interaction

Abstract

This study investigates the integration of heart rate, gyroscope, and accelerometer data to enhance emotion recognition across scenarios such as post-movie, post-music, and during music listening. Recognizing the limitations of heart rate data alone, the research combines gyroscope and accelerometer signals to provide a more comprehensive picture of emotional responses. Machine learning algorithms, notably support vector machines, are employed to develop robust models for real-time emotion recognition. Following a rigorous experimental protocol based on smartwatch motion-sensor data, a user study with 50 participants examines emotional responses during activities such as movie watching and music listening. The dataset comprises accelerometer, gyroscope, and heart rate readings, and additional evaluation metrics assess how effectively the proposed method detects emotional states. The findings show that a neural network (NN) method effectively determines post-activity emotional states, with accuracies ranging from 59.0% to 83.4% depending on the activity and context. Although the NN's accuracy is slightly lower than that of methods such as random forest and logistic regression, the differences are not statistically significant, particularly relative to logistic regression. Overall, the research aims to advance emotion recognition technology for applications in human-computer interaction.
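The paper does not publish its code, so the following is a minimal, hedged sketch of the general pipeline the abstract describes: fusing windowed smartwatch features (heart rate, gyroscope, accelerometer) into one feature vector and comparing a neural network against random forest and logistic regression. All data, feature choices, and model settings here are illustrative assumptions, not the authors' implementation.

```python
# Sketch of multimodal feature fusion + classifier comparison (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

def window_features(hr, acc, gyr):
    """Fuse one window of sensor data into a single feature vector:
    mean/std of heart rate plus per-axis mean/std of accel and gyro."""
    feats = [hr.mean(), hr.std()]
    for sig in (acc, gyr):                 # each is (samples, 3 axes)
        feats.extend(sig.mean(axis=0))
        feats.extend(sig.std(axis=0))
    return np.array(feats)

# Synthetic two-class dataset with slightly shifted sensor statistics.
X, y = [], []
for label in (0, 1):
    for _ in range(100):
        hr = rng.normal(70 + 10 * label, 5, size=50)         # bpm
        acc = rng.normal(0, 1 + 0.5 * label, size=(50, 3))   # g
        gyr = rng.normal(0, 1 + 0.3 * label, size=(50, 3))   # rad/s
        X.append(window_features(hr, acc, gyr))
        y.append(label)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

models = {
    "NN": make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(32,),
                                      max_iter=2000, random_state=0)),
    "RandomForest": RandomForestClassifier(random_state=0),
    "LogReg": make_pipeline(StandardScaler(), LogisticRegression()),
}
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te)
          for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.3f}")
```

In practice the study's accuracies (59.0%–83.4%) came from real smartwatch recordings and richer features; this sketch only shows the fusion-then-compare structure.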


Published

2024-04-30

Section

Articles