GC-106 - Emotion Recognition Using Wireless Signals
2021
Abstract
Emotion recognition plays an important role in understanding human behavior. It finds utility in domains such as healthcare, the automobile industry, the study of social interactions, fraud detection, and many more. Analyzing a person's emotions in a controlled environment instrumented with various devices has been challenging, since the attached devices add to the subject's anxiety and distort the readings. This creates a need for ways to recognize and study emotions wirelessly. We devised a system that recognizes emotions from the Heart Rate Variability (HRV) of subjects, which is estimated from their videos using remote photoplethysmography (rPPG). Our emotion recognizer achieves 93.27% accuracy.
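The pipeline described above can be read as three steps: extract a pulse signal from the facial video (rPPG), derive HRV features from the inter-beat intervals, and feed those features to a classifier. The sketch below is a minimal illustration of that flow under stated assumptions, not the authors' implementation; the Haar-cascade face detector, green-channel averaging, the particular HRV features, and the random-forest classifier are all assumptions made for demonstration.

```python
# Minimal rPPG -> HRV -> emotion-classifier sketch (illustrative only,
# not the project's actual code; detector, features, and classifier are assumptions).
import cv2
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks
from sklearn.ensemble import RandomForestClassifier

def rppg_signal(video_path, fps=30):
    """Mean green-channel intensity over the detected face ROI, frame by frame."""
    face_det = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap, trace = cv2.VideoCapture(video_path), []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_det.detectMultiScale(gray, 1.3, 5)
        if len(faces) == 0:
            continue
        x, y, w, h = faces[0]
        trace.append(frame[y:y + h, x:x + w, 1].mean())  # green channel of face ROI
    cap.release()
    # Band-pass to a plausible heart-rate band (0.7-4 Hz, i.e. 42-240 bpm).
    b, a = butter(3, [0.7, 4.0], btype="bandpass", fs=fps)
    return filtfilt(b, a, np.asarray(trace))

def hrv_features(pulse, fps=30):
    """Time-domain HRV features from the inter-beat intervals of the rPPG pulse."""
    peaks, _ = find_peaks(pulse, distance=fps * 0.4)  # at least ~0.4 s between beats
    ibi = np.diff(peaks) / fps * 1000.0               # inter-beat intervals in ms
    sdnn = ibi.std()                                  # overall variability
    rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2))       # short-term variability
    return np.array([ibi.mean(), sdnn, rmssd])

# Hypothetical usage: one HRV feature vector per labelled video clip.
# X = np.vstack([hrv_features(rppg_signal(p)) for p in clip_paths])
# clf = RandomForestClassifier().fit(X, emotion_labels)
# print(clf.predict([hrv_features(rppg_signal("new_clip.mp4"))]))
```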
References (9)
- Kaggle, "Kaggle dataset." [Online]. Available: https://www.kaggle.com/qiriro/stress
- S. Koldijk, M. Sappelli, S. Verberne, M. A. Neerincx, and W. Kraaij, "The SWELL knowledge work dataset for stress and user modeling research," in Proceedings of the 16th International Conference on Multimodal Interaction, 2014, pp. 291-298.
- R. M. Sabour, Y. Benezeth, F. Marzani, K. Nakamura, R. Gomez, and F. Yang, "Emotional state classification using pulse rate variability," in 2019 IEEE 4th International Conference on Signal and Image Processing (ICSIP). IEEE, 2019, pp. 86-90.
- S. R. Livingstone and F. A. Russo, "The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English," PLoS ONE, vol. 13, no. 5, p. e0196391, 2018.
- H. P. David Hass, Spencer Mullinix, "Heart rate detection using remote photoplethysmography." [Online]. Available: https://github.com/mullisd1/CVHeartrate
- Huang, J. Yang, P. Liao, and J. Pan, "Fusion of facial expressions and EEG for multimodal emotion recognition," Computational Intelligence and Neuroscience, vol. 2017, 2017.
- Y.-L. Hsu, J.-S. Wang, W.-C. Chiang, and C.-H. Hung, "Automatic ECG-based emotion recognition in music listening," IEEE Transactions on Affective Computing, vol. 11, no. 1, pp. 85-99, 2017.
- K. Jain, P. Shamsolmoali, and P. Sehdev, "Extended deep neural network for facial emotion recognition," Pattern Recognition Letters, vol. 120, pp. 69-74, 2019.
- L. J. Zheng, J. Mountstephens, and J. Teo, "A comparative investigation of eye fixation-based 4-class emotion recognition in virtual reality using machine learning," in 2021 11th IEEE International Conference on Control System, Computing and Engineering (ICCSCE). IEEE, 2021, pp. 19-22.
Acknowledgments
We would like to thank our Machine Learning professor, Dr. Mohammed Aledhari, for the guidance and motivation he provided us from time to time. We would also like to thank the Department of Computer Science and Software Engineering for providing us with this opportunity.
Email: jmhatre1@students.kennesaw.edu
LinkedIn: https://www.linkedin.com/in/jui-mhatre-35600193