
Mini review: Challenges in EEG emotion recognition

2024, Frontiers in Psychology

https://doi.org/10.3389/FPSYG.2023.1289816

Abstract

Electroencephalography (EEG) stands as a pioneering tool at the intersection of neuroscience and technology, offering unprecedented insights into human emotions. Through this comprehensive review, we explore the challenges and opportunities associated with EEG-based emotion recognition. While recent literature suggests promising high accuracy rates, these claims necessitate critical scrutiny for their authenticity and applicability. The article highlights the significant challenges in generalizing findings from a multitude of EEG devices and data sources, as well as the difficulties in data collection. Furthermore, the disparity between controlled laboratory settings and genuine emotional experiences presents a paradox within the paradigm of emotion research. We advocate for a balanced approach, emphasizing the importance of critical evaluation, methodological standardization, and acknowledging the dynamism of emotions for a more holistic understanding of the human emotional landscape.

FAQs


What explains the discrepancy in EEG accuracy rates under subject-dependent conditions?

Research indicates that subject-dependent models can reach accuracies as high as 96.89%, compared with 74.52% under subject-independent conditions, a gap that suggests models overfit to person-specific signal characteristics rather than learning generalizable emotional markers (see the sketch below).
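A minimal sketch of the two evaluation protocols behind this gap, using synthetic stand-in features and an ordinary scikit-learn pipeline; the array shapes, the logistic-regression classifier, and the fold counts are illustrative assumptions, not details taken from the review. In the subject-dependent split, trials from every subject can appear in both training and test folds, whereas leave-one-subject-out evaluation keeps each test subject entirely unseen.

```python
# Illustrative comparison of subject-dependent vs. subject-independent evaluation
# for an EEG emotion classifier. Data here are random stand-ins, not real EEG.
import numpy as np
from sklearn.model_selection import StratifiedKFold, LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_subjects, trials_per_subject, n_features = 10, 40, 32                # assumed sizes
X = rng.normal(size=(n_subjects * trials_per_subject, n_features))     # stand-in EEG features
y = rng.integers(0, 2, size=len(X))                                    # binary emotion labels
groups = np.repeat(np.arange(n_subjects), trials_per_subject)          # subject ID per trial

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# Subject-dependent: folds mix trials from all subjects, so the classifier can
# exploit stable, person-specific signal characteristics.
dep = cross_val_score(clf, X, y, cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0))

# Subject-independent: each fold holds out one whole subject, the harder and
# more realistic setting for deployment on new users.
indep = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())

print(f"subject-dependent accuracy:   {dep.mean():.2f}")
print(f"subject-independent accuracy: {indep.mean():.2f}")
```

With real EEG features the first protocol typically scores far higher than the second, which is why headline accuracies should always be read alongside the evaluation scheme that produced them.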

How significant is the role of data standardization in EEG emotion research?

The lack of standardization across EEG devices complicates reproducibility: electrode configurations range from low-density to high-density systems, which limits the generalizability of findings across datasets (see the sketch below).
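One concrete consequence, sketched below with hypothetical channel layouts (the device descriptions, channel lists, and helper function are illustrative assumptions): pooling data recorded on heterogeneous hardware often forces studies down to the electrode subset every device shares, discarding the spatial detail that high-density systems provide.

```python
# Reducing recordings from two hypothetical devices to their shared 10-20 electrodes
# before any cross-dataset analysis. Signals here are random placeholders.
import numpy as np

high_density = ["Fp1", "Fp2", "F3", "F4", "C3", "C4", "P3", "P4", "O1", "O2",
                "F7", "F8", "T7", "T8", "P7", "P8", "Fz", "Cz", "Pz"]   # research-grade cap
low_density = ["Fp1", "Fp2", "F3", "F4", "C3", "C4", "O1", "O2"]        # low-density headset

common = [ch for ch in low_density if ch in high_density]  # shared montage

def to_common_montage(data, channels):
    """Keep only the shared channels, in a fixed order (data: n_channels x n_samples)."""
    idx = [channels.index(ch) for ch in common]
    return data[idx, :]

hd_recording = np.random.randn(len(high_density), 1000)
ld_recording = np.random.randn(len(low_density), 1000)

print(to_common_montage(hd_recording, high_density).shape)  # (8, 1000) - 11 channels dropped
print(to_common_montage(ld_recording, low_density).shape)   # (8, 1000)
```

Whatever a model learned from the discarded channels cannot transfer, which is one reason results tied to a particular device generalize poorly.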

When did the field start observing inflated accuracy claims in EEG studies?

Inflated accuracy claims in EEG emotion recognition have been noted in studies since at least 2022, particularly due to simplified emotional models that do not reflect real-world complexity.
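These simplified models typically collapse continuous self-reports (for example, 1-9 valence and arousal ratings) into a few discrete classes. A minimal sketch with invented ratings shows how much nuance that step removes; the threshold and values are assumptions for illustration only.

```python
# Collapsing continuous valence self-reports into binary labels, a common
# simplification in EEG emotion-recognition benchmarks. Ratings are invented.
import numpy as np

valence = np.array([2.3, 4.9, 5.1, 6.4, 7.8])   # hypothetical 1-9 self-reports
threshold = 5.0                                  # typical midpoint split

labels = (valence > threshold).astype(int)       # 0 = "low" valence, 1 = "high" valence
print(labels)  # [0 0 1 1 1]

# A rating of 4.9 and a rating of 2.3 receive the same label, so the model is
# trained and scored on a far coarser target than the experience it claims to decode.
```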

Why do current EEG methodologies struggle with capturing true emotional experiences?

Current methodologies struggle because rigid laboratory settings rarely capture the fluid, dynamic character of real-life emotional experiences.

What limitations do researchers face when collecting EEG data in practical scenarios?

Researchers face significant constraints, such as the need for specialized expertise and standardized equipment, which restrict the applicability and generalizability of EEG-based emotion recognition in practical settings.
