
eSEE-d: Emotional State Estimation Based on Eye-Tracking Dataset

Brain Sciences

https://doi.org/10.3390/BRAINSCI13040589

Abstract

Affective state estimation is a research field that has gained increasing attention from the research community in the last decade. Two of the main catalysts for this are advances in artificial-intelligence-based data analysis and the availability of high-quality video. Unfortunately, benchmarks and public datasets are limited, making the development of new methodologies and the implementation of comparative studies essential. The current work presents the eSEE-d database, a resource for emotional State Estimation based on Eye-tracking data. Eye movements of 48 participants were recorded as they watched 10 emotion-evoking videos, each followed by a neutral video. Participants rated four emotions (tenderness, anger, disgust, sadness) on a scale from 0 to 10, and these ratings were later translated into emotional arousal and valence levels. Furthermore, each participant completed three self-assessment questionnaires. An extensive analysis of the parti...
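The abstract states that discrete ratings of four emotions were translated into arousal and valence levels, but this excerpt does not reproduce the exact translation scheme. Below is a minimal sketch of one plausible mapping, assuming each emotion occupies a fixed quadrant of Russell's circumplex and that the trial's dominant (highest-rated) emotion determines the labels; the `EMOTION_TO_VA` table and the `neutral_max` threshold are illustrative assumptions, not the paper's method.

```python
# Illustrative sketch only: the positions below follow a common reading of
# Russell's circumplex and are assumptions, not the eSEE-d translation scheme.
EMOTION_TO_VA = {
    "tenderness": ("positive", "low"),   # pleasant, calm
    "anger":      ("negative", "high"),  # unpleasant, activated
    "disgust":    ("negative", "high"),  # unpleasant, activated
    "sadness":    ("negative", "low"),   # unpleasant, deactivated
}

def dominant_va(ratings: dict[str, int], neutral_max: int = 2):
    """Map 0-10 ratings of the four emotions to (valence, arousal) labels.

    If no emotion exceeds `neutral_max` (a hypothetical cutoff), the trial
    is treated as neutral rather than assigned a quadrant.
    """
    emotion, score = max(ratings.items(), key=lambda kv: kv[1])
    if score <= neutral_max:
        return ("neutral", "neutral")
    return EMOTION_TO_VA[emotion]

if __name__ == "__main__":
    # A strongly tender trial maps to positive valence, low arousal.
    print(dominant_va({"tenderness": 8, "anger": 0, "disgust": 1, "sadness": 2}))
```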
