Predicting the Valence of a Scene from Observers’ Eye Movements

2015, PLOS ONE

https://doi.org/10.1371/journal.pone.0138198

Abstract

Multimedia analysis benefits from understanding the emotional content of a scene in a variety of tasks, such as video genre classification and content-based image retrieval. Recently, there has been increasing interest in applying human bio-signals, particularly eye movements, to recognize the emotional gist of a scene, such as its valence. To determine the emotional category of images from eye movements, existing methods typically learn a classifier over several features extracted from the eye-movement record. Although eye movements have been shown to be potentially useful for recognizing scene valence, the contribution of each feature is not well studied. To address this, we study the contribution of features extracted from eye movements to the classification of images into pleasant, neutral, and unpleasant categories. We assess ten features and their fusion. The features are the histogram of saccade orientation, histogram of saccade slope, histogram of saccade length, histogram of saccade duration, histogram of saccade velocity, histogram of fixation duration, fixation histogram, top-ten salient coordinates, and saliency map. We take a machine-learning approach to analyzing feature performance, learning a support vector machine and exploiting various feature fusion schemes. The experiments reveal that 'saliency map', 'fixation histogram', 'histogram of fixation duration', and 'histogram of saccade slope' are the features that contribute most. The selected features signify the influence of fixation information and the angular behavior of eye movements in recognizing the valence of images.
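As a rough illustration of the pipeline the abstract describes, the sketch below extracts a few of the named histogram features (saccade orientation, length, slope, and fixation duration) from a scanpath, fuses them by simple concatenation, and trains a support vector machine. This is a minimal sketch under stated assumptions, not the authors' implementation: the (x, y, duration) fixation format, bin counts, value ranges, and the synthetic data are all illustrative choices.

```python
import numpy as np
from sklearn.svm import SVC

def scanpath_features(fixations, n_bins=8):
    """Histogram features from one scanpath.

    fixations: array of shape (n, 3) with columns (x, y, duration);
    the format and bin settings are assumptions for illustration."""
    fixations = np.asarray(fixations, dtype=float)
    deltas = np.diff(fixations[:, :2], axis=0)        # saccade vectors between consecutive fixations
    orientation = np.arctan2(deltas[:, 1], deltas[:, 0])
    length = np.hypot(deltas[:, 0], deltas[:, 1])
    slope = deltas[:, 1] / (deltas[:, 0] + 1e-9)      # rise over run, guarded against division by zero
    h_orient, _ = np.histogram(orientation, bins=n_bins, range=(-np.pi, np.pi))
    h_len, _ = np.histogram(length, bins=n_bins, range=(0, 1000))
    h_slope, _ = np.histogram(np.arctan(slope), bins=n_bins, range=(-np.pi / 2, np.pi / 2))
    h_dur, _ = np.histogram(fixations[:, 2], bins=n_bins, range=(0, 1000))
    # Early fusion: concatenate the per-feature histograms, then normalize.
    feats = np.concatenate([h_orient, h_len, h_slope, h_dur]).astype(float)
    return feats / max(feats.sum(), 1)

# Toy usage on synthetic scanpaths; labels 0/1/2 stand in for
# unpleasant/neutral/pleasant.
rng = np.random.default_rng(0)
X = np.stack([scanpath_features(rng.uniform(0, 800, (20, 3))) for _ in range(60)])
y = rng.integers(0, 3, 60)
clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict(X[:5]))
```

Concatenation is only one of several fusion schemes the paper compares; a classifier could equally be trained per feature and the decisions combined.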

References (46)

  1. Yarbus AL. Eye Movements and Vision. Plenum Press; 1967.
  2. Henderson JM, Shinkareva SV, Wang J, Luke SG, Olejarczyk J. Predicting Cognitive State from Eye Movements. PLoS ONE. 2013; 8(5). doi: 10.1371/journal.pone.0064937
  3. Borji A, Itti L. Defending Yarbus: Eye movements reveal observers' task. Journal of Vision. 2014; 14(5).
  4. Bulling A, Ward JA, Gellersen H, Troster G. Eye Movement Analysis for Activity Recognition Using Electrooculography. IEEE Transactions on Pattern Analysis and Machine Intelligence. 2011; 33(4). doi: 10.1109/TPAMI.2010.86 PMID: 20421675
  5. Subramanian R, Yanulevskaya V, Sebe N. Can computers learn from humans to see better?: inferring scene semantics from viewers' eye movements. In: ACM MM; 2011. p. 33-42.
  6. Isola P, Xiao J, Torralba A, Oliva A. What makes an image memorable? In: CVPR; 2011. p. 145-152.
  7. Mancas M, Meur OL. Memorability of Natural Scenes: The role of Attention. In: ICIP; 2013. p. 196-200.
  8. Le Meur O, Baccino T, Roumy A. Prediction of the Inter-observer Visual Congruency (IOVC) and Application to Image Ranking. In: ACM MM; 2011. p. 373-382.
  9. Nummenmaa L, Hyönä J, Calvo MG. Eye movement assessment of selective attentional capture by emotional pictures. Emotion. 2006; 6(2). doi: 10.1037/1528-3542.6.2.257 PMID: 16768558
  10. Humphrey K, Underwood G, Lambert T. Salience of the lambs: A test of the saliency map hypothesis with pictures of emotive objects. Journal of Vision. 2012; 12(1). doi: 10.1167/12.1.22 PMID: 22279240
  11. Niu Y, Todd RM, Kyan M, Anderson AK. Visual and emotional salience influence eye movements. ACM Transactions on Applied Perception. 2012; 9(3). doi: 10.1145/2325722.2325726
  12. Tavakoli HR, Yanulevskaya V, Rahtu E, Heikkila J, Sebe N. Emotional Valence Recognition, Analysis of Salience and Eye Movements. In: ICPR; 2014. p. 4666-4671.
  13. Lang PJ, Bradley MM, Cuthbert BN. International affective picture system (IAPS): Affective ratings of pictures and instruction manual. Technical Report A-8. University of Florida, Gainesville, FL; 2008. Available from: http://csea.phhp.ufl.edu/media.html.
  14. Ramanathan S, Katti H, Sebe N, Kankanhalli M, Chua TS. An eye fixation database for saliency detection in images. In: ECCV. vol. 6314 of LNCS; 2010. p. 30-43. Available from: http://mmas.comp.nus.edu.sg/NUSEF.html.
  15. Borji A, Tavakoli HR, Sihite DN, Itti L. Analysis of Scores, Datasets, and Models in Visual Saliency Prediction. In: ICCV; 2013. p. 921-928.
  16. Kootstra G, Nederveen A, de Boer B. Paying attention to symmetry. In: BMVC; 2008. p. 1115-1125.
  17. Bradley MM, Codispoti M, Sabatinelli D, Lang P. Emotion and motivation II: sex differences in picture processing. Emotion. 2001; 1(3).
  18. Lithari C, Frantzidis CA, Papadelis C, Vivas A, Klados MA, Kourtidou-Papadeli C, et al. Are Females More Responsive to Emotional Stimuli? A Neurophysiological Study Across Arousal and Valence Dimensions. Brain Topography. 2010; 23(1). doi: 10.1007/s10548-009-0130-5 PMID: 20043199
  19. Ma KT, Sim T, Kankanhalli M. VIP: A Unifying Framework for Computational Eye-Gaze Research. In: Salah A, Hung H, Aran O, Gunes H, editors. Human Behavior Understanding. vol. 8212 of LNCS. Springer International Publishing; 2013. p. 209-222.
  20. Wadlinger HA, Isaacowitz DM. Positive mood broadens visual attention to positive stimuli. Motivation and Emotion. 2006; 30. doi: 10.1007/s11031-006-9021-1 PMID: 20431711
  21. van Steenbergen H, Band GPH, Hommel B. Threat But Not Arousal Narrows Attention: Evidence from Pupil Dilation and Saccade Control. Frontiers in Psychology. 2011; 2. doi: 10.3389/fpsyg.2011.00281 PMID: 22059081
  22. Just MA, Carpenter PA. A theory of reading: From eye fixations to comprehension. Psychological Review. 1980; 87. doi: 10.1037/0033-295X.87.4.329 PMID: 7413885
  23. Vitu F, McConkie GW, Kerr P, O'Regan JK. Fixation location effects on fixation durations during reading: an inverted optimal viewing position effect. Vision Research. 2001; 41. doi: 10.1016/S0042-6989(01)00166-3 PMID: 11718792
  24. Tichon JG, Mavin T, Wallis G, Visser TAW, Riek S. Using Pupillometry and Electromyography to Track Positive and Negative Affect During Flight Simulation. Aviation Psychology and Applied Human Factors. 2014; 4(1). doi: 10.1027/2192-0923/a000052
  25. Simola J, Fevre KL, Torniainen J, Baccino T. Affective processing in natural scene viewing: Valence and arousal interactions in eye-fixation-related potentials. NeuroImage. 2015; 106(0). PMID: 25463473
  26. Susskind JM, Lee DH, Cusi A, Feiman R, Grabski W, Anderson AK. Expressing fear enhances sensory acquisition. Nature Neuroscience. 2008; 11(7). doi: 10.1038/nn.2138 PMID: 18552843
  27. Chen NM, Clarke PF, Watson TL, MacLeod C, Guastella AJ. Biased Saccadic Responses to Emotional Stimuli in Anxiety: An Antisaccade Study. PLoS ONE. 2014; 9(2). doi: 10.1371/journal.pone.0086474
  28. Armstrong T, Olatunji BO. What they see is what you get: Eye tracking of attention in the anxiety disorders; 2009. Available from: http://www.apa.org/science/about/psa/2009/03/science-briefs.aspx.
  29. Mikels J, Fredrickson B, Larkin G, Lindberg C, Maglio S, Reuter-Lorenz P. Emotional category data on images from the International Affective Picture System. Behavior Research Methods. 2005; 37(4). doi: 10.3758/BF03192732 PMID: 16629294
  30. Irwin DE. Fixation Location and Fixation Duration as Indices of Cognitive Processing. In: Ferreira F, Henderson JM, editors. The interface of language, vision, and action: Eye movements and the visual world. Psychology Press; 2004. p. 105-133.
  31. Kanan C, Ray N, Bseiso DNF, Hsiao JH, Cottrell GW. Predicting an observer's task using multi-fixation pattern analysis. In: ACM Symposium on Eye Tracking Research and Applications; 2014. p. 287-290.
  32. Greene MR, Liu T, Wolfe JM. Reconsidering Yarbus: A failure to predict observers' task from eye movement patterns. Vision Research. 2012; 62(0). doi: 10.1016/j.visres.2012.03.019 PMID: 22487718
  33. Judd T, Ehinger K, Durand F, Torralba A. Learning to Predict Where Humans Look. In: ICCV; 2009. p. 2106-2113.
  34. Itti L, Koch C, Niebur E. A model of saliency-based visual attention for rapid scene analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence. 1998; 20(11). doi: 10.1109/34.730558
  35. Parkhurst D, Law K, Niebur E. Modeling the role of salience in the allocation of overt visual attention. Vision Research. 2002; 42(1). doi: 10.1016/S0042-6989(01)00250-4 PMID: 11804636
  36. Chen YW, Lin CJ. Combining SVMs with Various Feature Selection Strategies. In: Guyon I, Nikravesh M, Gunn S, Zadeh L, editors. Feature Extraction. vol. 207 of Studies in Fuzziness and Soft Computing. Springer Berlin Heidelberg; 2006. p. 315-324.
  37. Marcano-Cedeno A, Quintanilla-Dominguez J, Cortina-Januchs MG, Andina D. Feature selection using Sequential Forward Selection and classification applying Artificial Metaplasticity Neural Network. In: IECON; 2010. p. 2845-2850.
  38. Wahab MNA, Nefti-Meziani S, Atyabi A. A Comprehensive Review of Swarm Optimization Algorithms. PLoS ONE. 2015; 10(5). doi: 10.1371/journal.pone.0122827
  39. Wang L. Support Vector Machines: Theory and Applications. Springer-Verlag Berlin Heidelberg; 2005.
  40. Sanei S, Chambers JA. EEG Signal Processing. John Wiley & Sons, Ltd.; 2007.
  41. Olson DL, Delen D. Advanced Data Mining Techniques. Springer; 2008.
  42. Anderson T. Classification by multivariate analysis. Psychometrika. 1951; 16. doi: 10.1007/BF02313425
  43. Shahrokh Esfahani M, Dougherty E. Effect of separate sampling on classification accuracy. Bioinformatics. 2014; 30(2). doi: 10.1093/bioinformatics/btt662 PMID: 24257187
  44. Powers DMW. Evaluation: From Precision, Recall and F-Measure to ROC, Informedness, Markedness & Correlation. Journal of Machine Learning Technologies. 2011; 2(1).
  45. Mogg K, Millar N, Bradley B. Biases in eye movements to threatening facial expressions in generalized anxiety disorder and depressive disorder. Journal of Abnormal Psychology. 2000; 109(4). doi: 10.1037/0021-843X.109.4.695 PMID: 11195993
  46. Goh J, Tan J, Park D. Culture Modulates Eye-Movements to Visual Novelty. PLoS ONE. 2009; 4(12). doi: 10.1371/journal.pone.0008238