September 2016
Volume 57, Issue 12
Open Access
ARVO Annual Meeting Abstract  |   September 2016
Scanning of upright, horizontal and inverted faces during emotion recognition
Author Affiliations & Notes
  • Larry A Abel
    Optometry & Vision Sciences, University of Melbourne, Parkville, Victoria, Australia
  • Sarah Case
    Optometry & Vision Sciences, University of Melbourne, Parkville, Victoria, Australia
  • Aimee Whiteside
    Optometry & Vision Sciences, University of Melbourne, Parkville, Victoria, Australia
  • Jennifer Duong
    Optometry & Vision Sciences, University of Melbourne, Parkville, Victoria, Australia
  • Diana Chau
    Optometry & Vision Sciences, University of Melbourne, Parkville, Victoria, Australia
  • Eric Ng
    Optometry & Vision Sciences, University of Melbourne, Parkville, Victoria, Australia
  • Footnotes
    Commercial Relationships   Larry Abel, None; Sarah Case, None; Aimee Whiteside, None; Jennifer Duong, None; Diana Chau, None; Eric Ng, None
  • Footnotes
    Support  None
Investigative Ophthalmology & Visual Science September 2016, Vol.57, 4581.

      Larry A Abel, Sarah Case, Aimee Whiteside, Jennifer Duong, Diana Chau, Eric Ng; Scanning of upright, horizontal and inverted faces during emotion recognition. Invest. Ophthalmol. Vis. Sci. 2016;57(12):4581.



      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Purpose : While the face inversion effect makes the recognition of inverted faces more difficult, less is known about its effect on emotion identification, and even less about the effects of 90° rotation. We hypothesised that scanning would be altered when viewing rotated faces expressing specific emotions and that this would correlate with changes in emotion recognition.

Methods : 22 normal subjects aged 21-26 (16 female, 6 male) viewed a set of 5 faces from the Karolinska Directed Emotional Faces database. Images expressed the emotions happy, sad, angry, disgusted, scared and surprised. Areas of interest (AOIs) were defined for the right and left eyes (RE & LE), nasion (NA), nose (NO) and mouth (MO). Gaze was recorded with a Tobii 1750 eye tracker. Images were presented for 5 s at rotations of 0°, 90°, 180° and 270° from upright. Gaze time for each AOI was recorded and analysed with a 3-way repeated-measures ANOVA with AOI, emotion and orientation as independent variables. Emotion detection accuracy was analysed with a 1-way repeated-measures ANOVA for each emotion.
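
For illustration, a minimal Python sketch of the analysis described above; the authors' actual software is not stated, and the file and column names used here are hypothetical placeholders.

    # Sketch only: assumes a long-format table with one row per subject x AOI x
    # emotion x orientation cell; column names are placeholders, not the authors'.
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    gaze = pd.read_csv("gaze_times.csv")  # subject, aoi, emotion, orientation, gaze_time

    # 3-way repeated-measures ANOVA on gaze time (all factors within-subject).
    gaze_anova = AnovaRM(data=gaze, depvar="gaze_time", subject="subject",
                         within=["aoi", "emotion", "orientation"]).fit()
    print(gaze_anova)

    # Per-emotion 1-way repeated-measures ANOVA on identification accuracy.
    acc = pd.read_csv("accuracy.csv")  # subject, emotion, orientation, prop_correct
    for emotion, sub in acc.groupby("emotion"):
        res = AnovaRM(data=sub, depvar="prop_correct", subject="subject",
                      within=["orientation"]).fit()
        print(emotion)
        print(res)

Note that AnovaRM expects a fully balanced design (one observation per subject per cell), or an aggregate_func to collapse repeated observations.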

Results : There was no main effect of emotion on gaze time (p=NS). There were significant main effects for orientation (p<.001) and AOI (p<.001). The table shows the familiar eyes-and-nose fixation pattern in the upright case; for the other orientations, whichever feature was uppermost was the most fixated upon. Interactions were also significant for emotion*AOI (p<.001) and orientation*AOI (p<.001), as well as the 3-way interaction (p<.05). Rotation affected emotion detection differently for the different emotions: it had no effect for either happy or angry faces (p=NS) but significantly affected sad (p<.05), scared (p<.05), disgusted (p<.001) and surprised (p<.001) faces (Figure).

Conclusions : Image orientation had marked effects on scanpaths, with whichever facial feature was uppermost receiving the most gaze time. This disruption of the familiar “inverted triangle” pattern seen when viewing faces did not have a consistent effect on subjects’ ability to identify emotions, with some being unaffected and one (surprised) being better identified when rotated. A particular fixation pattern thus appears to be unnecessary for accurate emotion identification, and the usual dominance of the eyes in face scanning may result from their being in the upper visual field as much as from their information content.

This is an abstract that was submitted for the 2016 ARVO Annual Meeting, held in Seattle, Wash., May 1-5, 2016.

Figure: % correct identification of emotions at each orientation.
