Larry A Abel, Sarah Case, Aimee Whiteside, Jennifer Duong, Diana Chau, Eric Ng; Scanning of upright, horizontal and inverted faces during emotion recognition. Invest. Ophthalmol. Vis. Sci. 2016;57(12):4581.
While the face inversion effect makes the recognition of inverted faces more difficult, less is known about its effect on emotion identification, and even less about the effects of 90 deg rotation. We hypothesised that scanning would be altered when viewing rotated faces expressing specific emotions and that this would correlate with changes in emotion recognition.
Twenty-two normal subjects aged 21-26 (16 female, 6 male) viewed a set of 5 faces from the Karolinska Directed Emotional Faces database. Images expressed the emotions happy, sad, angry, disgusted, scared and surprised. Areas of interest (AOIs) were defined for the right and left eyes (RE & LE), nasion (NA), nose (NO) and mouth (MO). Gaze was recorded with a Tobii 1750 eye tracker. Images were presented for 5 s at rotations of 0, 90, 180 and 270 deg from upright. Gaze time for each AOI was recorded and analysed with a 3-way repeated measures ANOVA with AOI, emotion and orientation as independent variables. Emotion detection accuracy was analysed with a 1-way repeated measures ANOVA for each emotion.
There was no main effect of emotion on gaze time (p=NS). There were significant main effects for orientation (p<.001) and AOI (p<.001). The table shows the familiar eyes-and-nose fixation pattern in the upright case, but for the other orientations, whichever feature was uppermost was the most fixated upon. Interactions were also significant for emotion*AOI (p<.001) and orientation*AOI (p<.001), as well as the 3-way interaction (p<.05). Rotation affected emotion detection differently for the different emotions: it had no effect for either happy or angry faces (p=NS) but significantly affected sad (p<.05), scared (p<.05), disgusted (p<.001) and surprised (p<.001) faces (Figure).
Image orientation had marked effects on scanpaths, with whichever facial feature was uppermost receiving the most gaze time. This disruption of the familiar “inverted triangle” pattern seen when viewing faces did not have a consistent effect on subjects’ ability to identify emotions, with some being unaffected and one (surprised) being better identified when rotated. A particular fixation pattern thus appears to be unnecessary for accurate emotion identification and the usual dominance of the eyes in face scanning may result from their being in the upper visual field as much as from their information content.
This is an abstract that was submitted for the 2016 ARVO Annual Meeting, held in Seattle, Wash., May 1-5, 2016.
Figure: % correct identification of emotions at each orientation.