July 2018, Volume 59, Issue 9
ARVO Annual Meeting Abstract
The effect of stimulus type and autism quotient score on visual scanning of faces
Author Affiliations & Notes
  • Guan Nan Guo
    Optometry and Vision Sciences, University of Melbourne, Melbourne, Victoria, Australia
  • Larry A Abel
    Optometry and Vision Sciences, University of Melbourne, Melbourne, Victoria, Australia
  • Footnotes
    Commercial Relationships: Guan Guo, None; Larry Abel, None
    Support: None
Investigative Ophthalmology & Visual Science July 2018, Vol.59, 4413. doi:https://doi.org/
Purpose : Many social cues are expressed on the human face, so accurate gathering and processing of facial information is crucial for successful social interaction. Eye tracking can quantify gaze behaviour; however, studies in this area have produced inconsistent results. We hypothesized that different forms of stimuli elicit different scanpaths of faces and that these are influenced by traits captured by the autism quotient (AQ).

Methods : We studied 21 healthy participants aged 18-27 years (mean ± SD 21.66 ± 2.65). AQ scores ranged from 7 to 34 (mean 18.9 ± 5.58). All participants completed the AQ questionnaire before the experiment and were then asked to view 5 still images and 5 recorded videos and to take part in 2 live video conversations. Areas of interest (AOIs) were drawn on the eyes, nose and mouth regions of the face. Net dwell time (NDT), the viewing time within an AOI divided by the total viewing time for each trial, was calculated and examined with ANOVA, paired-sample t-tests and Pearson's correlations.
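The NDT measure defined above is simply the proportion of a trial's total viewing time spent within one AOI. A minimal sketch of that computation is shown below; the fixation data, AOI labels and function name are hypothetical illustrations, not the authors' actual data or analysis code.

```python
# Sketch of the net dwell time (NDT) computation described in Methods.
# Fixation durations and AOI labels here are invented for illustration.

def net_dwell_time(fixations, aoi):
    """Fraction of total viewing time spent inside one AOI.

    fixations: list of (aoi_label, duration_ms) tuples for one trial.
    aoi: AOI label to score, e.g. "eyes", "nose" or "mouth".
    """
    total = sum(duration for _, duration in fixations)
    in_aoi = sum(duration for label, duration in fixations if label == aoi)
    return in_aoi / total if total else 0.0

# Example trial: each fixation labelled by the AOI it landed in.
trial = [("eyes", 300), ("nose", 150), ("mouth", 250), ("eyes", 300)]
ndt_eyes = net_dwell_time(trial, "eyes")  # 600 / 1000 = 0.6
```

Because NDT is a proportion of each trial's own duration, it lets trials of different lengths (still images, videos, live conversations) be compared on a common scale.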

Results : Stimulus type had a main effect on viewing behaviour across all participants (F(1.673, 68.607) = 22.725, p < .001). Post hoc t-tests found significant differences in viewing pattern between still and video stimuli (t(40) = 5.7, p < .001) and between still and live stimuli (t(40) = 4.8, p < .001). Analysing the AOIs separately revealed main effects of stimulus type for the nose (F(2,82) = 6.012, p = .004) and the mouth (F(2,82) = 8.385, p < .001). NDT on the nose increased significantly for video (t(40) = 3, p = .014) and live stimuli (t(40) = 2.917, p = .018) compared with still images, as did NDT on the mouth for video (t(40) = 3.714, p = .002) and live stimuli (t(40) = 2.607, p = .042). We also observed a three-way interaction between AQ, AOI and stimulus type (F(3.115, 124.618) = 3.482, p = .017), and AQ score correlated positively with viewing time of the mouth for video (r(42) = 0.376, p = .014) and live stimuli (r(42) = 0.464, p = .002).

Conclusions : Static images elicited significantly different scanpaths from either recorded or live video. Participants looked more at the mouth and nose with both dynamic stimuli, suggesting the need for caution when drawing inferences about social interaction from static images. In addition, autistic traits influenced gaze behaviour differently depending on stimulus type.

This is an abstract that was submitted for the 2018 ARVO Annual Meeting, held in Honolulu, Hawaii, April 29 - May 3, 2018.

