ARVO Annual Meeting Abstract  |   April 2014
Detecting vision loss in glaucoma using eye movement scanpaths recorded during free viewing of movies - a proof of principle study
Author Affiliations & Notes
  • Nicholas David Smith
    Optometry and Visual Science, City University London, London, United Kingdom
  • Haogang Zhu
    Optometry and Visual Science, City University London, London, United Kingdom
  • David Paul Crabb
    Optometry and Visual Science, City University London, London, United Kingdom
  • Footnotes
    Commercial Relationships Nicholas Smith, None; Haogang Zhu, None; David Crabb, None
  • Footnotes
    Support None
Investigative Ophthalmology & Visual Science April 2014, Vol.55, 5637. doi:
      Nicholas David Smith, Haogang Zhu, David Paul Crabb; Detecting vision loss in glaucoma using eye movement scanpaths recorded during free viewing of movies - a proof of principle study. Invest. Ophthalmol. Vis. Sci. 2014;55(13):5637.

Abstract
 
Purpose
 

Vision is typically assessed by asking people to respond to synthetic stimuli whilst maintaining steady fixation. We test the hypothesis that vision loss in glaucoma can be detected by examining patterns of eye movement recorded whilst a person naturally watches a movie.

 
Methods
 

Thirty-two elderly people with normal vision (median age: 70, interquartile range [IQR] 64 to 75 years) and 44 patients with a clinical diagnosis of glaucoma (median age: 69, IQR 63 to 77 years) had standard vision examinations including automated perimetry. All participants then viewed three unmodified TV and film clips on a computer setup incorporating the EyeLink 1000 eye tracker (SR Research, Ontario, Canada). Scanpaths were plotted using purpose-written software that first filtered the data and then generated saccade density maps (see Figure). Maps were then subjected to feature extraction using kernel principal component analysis (KPCA). Features from the KPCA were then classified using a standard machine-learning classifier. The classifier was trained and tested by 10-fold cross-validation, repeated 100 times to estimate the confidence interval (CI) of the classification hit rate and specificity.
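To make the pipeline concrete, the sketch below is a minimal illustration (not the authors' code) of the sequence described above: saccade endpoints are binned into a smoothed density map, KPCA extracts features, and a standard classifier is evaluated with 10-fold cross-validation repeated 100 times. The map construction, kernel, classifier choice and all parameter values are assumptions made for illustration, and synthetic data stand in for the recordings.

import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import KernelPCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

rng = np.random.default_rng(0)

def saccade_density_map(endpoints, shape=(64, 64), sigma=2.0):
    # Bin saccade endpoints (x, y in [0, 1]) on a grid and smooth with a Gaussian.
    hist, _, _ = np.histogram2d(endpoints[:, 1], endpoints[:, 0],
                                bins=shape, range=[[0, 1], [0, 1]])
    return gaussian_filter(hist, sigma=sigma)

def simulate_observer(spread, n_saccades=300):
    # Synthetic stand-in: saccade endpoints scattered around the screen centre.
    pts = np.clip(rng.normal(0.5, spread, size=(n_saccades, 2)), 0, 1)
    return saccade_density_map(pts).ravel()

# 32 controls and 44 patients, mirroring the group sizes in the abstract;
# the difference in scatter is purely an assumption for this toy example.
X = np.array([simulate_observer(0.15) for _ in range(32)] +
             [simulate_observer(0.08) for _ in range(44)])
y = np.array([0] * 32 + [1] * 44)          # 1 = glaucoma

# Feature extraction with kernel PCA, then a standard classifier.
model = make_pipeline(StandardScaler(),
                      KernelPCA(n_components=10, kernel="rbf"),
                      SVC(kernel="linear"))

# 10-fold cross-validation repeated 100 times, as in the abstract; plain
# accuracy is reported here rather than hit rate at a fixed specificity.
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=100, random_state=0)
scores = cross_val_score(model, X, y, cv=cv, n_jobs=-1)
print(f"mean accuracy: {scores.mean():.2f} "
      f"(2.5th-97.5th percentile: {np.percentile(scores, 2.5):.2f}"
      f"-{np.percentile(scores, 97.5):.2f})")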

 
Results
 

Patients had a range of glaucoma disease severity (median worse-eye MD of -11.7 dB, IQR -17.1 to -5.9 dB). The average hit rate for correctly identifying a glaucoma patient at a fixed specificity of ~90% was 79% (95% CI: 67 to 93%). The area under the receiver operating characteristic (ROC) curve was 0.84 (95% CI: 0.79 to 0.92).
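As a small worked example of how such summary figures can be computed (an assumption about the analysis, not the authors' code), the snippet below takes per-subject classifier scores, reads the hit rate off the ROC curve at the operating point nearest 90% specificity, and computes the area under the curve; the scores here are synthetic.

import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(1)
y_true = np.array([0] * 32 + [1] * 44)              # 0 = control, 1 = glaucoma
scores = np.concatenate([rng.normal(0.0, 1.0, 32),  # synthetic classifier scores
                         rng.normal(1.5, 1.0, 44)]) # patients score higher

fpr, tpr, _ = roc_curve(y_true, scores)
specificity = 1 - fpr

# Hit rate (sensitivity) at the operating point closest to 90% specificity.
idx = np.argmin(np.abs(specificity - 0.90))
print(f"hit rate at ~{specificity[idx]:.0%} specificity: {tpr[idx]:.0%}")
print(f"area under the ROC curve: {roc_auc_score(y_true, scores):.2f}")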

 
Conclusions
 

Large volumes of scanpath data from eye movements recorded whilst people freely watch videos can be processed into maps that contain a signature of vision loss. In this proof-of-principle study we have demonstrated that a group of patients with a range of glaucomatous vision loss can be reasonably well separated from a group of peers with normal vision by considering these eye movement signatures alone. Estimates of ‘diagnostic precision’ for this approach are similar to what would be obtained from a single result from a modern imaging device.

 
 
Each row in this schematic illustrates a different stage in the process of generating the saccadic map for one video viewed by one person. Each column represents a different time point in the process. (The final saccade map is shown in the bottom right corner of the schematic).
 
Keywords: 522 eye movements • 758 visual fields  