ARVO Annual Meeting Abstract  |   June 2013
Validation of a Novel Fast and Robust Automated Technique for Calculation of the EOG Arden Ratio, Using R
Author Affiliations & Notes
  • Marc Sarossy
    ODC, Royal Victorian Eye and Ear Hospital, East Melbourne, VIC, Australia
    CERA, University of Melbourne, Melbourne, VIC, Australia
  • Matthew Lee
    Faculty of Medicine, Nursing and Health Sciences, Monash University, Clayton, VIC, Australia
  • Michael Bach
    Universitäts-Augenklinik, Freiburg, Germany
  • Footnotes
    Commercial Relationships: Marc Sarossy, None; Matthew Lee, None; Michael Bach, None
    Support: None
Investigative Ophthalmology & Visual Science, June 2013, Vol. 54, 6137.
Abstract
 
Purpose
 

The EOG is a clinically useful test but demands considerable technician post-processing time. In this study we develop and validate a fast automated method for calculating the Arden ratio from EOG recordings, using R.

 
Methods
 

Raw EOG traces were exported from the system in the visual electrophysiology laboratory of the Royal Victorian Eye and Ear Hospital. Third-order harmonic approximations to square waves were fitted as models via a non-linear least squares (NLS) algorithm. Several refinements to the fitting methodology, including robust regression (RR), were also tested. Calculations were performed in the open-source statistical language R using a routine developed by the authors. Arden ratios were calculated by fitting third-degree polynomials to the model amplitudes in the dark and light phases, and then taking the ratio of the light peak to the dark trough.
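
For concreteness, the following is a minimal sketch in R of the pipeline described above, not the authors' released routine. The function names, the sampling-rate and stimulus-period arguments, and the reading of "third-order harmonic approximation" as the fundamental plus the 3rd and 5th odd harmonics of the square-wave Fourier series are assumptions for illustration; the RR refinement is omitted.

    # Sketch only (assumptions noted above). Fit a truncated square-wave
    # Fourier model to one EOG trace with nls() and return its amplitude.
    fit_trace_amplitude <- function(trace, fs, period) {
      t <- seq_along(trace) / fs   # sample times in seconds
      w <- 2 * pi / period         # fundamental frequency of the gaze target
      fit <- nls(
        trace ~ c0 + (4 * A / pi) * (sin(w * t + phi) +
                                     sin(3 * (w * t + phi)) / 3 +
                                     sin(5 * (w * t + phi)) / 5),
        start = list(c0  = mean(trace),
                     A   = (max(trace) - min(trace)) / 2,
                     phi = 0))
      2 * abs(coef(fit)[["A"]])    # peak-to-peak square-wave amplitude
    }

    # Arden ratio: cubic (third-degree polynomial) envelopes of the
    # per-trace amplitudes in each phase, then the ratio of the light-phase
    # peak to the dark-phase trough, expressed in percent.
    arden_ratio <- function(times, amps, phase) {
      env_extreme <- function(ph, pick) {
        d    <- data.frame(t = times[phase == ph], a = amps[phase == ph])
        f    <- lm(a ~ poly(t, 3), data = d)
        grid <- data.frame(t = seq(min(d$t), max(d$t), length.out = 200))
        pick(predict(f, newdata = grid))
      }
      100 * env_extreme("light", max) / env_extreme("dark", min)
    }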

 
Results
 

Test performance of the methods was compared against manual calculation in 54 eyes of 27 patients with a variety of ophthalmic diagnoses. Performance was evaluated by calculating the correlation of the individual automatically measured trace amplitudes with the manually measured amplitudes. We also evaluated the classification accuracy of the test for each eye by comparing classification into normal and abnormal based on the derived Arden ratio and a cutoff of 200%. Of the refined methods, the RR method performed best. The NLS and RR individual run amplitude measurements correlated closely with the manual method (NLS R² = 0.9817, p < 0.01; RR R² = 0.9877, p < 0.01). Classification accuracy of the automated methods was good relative to the manual ("gold standard") method: NLS sensitivity 93%, specificity 85%; RR sensitivity 96%, specificity 81%. Technician processing time was reduced from 20 to 2 minutes. Advantages of this algorithm: no saccades need to be detected, and the fully determined time course of the gaze target is optimally utilized, making the method robust to irregular gaze changes.
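
As a hedged sketch of how such an evaluation can be expressed in R (variable names are illustrative and not from the study; the abstract does not show the authors' evaluation code):

    # 'auto_amps'/'manual_amps': per-trace amplitudes from each method;
    # 'auto_ratio'/'manual_ratio': per-eye Arden ratios in percent.
    r_squared <- cor(auto_amps, manual_amps)^2   # per-trace agreement

    abnormal   <- function(ratio) ratio < 200    # 200% cutoff: below = abnormal
    auto_abn   <- abnormal(auto_ratio)
    manual_abn <- abnormal(manual_ratio)         # manual = gold standard

    sensitivity <- mean(auto_abn[manual_abn])    # abnormal eyes correctly flagged
    specificity <- mean(!auto_abn[!manual_abn])  # normal eyes correctly passed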

 
Conclusions
 

The results validate this novel automated method against the manual method while saving considerable technician time. The authors will make the routine available as part of an open-source R package.

 
Figure: The raw data for a sample run with the Espion's automated marker placement and the nls fit. Note that the method is robust to the saccadic overshoot.
Keywords: 507 electrophysiology: clinical • 473 computational modeling • 524 eye movements: recording techniques  