ARVO Annual Meeting Abstract | June 2023
Volume 64, Issue 8
Open Access
Validation of eye tracking-based visual tests using virtual reality headset
Author Affiliations & Notes
  • Luis De Sisternes
    Twenty/Twenty Therapeutics LLC, California, United States
  • Diksha Goyal
    Twenty/Twenty Therapeutics LLC, California, United States
  • Dimitri T Azar
    Ophthalmology, University of Illinois Chicago College of Medicine, Chicago, Illinois, United States
    Twenty/Twenty Therapeutics LLC, California, United States
  • Supriyo Sinha
    Twenty/Twenty Therapeutics LLC, California, United States
  • Footnotes
    Commercial Relationships   Luis De Sisternes Twenty/Twenty Therapeutics LLC, Code E (Employment); Diksha Goyal Twenty/Twenty Therapeutics LLC, Code E (Employment); Dimitri Azar Twenty/Twenty Therapeutics LLC, Code E (Employment); Supriyo Sinha Twenty/Twenty Therapeutics LLC, Code E (Employment)
  • Footnotes
    Support  None
Investigative Ophthalmology & Visual Science, June 2023, Vol. 64, 2844.
Abstract

Purpose: Virtual reality (VR) headsets are becoming an attractive platform for conducting a variety of visual tests, potentially reducing human effort, time, and cost while offering quantifiable, repeatable results instead of subjective descriptions. We validate the application of a VR system that uses only eye telemetry data to provide routine visual test measurements in a diverse population.

Methods: We conducted a series of ophthalmic measurements using a VS Headset (Twenty/Twenty Therapeutics LLC), an eye tracking-based ophthalmic device with custom software that presents a series of time- and response-dependent stimuli within the headset’s field of view while capturing eye telemetry data. These measurements allowed the generation of pupillometry, eye motility, alignment, color vision, contrast sensitivity, and screening visual field parameters, relying only on the user’s eye response to the presented stimuli without additional user input. The measurements for each visual test were repeated within a session, or across repeated sessions, for each eye to derive their repeatability and calculate their uncertainty. A test for a given eye was deemed successful only if its measurements met a high standard of certainty. We analyzed each test’s success rate and repeatability.
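
As an illustration of the repeatability analysis described above, the following sketch shows one way intra-test spread and a validity criterion could be computed from repeated measurements of a single parameter; the function name, threshold, and example values are hypothetical and do not come from the study.

# Illustrative sketch only (not the study's actual pipeline): estimate
# intra-test repeatability as the sample standard deviation of repeated
# measurements for one eye, form a 95% uncertainty half-width under a
# normal approximation, and deem the test valid only if that uncertainty
# stays below a preset threshold.
from statistics import mean, stdev

def summarize_repeats(measurements, max_uncertainty):
    """measurements: repeated values of one parameter (e.g., pupil diameter in mm).
    max_uncertainty: threshold, in the same units, for declaring the test valid."""
    avg = mean(measurements)
    spread = stdev(measurements)          # intra-test repeatability (std)
    ci95_half_width = 1.96 * spread       # normal-approximation 95% half-width
    return {"mean": avg, "std": spread,
            "ci95": ci95_half_width,
            "valid": ci95_half_width <= max_uncertainty}

# Hypothetical example: three pupil-diameter readings under a dark stimulus.
print(summarize_repeats([4.1, 4.3, 4.2], max_uncertainty=0.5))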

Results: Data collected from 162 eyes of 81 subjects in routine clinical care were analyzed. Pupillometry measurements showed an intra-test variability in pupil diameter response of 0.2 mm standard deviation (std) under both dark and bright stimuli. Eye motility measurements yielded an inter-test variability in range of motion within 2.4 degrees (95% confidence interval) for each of the six extraocular muscles. Eye alignment measurements showed an intra-test variability in prism diopter displacement during alternating cover/uncover testing within 1.1 prism diopters std. Visual field tests yielded an intra-test variability in sensitivity within an outer-ring neighborhood of 1.26 dB std. Color vision and contrast sensitivity measurements showed good correlation with established clinical tests.

Conclusions: This work validates the use of a VR system to conduct a series of routine visual measurements based only on eye responses to presented stimuli. The system performs reliable measurements with a high success rate and high reproducibility, and it presents an alternative to standard visual tests with the advantage of providing quantifiable results.

This abstract was presented at the 2023 ARVO Annual Meeting, held in New Orleans, LA, April 23-27, 2023.
