July 2019
Volume 60, Issue 9
Open Access
ARVO Annual Meeting Abstract  |   July 2019
Virtual Opportunistic Reaction Perimetry (VORP)
Author Affiliations & Notes
  • Wolfgang Fink
    Ceeable Technologies Inc., Somerville, Massachusetts, United States
  • John Cerwin
    Ceeable Technologies Inc., Somerville, Massachusetts, United States
  • Chris Adams
    Ceeable Technologies Inc., Somerville, Massachusetts, United States
  • Footnotes
    Commercial Relationships   Wolfgang Fink, Ceeable Technologies Inc. (I), Ceeable Technologies Inc. (P), Ceeable Technologies Inc. (S); John Cerwin, Ceeable Technologies Inc. (I), Ceeable Technologies Inc. (P), Ceeable Technologies Inc. (S); Chris Adams, Ceeable Technologies Inc. (I), Ceeable Technologies Inc. (P), Ceeable Technologies Inc. (S)
  • Footnotes
    Support  None
Investigative Ophthalmology & Visual Science July 2019, Vol.60, 4385.

      Wolfgang Fink, John Cerwin, Chris Adams; Virtual Opportunistic Reaction Perimetry (VORP). Invest. Ophthalmol. Vis. Sci. 2019;60(9):4385.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Purpose : To introduce a campimetry-based visual field test - Virtual Opportunistic Reaction Perimetry™ (VORP™; patents pending) - that, in contrast to standard automated perimetry (e.g., the Humphrey Field Analyzer, the current gold standard), requires the subject neither to maintain fixation throughout the test nor to actively acknowledge the perception of a light stimulus, e.g., by pushing a button or providing verbal feedback.

Methods : VORP is administered using a virtual reality (VR) headset with built-in high-frequency, high-accuracy eye/gaze tracking. The half of the VR display serving the eye not being tested is turned off. For the eye being tested, the gaze location is recorded continuously via the eye/gaze tracking. The subject is asked to pursue perceived light stimuli presented at pseudo-random locations (i.e., locations determined by the VORP testing program) within the shared field of view of the VR headset and subject. Given the current gaze location, VORP opportunistically calculates the location of the next light stimulus within the headset's available display area, choosing an eccentricity/location that has not yet been covered, or has not been sufficiently covered, during the test session. If the subject perceives that stimulus and moves their gaze toward its location, VORP can exploit this to present the next stimulus at a farther, equal, or closer distance/eccentricity from the current gaze location. Thus, over time, a user-defined visual field area can be tested at a user-defined or automatically determined resolution.
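
For concreteness, the opportunistic placement step could be sketched as follows. This is a minimal illustration under assumed parameters; the grid resolution, field extent, coverage criterion, and all names are hypothetical, as the abstract does not disclose the actual VORP algorithm:

```python
import random
import numpy as np

# Hypothetical sketch of the opportunistic stimulus-placement step.
# GRID_DEG, FIELD_RADIUS_DEG, and TARGET_PRESENTATIONS are assumed
# parameters, not values disclosed for VORP.
GRID_DEG = 2.0             # coverage-grid resolution (degrees)
FIELD_RADIUS_DEG = 30.0    # user-defined visual field area to test (degrees)
TARGET_PRESENTATIONS = 3   # presentations required per grid cell


class CoverageMap:
    """Tracks how often each eccentricity/location (relative to gaze,
    i.e., in retinal coordinates) has been stimulated so far."""

    def __init__(self):
        n = int(2 * FIELD_RADIUS_DEG / GRID_DEG)
        self.counts = np.zeros((n, n), dtype=int)

    def _cell(self, x_deg, y_deg):
        i = int((x_deg + FIELD_RADIUS_DEG) / GRID_DEG)
        j = int((y_deg + FIELD_RADIUS_DEG) / GRID_DEG)
        return i, j

    def record(self, x_deg, y_deg):
        self.counts[self._cell(x_deg, y_deg)] += 1

    def undertested(self):
        """Return retinal locations (degrees) not yet sufficiently covered."""
        cells = np.argwhere(self.counts < TARGET_PRESENTATIONS)
        return [(-FIELD_RADIUS_DEG + (i + 0.5) * GRID_DEG,
                 -FIELD_RADIUS_DEG + (j + 0.5) * GRID_DEG) for i, j in cells]


def next_stimulus(gaze_xy, coverage, screen_half_width_deg=40.0):
    """Given the current gaze location (degrees, screen coordinates),
    opportunistically pick the next stimulus: an undertested retinal
    location whose on-screen position still fits on the display."""
    gx, gy = gaze_xy
    candidates = [
        (gx + ex, gy + ey)                    # on-screen stimulus position
        for ex, ey in coverage.undertested()  # (ex, ey) = retinal offset
        if abs(gx + ex) < screen_half_width_deg
        and abs(gy + ey) < screen_half_width_deg
    ]
    if not candidates:
        return None  # user-defined field tested at the target resolution
    return random.choice(candidates)  # pseudo-random among valid locations
```

After each presentation, the coverage map would be updated with the retinal offset (stimulus position minus gaze at onset), and the loop repeats from the subject's new gaze location.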

Results : Preliminary testing with VORP revealed that at least the following information can be extracted during a VORP visual field testing session (Figs. 1, 2): stimulus eccentricity; trajectory, direction, and velocity of the eye/gaze change; and the achieved accuracy of fixating on the stimulus (if perceived).
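
As an illustration of how these quantities might be derived from the recorded gaze samples, consider the following sketch under assumed inputs; the function name, field names, and the 2° capture criterion are hypothetical, not the authors' analysis code:

```python
import numpy as np


def trajectory_metrics(t, gaze_xy, stimulus_xy, capture_radius_deg=2.0):
    """Summarize one gaze trajectory toward a presented stimulus.

    t           : (N,) sample timestamps in seconds
    gaze_xy     : (N, 2) gaze positions in degrees
    stimulus_xy : (2,) stimulus position in degrees
    """
    t = np.asarray(t, dtype=float)
    g = np.asarray(gaze_xy, dtype=float)
    s = np.asarray(stimulus_xy, dtype=float)

    # Stimulus eccentricity: gaze-to-stimulus distance at presentation onset.
    eccentricity = float(np.linalg.norm(s - g[0]))

    # Velocity of the gaze change along the trajectory (deg/s).
    speed = np.linalg.norm(np.diff(g, axis=0), axis=1) / np.diff(t)

    # Overall direction of the gaze change, as a unit vector.
    net = g[-1] - g[0]
    norm = np.linalg.norm(net)
    direction = net / norm if norm > 0 else net

    # Achieved accuracy: residual gaze-to-stimulus distance at trajectory
    # end; the stimulus counts as perceived if the gaze ends within an
    # assumed capture radius.
    final_error = float(np.linalg.norm(s - g[-1]))

    return {
        "eccentricity_deg": eccentricity,
        "peak_speed_deg_per_s": float(speed.max()),
        "direction_unit": direction,
        "final_error_deg": final_error,
        "stimulus_perceived": final_error <= capture_radius_deg,
    }
```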

Conclusions : The overall VORP test time depends on how quickly VORP can complete the visual field screening given the subject's eye movements during the test. The clinical value of VORP depends directly on the spatio-temporal accuracy of the eye/gaze tracking. The advantages over standard automated perimetry are that neither fixation nor active feedback is required of the subject, and that VORP can opportunistically and automatically adjust the spatio-temporal stimulus arrangement in real time.

This abstract was presented at the 2019 ARVO Annual Meeting, held in Vancouver, Canada, April 28 - May 2, 2019.

 

Figure 1. Gaze trajectory (red) upon stimulus presentation (blue)

Figure 2. All gaze trajectories obtained with VORP
