ARVO Annual Meeting Abstract | June 2021
Volume 62, Issue 8 | Open Access
Hand-Eye Coordination in Virtual Reality under Simulated Ultra-Low Vision Conditions
Author Affiliations & Notes
  • Gislin Dagnelie
    Ophthalmology, Johns Hopkins University School of Medicine, Baltimore, Maryland, United States
  • Arathy Kartha
    Ophthalmology, Johns Hopkins University School of Medicine, Baltimore, Maryland, United States
  • Roksana Sadeghi
    Biomedical Engineering, Johns Hopkins University School of Medicine, Baltimore, Maryland, United States
  • Soo Hyun Lee
    Ophthalmology, Johns Hopkins University School of Medicine, Baltimore, Maryland, United States
  • Thom Swanson
    BaltiVirtual LLC, Baltimore, Maryland, United States
  • Will Gee
    BaltiVirtual LLC, Baltimore, Maryland, United States
  • Footnotes
    Commercial Relationships   Gislin Dagnelie, Astellas Pharma (P); Arathy Kartha, None; Roksana Sadeghi, None; Soo Hyun Lee, None; Thom Swanson, None; Will Gee, None
    Support  R01 EY028452
Investigative Ophthalmology & Visual Science June 2021, Vol.62, 3575. doi:
Gislin Dagnelie, Arathy Kartha, Roksana Sadeghi, Soo Hyun Lee, Thom Swanson, Will Gee; Hand-Eye Coordination in Virtual Reality under Simulated Ultra-Low Vision Conditions. Invest. Ophthalmol. Vis. Sci. 2021;62(8):3575.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Purpose : To develop calibrated measures of hand-eye coordination in virtual reality (VR) for individuals with ultra-low vision (ULV), allowing objective quantification of visual ability in realistic activities of daily living (ADLs).

Methods : Based on an inventory of ADLs valued by ULV individuals (Adeyemo et al., TVST 2017) and prior data from visual information-gathering ADLs in VR (Kartha et al., ARVO 2019/2020), we created 20 scenes with hand-eye coordination ADLs; examples include locating and flipping a light switch, giving a high five to an avatar, picking up common objects, building a block tower, sorting pills, putting on a mitt, and baking cookies and a pancake. Most scenes were implemented at 3 visibility levels by varying contrast or size, for a total of 55 activities. Scenes were presented in a Vive Pro Eye VR headset, with a Leap Motion hand tracker used to visualize the subject's hand. Subjects were allowed to practice manipulating objects under normal vision (NV). Simulated ULV (sULV) was achieved in these normally sighted subjects through Bangerter foils, reducing visual acuity to 2.0 logMAR. Performance was compared across observers and across vision status (NV vs. sULV).
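
As a rough illustration of the task structure described above (20 scenes, most at 3 visibility levels, 55 activities in total) and of timing each activity from presentation to completion, here is a minimal Python sketch. All names, the per-scene level counts, and the session driver are hypothetical stand-ins, not the study's actual implementation:

```python
import random
import time
from dataclasses import dataclass


@dataclass
class Activity:
    scene: str             # e.g. "light_switch", "block_tower", "pill_sorting"
    visibility_level: int  # 1 (easiest) to 3 (hardest), varied via contrast or size


# Hypothetical reconstruction: 20 scenes, 15 with 3 visibility levels and
# 5 with 2, giving the 55 activities mentioned in the Methods.
SCENES = [f"scene_{i:02d}" for i in range(20)]


def build_activity_set() -> list[Activity]:
    activities = []
    for i, scene in enumerate(SCENES):
        levels = 3 if i < 15 else 2  # 15 * 3 + 5 * 2 = 55
        activities.extend(Activity(scene, lvl) for lvl in range(1, levels + 1))
    return activities


def run_activity(activity: Activity) -> float | None:
    """Present one scene and return the completion time in seconds (None if failed)."""
    start = time.monotonic()
    # Placeholder for the headset render / hand-tracking loop; here we simply
    # simulate a task that succeeds 90% of the time.
    time.sleep(random.uniform(0.01, 0.05))
    completed = random.random() < 0.9
    return time.monotonic() - start if completed else None


if __name__ == "__main__":
    activities = build_activity_set()
    random.shuffle(activities)  # randomize presentation order
    times = {(a.scene, a.visibility_level): run_activity(a) for a in activities}
    done = [t for t in times.values() if t is not None]
    print(f"completed {len(done)} of {len(activities)} activities")
```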

Results : The four subjects completed 98% of activities in NV and 81% in sULV. Median [IQR] completion times were 4.0 s [2.0, 9.7] in NV and 6.4 s [3.7, 17.8] in sULV; 67% of completed activities required more time in sULV than in NV (33% more on average), with 20 of 55 requiring significantly more time (by ANOVA). Rank correlations of completion times between NV and sULV within observers ranged from 0.62 to 0.71, suggesting that task difficulties were unequally affected by vision degradation. Rank correlations between observers ranged from 0.81 to 0.89 for NV and from 0.65 to 0.89 for sULV, suggesting that subjects were unequally affected despite equal levels of vision degradation.
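
The summary statistics used above (pooled median [IQR] completion times, within- and between-observer Spearman rank correlations, per-activity ANOVA) can be computed on data of this shape as in the sketch below; the numbers it prints come from synthetic stand-in data, since the raw completion times are not part of the abstract:

```python
from itertools import combinations

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic stand-in data: completion times (s) for 55 activities x 4 subjects
# under normal vision (NV) and simulated ultra-low vision (sULV).
nv = rng.lognormal(mean=1.4, sigma=0.6, size=(55, 4))
sulv = nv * rng.lognormal(mean=0.3, sigma=0.4, size=(55, 4))

# Pooled median [IQR] completion times, in the abstract's format.
for label, data in [("NV", nv), ("sULV", sulv)]:
    q1, med, q3 = np.percentile(data, [25, 50, 75])
    print(f"{label}: {med:.1f} s [{q1:.1f}, {q3:.1f}] (median [IQR])")

# Within-observer rank correlation of per-activity times, NV vs. sULV.
for s in range(4):
    rho, _ = stats.spearmanr(nv[:, s], sulv[:, s])
    print(f"subject {s + 1}: NV vs. sULV Spearman rho = {rho:.2f}")

# Between-observer rank correlations within one vision condition (here NV).
for s1, s2 in combinations(range(4), 2):
    rho, _ = stats.spearmanr(nv[:, s1], nv[:, s2])
    print(f"subjects {s1 + 1} & {s2 + 1}: NV Spearman rho = {rho:.2f}")

# Per-activity one-way ANOVA comparing NV and sULV times across the 4 subjects
# (with two groups this reduces to a two-sample F test).
n_sig = sum(stats.f_oneway(nv[a], sulv[a]).pvalue < 0.05 for a in range(55))
print(f"{n_sig} of 55 activities differ significantly between NV and sULV (p < 0.05)")
```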

Conclusions : Most of these activities, representative of ADLs valued by individuals with ULV, could be completed by individuals with sULV equivalent to 20/2000. The wide range of completion times in sULV suggests that these activities span a broad difficulty range, as required to cover the full spectrum of ULV. The activities will next be validated in our population of individuals with ULV due to a wide variety of conditions.

This is a 2021 ARVO Annual Meeting abstract.
