June 2022, Volume 63, Issue 7
Open Access
ARVO Annual Meeting Abstract
Functional Assessment of Hand-Eye Coordination in Individuals with Ultra-Low Vision Using Virtual Reality
Author Affiliations & Notes
  • Ravnit Singh
    Johns Hopkins University Zanvyl Krieger School of Arts and Sciences, Baltimore, Maryland, United States
  • Arathy Kartha
    Johns Hopkins Medicine, Baltimore, Maryland, United States
  • Roksana Sadeghi
    Johns Hopkins Medicine, Baltimore, Maryland, United States
  • Chau Tran
    BaltiVirtual, Maryland, United States
  • Thom Swanson
    BaltiVirtual, Maryland, United States
  • Chris Bradley
    Johns Hopkins Medicine, Baltimore, Maryland, United States
  • Gislin Dagnelie
    Johns Hopkins Medicine, Baltimore, Maryland, United States
  • Footnotes
    Commercial Relationships   Ravnit Singh None; Arathy Kartha None; Roksana Sadeghi None; Chau Tran None; Thom Swanson None; Chris Bradley None; Gislin Dagnelie None
    Support  NIH NEI R01EY028452
Investigative Ophthalmology & Visual Science June 2022, Vol. 63, 4056 – F0020.
Abstract

Purpose : Ultra-low vision (ULV) is defined as visual acuity ≤ 20/1600. Currently, there are no standardized tools for assessing hand-eye coordination in individuals with vision in the ULV range. The purpose of this study was to develop and validate a test of hand-eye coordination for people with ULV.

Methods : Eleven participants with ULV performed simulated tasks requiring hand-eye coordination, such as flipping a light switch or placing silverware on a dinner table, in a VR headset. A Leap Motion hand tracker allowed each participant to see a rendering of their hand while performing a task. Each task was presented at three visibility levels (high, medium, and low) and was divided into 5 pre-defined steps (e.g., 5 defined steps to make a pancake). The results were then analyzed using the method of successive dichotomizations, a polytomous Rasch model, to estimate person measures (person ability) and item measures (task difficulty).
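
For readers unfamiliar with Rasch analysis, the sketch below illustrates the dichotomous core of this kind of estimation; it is a minimal joint maximum-likelihood fit in Python, not the authors' implementation. The method of successive dichotomizations extends this idea by dichotomizing ordinal ratings at successive thresholds. All names and the simulated data here are illustrative.

```python
import numpy as np

def fit_rasch(scores, n_iter=500, lr=0.5):
    """Minimal joint maximum-likelihood fit of a dichotomous Rasch model.

    scores: (persons x items) binary matrix, 1 = task step completed.
    Returns person measures (ability) and item measures (difficulty),
    both in logits, with mean item difficulty anchored at zero.
    Note: extreme (all-0 or all-1) rows/columns are not handled here.
    """
    n_persons, n_items = scores.shape
    theta = np.zeros(n_persons)   # person ability
    beta = np.zeros(n_items)      # item difficulty
    for _ in range(n_iter):
        # P(success) = logistic(ability - difficulty)
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - beta[None, :])))
        resid = scores - p
        # Gradient ascent on the Rasch log-likelihood
        theta += lr * resid.sum(axis=1) / n_items
        beta -= lr * resid.sum(axis=0) / n_persons
        beta -= beta.mean()       # anchor: mean item measure = 0 (by convention)
    return theta, beta

# Illustrative use with simulated data (11 persons, 20 items)
rng = np.random.default_rng(0)
true_theta = rng.normal(1.0, 2.0, 11)
true_beta = rng.normal(0.0, 1.8, 20)
p = 1.0 / (1.0 + np.exp(-(true_theta[:, None] - true_beta[None, :])))
scores = (rng.random(p.shape) < p).astype(float)
theta_hat, beta_hat = fit_rasch(scores)
```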

Results : Estimated item measures ranged from -3.49 to 3.97 logits, with a mean of zero (by convention) and an SD of 1.8 logits (Fig 1). The most difficult item was sorting pills (3.97 logits); the least difficult was building a tower from high-contrast blocks of decreasing size (-3.49 logits). Person measures ranged from -2.9 to 4.3 logits, with a mean (SD) of 1.06 (2.09) logits. A t-test showed that the means of the two distributions did not differ significantly (p = 0.2), indicating that the items (tasks) were well targeted to the sample of persons.
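
A sketch of this targeting check: a two-sample t-test compares the means of the person and item measure distributions (Welch's variant is used below via scipy.stats.ttest_ind; the abstract does not specify which variant was used). The measures here are simulated placeholders, not the study's data.

```python
import numpy as np
from scipy.stats import ttest_ind

# Placeholder logit measures; substitute the estimates from the Rasch fit.
item_measures = np.random.default_rng(1).normal(0.0, 1.8, 20)
person_measures = np.random.default_rng(2).normal(1.06, 2.09, 11)

# Welch's t-test (unequal variances); a non-significant p suggests the
# item difficulties are well targeted to the person abilities.
t_stat, p_value = ttest_ind(person_measures, item_measures, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```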

Conclusions : The results show a set of items with a good spread of difficulty that can be used to assess hand-eye coordination in individuals with ULV across different levels of functional ability. We will continue testing in a larger sample using a reduced item set, eliminating items with similar item measures to reduce redundancy and shorten testing time.
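
One simple way to carry out the item reduction described above (an assumption; the abstract does not specify the procedure) is a greedy pass that keeps only items whose difficulty differs from the previously kept item by at least a minimum gap:

```python
def reduce_items(item_measures, min_gap=0.5):
    """Greedy pruning of items with near-duplicate difficulty.

    item_measures: dict mapping item name -> difficulty in logits.
    min_gap is a hypothetical tuning parameter, not a value from the study.
    """
    kept = []
    for name, measure in sorted(item_measures.items(), key=lambda kv: kv[1]):
        if not kept or measure - kept[-1][1] >= min_gap:
            kept.append((name, measure))
    return kept

# e.g., reduce_items({"sort pills": 3.97, "build tower": -3.49, ...})
```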

This abstract was presented at the 2022 ARVO Annual Meeting, held in Denver, CO, May 1-4, 2022, and virtually.
