June 2022
Volume 63, Issue 7
Open Access
ARVO Annual Meeting Abstract
Visual Wayfinding in people with Ultra Low Vision using Virtual Reality
Author Affiliations & Notes
  • Arathy Kartha
    Ophthalmology, Johns Hopkins University, Baltimore, Maryland, United States
  • Roksana Sadeghi
    Ophthalmology, Johns Hopkins University, Baltimore, Maryland, United States
    Biomedical Engineering, Johns Hopkins University, Baltimore, Maryland, United States
  • Thom Swanson
    BaltiVirtual, Baltimore, Maryland, United States
  • Gislin Dagnelie
    Ophthalmology, Johns Hopkins University, Baltimore, Maryland, United States
  • Footnotes
    Commercial Relationships: Arathy Kartha, None; Roksana Sadeghi, None; Thom Swanson, None; Gislin Dagnelie, None
    Support: NIH R01EY028452
Investigative Ophthalmology & Visual Science June 2022, Vol. 63, 4215 – A0143.
Purpose: People with ultra-low vision (ULV) rely on nonvisual strategies such as echolocation and a white cane for navigation and wayfinding. Little is known about how their residual vision can support efficient and safe navigation in unfamiliar settings and environments. The purpose of this study was to develop and calibrate a virtual reality tool for assessing visual wayfinding in people with ULV.

Methods: Fifteen participants with ULV completed wayfinding tasks in three virtual reality settings: a street crossing, a cafeteria, and a metro station. Each scenario was presented under different levels of visual clutter to impose a range of visual demands and cognitive loads. Task completion times and numbers of collisions were recorded. Habitual cane users (n=7) were tested both with and without their cane. Visual acuity was estimated for all participants using the Berkeley Rudimentary Vision Test.
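The trial structure described above (three scenes, each under multiple clutter levels, with cane and no-cane conditions for habitual cane users) can be sketched as a simple data layout. This is a hypothetical illustration only: the scene names, clutter levels, and all identifiers below are assumptions, not taken from the study's actual software.

```python
# Illustrative sketch of how trials in a study like this might be organized.
# All names and values here are invented for illustration.
from dataclasses import dataclass
from itertools import product

SCENES = ["street_crossing", "cafeteria", "metro_station"]
CLUTTER_LEVELS = ["low", "medium", "high"]  # assumed clutter levels

@dataclass
class Trial:
    scene: str
    clutter: str
    with_cane: bool
    completion_time_s: float = 0.0  # recorded outcome: task completion time
    collisions: int = 0             # recorded outcome: number of collisions

def build_trial_block(habitual_cane_user: bool):
    """Every scene x clutter combination; habitual cane users
    repeat each combination with and without a cane."""
    cane_conditions = [True, False] if habitual_cane_user else [False]
    return [Trial(scene, clutter, cane)
            for scene, clutter, cane in product(SCENES, CLUTTER_LEVELS, cane_conditions)]

block = build_trial_block(habitual_cane_user=True)
print(len(block))  # 3 scenes x 3 clutter levels x 2 cane conditions = 18
```

Keeping each trial as one record with its two recorded outcomes (completion time and collisions) makes the later per-condition comparisons straightforward.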

Results: The mean number of collisions ranged from 0.7 to 5.3 across participants and scenes. The number of collisions increased with increasing clutter and was significantly associated with estimated visual acuity (R2 = 0.6, p < 0.05). Mean task completion times ranged from 25.6 to 87.2 s across scenes and were also significantly associated with estimated visual acuity (R2 = 0.75, p < 0.005). We did not observe a speed versus accuracy trade-off, probably because participants were asked to walk at their normal walking speed. Participants had significantly more total collisions without a cane (80) than with a cane (54; p < 0.05). Similarly, mean task completion time was significantly longer without a cane (478.2 s) than with a cane (342.9 s; p < 0.05).
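The reported R2 values describe the strength of a linear association between estimated visual acuity and each performance measure. As a minimal sketch of how such a value is computed, the snippet below derives R2 as the square of the Pearson correlation coefficient; the acuity and collision numbers are invented for illustration and are not the study's data.

```python
# Minimal sketch: R^2 as the squared Pearson correlation between
# estimated visual acuity and a performance measure (e.g., collisions).
# The data below are hypothetical, not from the study.

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

acuity = [2.0, 2.3, 2.6, 2.9, 3.2]      # hypothetical logMAR estimates
collisions = [0.8, 1.9, 2.7, 4.1, 5.0]  # hypothetical mean collisions

r_squared = pearson_r(acuity, collisions) ** 2
print(f"R^2 = {r_squared:.2f}")
```

A significance test on the correlation (yielding the p-values quoted above) would typically accompany this, e.g. via `scipy.stats.pearsonr`.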

Conclusions: Overall, performance on the wayfinding test presented in virtual reality was consistent with the visual acuity of our participants with ULV, and, once calibrated, the test could serve as a reliable functional mobility assessment for ULV. It could also be used as a rehabilitation tool to improve wayfinding in people with ULV by training them to reduce collisions in the safety of a virtual environment. Even though there were no real obstacles to detect, using a cane provided an advantage for habitual users.

This abstract was presented at the 2022 ARVO Annual Meeting, held in Denver, CO, May 1-4, 2022, and virtually.
