May 2004
Volume 45, Issue 13
ARVO Annual Meeting Abstract  |   May 2004
Effects of image stabilization on face recognition and virtual mobility using simulated prosthetic vision
Author Affiliations & Notes
  • G. Dagnelie
    Lions Vision Center, Johns Hopkins University, Baltimore, MD
  • A.J. Kelley
    Lions Vision Center, Johns Hopkins University, Baltimore, MD
  • L. Yang
    Lions Vision Center, Johns Hopkins University, Baltimore, MD
  • Footnotes
    Commercial Relationships  G. Dagnelie, None; A.J. Kelley, None; L. Yang, None.
    Support  EY12843 and Foundation Fighting Blindness
Investigative Ophthalmology & Visual Science May 2004, Vol. 45, 4223.
G. Dagnelie, A.J. Kelley, L. Yang; Effects of image stabilization on face recognition and virtual mobility using simulated prosthetic vision. Invest. Ophthalmol. Vis. Sci. 2004;45(13):4223.

Abstract

Purpose: Prosthetic vision recipients will experience stabilized images as they move their eyes and, if a head-mounted camera is used, will execute an inappropriate vestibulo-ocular reflex as they turn their head. This study investigates the impact of these anomalies on two tasks representative of daily visual activities, and possible ways to adapt to them.

Methods: Normally sighted observers wearing a video headset with pupil-tracking capability were asked to perform two tasks in pixelized vision, using 10x10 to 32x32 dot grids with varying dot parameters and 100% or 12.5% contrast: 1. Identification of a facial image from among 4 possible (non-pixelized) matches; the 4 faces were selected from a library of 60 models and were matched by race and gender. 2. Mobility through a series of rooms in a virtual building with varying floor plans. Each subject performed both tasks in free-viewing and stabilized conditions; in the stabilized condition the dots moved with eye position, and inspection of the image was accomplished by scrolling it across the dots with the computer mouse. All conditions except contrast level were interspersed in pseudo-random order to distribute any learning effects equally among conditions. All subjects had ample experience viewing pixelized images.

Results: Similar to a recent report on performance under free-viewing conditions (Thompson et al., IOVS 44, 5035-42), subjects showed considerable variability in face recognition accuracy and timing (both error rates and response times varied by factors of 2-4), but all subjects performed well above chance (25%) for all but a few parameter settings. Contrary to reading performance (Kelley et al., ARVO 2004), however, performance of this task was only minimally affected by image stabilization: both accuracy and response times were distributed equally across free-viewing and stabilized conditions. Mobility performance (travel time and number of hits) in the virtual space was affected by image stabilization, but still considerably less than reading. In this test, too, between-subject variability was considerable. We are expanding testing to visually impaired subjects.

Conclusions: Future visual prosthesis wearers are likely to be impaired differentially across tasks by the effects of phosphene stabilization: mobility and social interactions are likely to be acquired more rapidly than more visually demanding tasks such as reading, but performance of these more demanding tasks is likely to result in mixed rehabilitative success of epi-retinal and cortical prostheses.

Keywords: low vision • face perception • motion-2D