ARVO Annual Meeting Abstract  |   May 2005
Visual Performance With Images Spectrally Augmented by Infrared: A Tool for Severely Impaired and Prosthetic Vision
Author Affiliations & Notes
  • G. Dagnelie
    Lions Vision Center, Johns Hopkins University, Baltimore, MD
  • S. Kalpin
    Advanced Medical Electronics Corp., Minneapolis, MN
  • L. Yang
    Lions Vision Center, Johns Hopkins University, Baltimore, MD
  • G. Legge
    Psychology, University of Minnesota, Minneapolis, MN
  • Footnotes
    Commercial Relationships  G. Dagnelie, None; S. Kalpin, Advanced Medical Electronics Corp. E; L. Yang, None; G. Legge, None.
  • Footnotes
    Support  EY014727, EY12843
Investigative Ophthalmology & Visual Science May 2005, Vol.46, 1490.
G. Dagnelie, S. Kalpin, L. Yang, G. Legge; Visual Performance With Images Spectrally Augmented by Infrared: A Tool for Severely Impaired and Prosthetic Vision. Invest. Ophthalmol. Vis. Sci. 2005;46(13):1490.

Abstract

Purpose: Individuals with severe visual impairment and future visual prosthesis wearers derive little or no chromatic information from their visual input, but may be assisted by spectral information outside the visible range. We are using a prototype device, developed under an SBIR to AME Corp., that combines a video camera with a micro-bolometer sensor to give these individuals access to far-infrared (heat) as well as visible image information.

Methods: Normally-sighted subjects were tested by viewing video clips, either pixelized (i.e., simulated prosthetic vision) to 6x10 dots with 2° separation or low-pass filtered at 0.5 c/°, containing varying proportions (100:0, 75:25, 50:50, 25:75, and 0:100) of visible and IR information. In a first series of tests, the task was to identify liquid-filled cups differing in temperature, liquid level, or shape. In a second series, the task was to make judgments about people present in a room or walking through a scene and about individuals' gait and posture, and to locate empty seats in a conference room. Performance was scored in terms of accuracy (% correct) and response time.

Results: Cup identification tests demonstrated that temperature information can be a highly reliable factor in the decision-making process. When the distinguishing information was provided primarily as temperature differences, discriminations were at chance in visible light, whereas they were performed easily and quickly when the IR-to-visible ratio was at least 50:50. This was true for objects with temperatures above as well as below room temperature; conversely, discrimination of objects at room temperature that were distinguishable in visible light fell to chance when the IR-to-visible ratio exceeded 50:50. In preliminary tests, judgments of scenes involving human actions benefited from a combination of visible and IR information, with a 50:50 ratio as the optimum. We are expanding this series of experiments and including subjects with severe low vision.

Conclusions: Our tests indicate that temperature/heat information may contribute to the identification of objects and persons when the visible image provides insufficient discriminatory information. Such infrared-augmented imagery may aid people with severe low vision and future visual prosthesis wearers in the performance of daily tasks. We are proceeding with further tests and intend to develop a more compact prototype in a phase II SBIR to AME Corp.
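To make the mixing and pixelization procedure concrete, the sketch below shows one plausible way to blend a visible frame with a co-registered far-infrared frame at a chosen ratio and then reduce the result to a 6x10 dot pattern, as in the simulated-prosthetic condition. The abstract does not specify the implementation, so the grayscale arrays, linear blending, block-averaging, and function names (blend_visible_ir, pixelize) used here are illustrative assumptions, not the authors' method.

    # Illustrative sketch only; not the authors' implementation.
    # Assumes the visible and thermal frames are grayscale NumPy arrays of
    # equal size with values in [0, 1], already spatially registered.
    import numpy as np

    def blend_visible_ir(visible, ir, ir_fraction):
        """Linearly mix visible and IR frames; ir_fraction=0.5 gives a 50:50 ratio."""
        return (1.0 - ir_fraction) * visible + ir_fraction * ir

    def pixelize(image, rows=6, cols=10):
        """Reduce an image to a rows x cols grid of block averages (simulated dot pattern)."""
        h, w = image.shape
        grid = np.zeros((rows, cols))
        for r in range(rows):
            for c in range(cols):
                block = image[r * h // rows:(r + 1) * h // rows,
                              c * w // cols:(c + 1) * w // cols]
                grid[r, c] = block.mean()
        return grid

    # Example: a 50:50 visible/IR mix rendered as a 6x10 dot pattern
    visible = np.random.rand(240, 320)   # placeholder visible frame
    ir = np.random.rand(240, 320)        # placeholder thermal frame
    dots = pixelize(blend_visible_ir(visible, ir, 0.5))

The low-vision (non-pixelized) condition would instead low-pass filter the blended frame at 0.5 c/° before display; that step is omitted here for brevity.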

Keywords: low vision • space and scene perception • retinal degenerations: hereditary 