April 2009
Volume 50, Issue 13
ARVO Annual Meeting Abstract  |   April 2009
µAVS2: Microcomputer-Based Artificial Vision Support System for Real-Time Image Processing for Camera-Driven Visual Prostheses
Author Affiliations & Notes
  • W. Fink
    Visual & Autonomous Exploration Systems Research Laboratory, California Institute of Technology, Pasadena, California
  • M. A. Tarbell
    Visual & Autonomous Exploration Systems Research Laboratory, California Institute of Technology, Pasadena, California
  • Footnotes
    Commercial Relationships  W. Fink, Caltech, P; M.A. Tarbell, Caltech, P.
    Support  NSF Grant EEC-0310723
Investigative Ophthalmology & Visual Science April 2009, Vol.50, 4748.
Abstract

Purpose: To provide a standalone, battery-powered, portable microcomputing platform and software system for real-time image processing that enhances visual perception and enables independent mobility for blind users of camera-driven visual prostheses.

Methods: Because it is difficult to predict exactly what blind subjects with camera-driven visual prostheses (e.g., retinal implants) will be able to perceive, it is prudent to offer them a wide variety of image processing filters and the capability to engage these filters repeatedly, in any user-defined order, to enhance their visual perception. The Artificial Vision Simulator (AVS; Fink and Tarbell, ARVO 2005) performs real-time (e.g., 30 fps) image processing and enhancement of digital camera image streams. AVS gives visual prosthesis carriers the flexibility to fine-tune, optimize, and customize the visual perception afforded by their vision systems by actively manipulating the parameters of individual image processing filters and even altering the sequence in which those filters are applied. To attain true portability, the laptop-based AVS processing system was redesigned as the Microcomputer-based Artificial Vision Support System, µAVS2, a standalone, battery-operated, portable microcomputing platform for real-time image processing.
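
The following minimal sketch illustrates the kind of user-defined filter chain described above; it is not the actual AVS/µAVS2 code. The filter names, the parameters, and the use of OpenCV (cv2) are assumptions made for illustration only.

    # Illustrative sketch of a user-defined image-filter chain (not the actual AVS code).
    # Filter names, parameters, and the OpenCV dependency are assumptions.
    import cv2

    def blur(frame, ksize=5):
        # Smoothing filter; ksize is a user-tunable parameter (must be odd).
        return cv2.GaussianBlur(frame, (ksize, ksize), 0)

    def edge_enhance(frame, threshold1=50, threshold2=150):
        # Edge-detection filter with user-tunable thresholds.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY) if frame.ndim == 3 else frame
        return cv2.Canny(gray, threshold1, threshold2)

    def contrast_stretch(frame):
        # Stretch intensities to the full 0-255 range.
        return cv2.normalize(frame, None, 0, 255, cv2.NORM_MINMAX)

    # A user-defined chain: filters may appear in any order, any number of times,
    # and their parameters can be adjusted by the prosthesis carrier.
    filter_chain = [
        (contrast_stretch, {}),
        (blur, {"ksize": 3}),
        (edge_enhance, {"threshold1": 40, "threshold2": 120}),
        (blur, {"ksize": 5}),  # the same filter engaged a second time
    ]

    def process(frame):
        # Apply each filter in the user-defined sequence to one video frame.
        for fn, params in filter_chain:
            frame = fn(frame, **params)
        return frame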

Results: To create µAVS2, the core AVS functionality (i.e., repeated application of image processing filters in any user-defined order) was preserved in its entirety, while the architecture was completely redesigned to operate in a linear, sequential-loop fashion. This vastly reduced memory and CPU requirements during execution, making the use of a microprocessor feasible. We employed a commercial off-the-shelf, battery-powered, general-purpose miniaturized Linux processing platform. Smaller than a deck of playing cards, it is lightweight, fast (600 MHz clock speed), and equipped with USB and Ethernet interfaces. µAVS2 imports raw video frames from a USB camera, processes them through user-selected image filters, and transmits them over an outbound TCP/IP connection to the visual prosthesis system in real time.
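
As a rough illustration of the linear, sequential-loop architecture described above, the sketch below captures a frame from a USB camera, runs it through the user-selected filters (the process function from the previous sketch), and transmits the result over TCP/IP. The host address, port, and length-prefixed framing are hypothetical; the actual µAVS2 protocol is not specified here.

    # Sketch of the sequential-loop pipeline (hypothetical host, port, and framing).
    import socket
    import cv2

    PROSTHESIS_HOST = "192.168.0.10"   # hypothetical address of the prosthesis interface
    PROSTHESIS_PORT = 5000             # hypothetical port

    camera = cv2.VideoCapture(0)       # USB camera
    sock = socket.create_connection((PROSTHESIS_HOST, PROSTHESIS_PORT))

    try:
        while True:                                    # linear, sequential loop
            ok, frame = camera.read()                  # import a raw video frame
            if not ok:
                break
            frame = process(frame)                     # user-selected filter chain (see above)
            ok, encoded = cv2.imencode(".png", frame)  # serialize the processed frame
            if not ok:
                continue
            payload = encoded.tobytes()
            # Length-prefix each frame so the receiver can re-frame the byte stream.
            sock.sendall(len(payload).to_bytes(4, "big") + payload)
    finally:
        camera.release()
        sock.close()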

Conclusions: µAVS2 provides real-time image processing for artificial vision systems while maintaining portability, and thus independence, for its users. Despite its small size, µAVS2 is a general-purpose computing platform and can easily be reconfigured for other prosthetic systems. Testing of µAVS2 with actual retinal implant carriers is envisioned in the near future.

Keywords: image processing • low vision • quality of life 