May 2005
Volume 46, Issue 13
ARVO Annual Meeting Abstract  |   May 2005
Neuroengineering Tools for the Design and Test of Visual Neuroprostheses
Author Affiliations & Notes
  • E. Fernandez
    Instituto de Bioingeniería, Universidad Miguel Hernandez, San Juan de Alicante, Spain
  • C. Morillas
    Arquitectura y Tecnologia de Computadores, Universidad de Granada, Granada, Spain
  • S. Romero
    Arquitectura y Tecnologia de Computadores, Universidad de Granada, Granada, Spain
  • A. Martínez
    Arquitectura y Tecnologia de Computadores, Universidad de Granada, Granada, Spain
  • F. Pelayo
    Arquitectura y Tecnologia de Computadores, Universidad de Granada, Granada, Spain
  • Footnotes
    Commercial Relationships  E. Fernandez, None; C. Morillas, None; S. Romero, None; A. Martínez, None; F. Pelayo, None.
  • Footnotes
    Support  QLK6–CT–2001–00279, TIC2003–09557–CO2–02, DPI2004–07032
Investigative Ophthalmology & Visual Science May 2005, Vol.46, 1483. doi:
      E. Fernandez, C. Morillas, S. Romero, A. Martínez, F. Pelayo; Neuroengineering Tools for the Design and Test of Visual Neuroprostheses . Invest. Ophthalmol. Vis. Sci. 2005;46(13):1483.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Purpose: We present a suite of software/hardware tools conceived to (a) easily design retina-like encoders of visual information, (b) simulate them and check their responses against biological recordings, (c) automatically build reconfigurable electronic circuits that implement the encoders in real time, and (d) tune and adjust prosthesis parameters for implanted individuals.

Methods: The system is composed of a set of bio-inspired visual pre-processing modules and a spike-coding block based on a leaky integrate-and-fire model of spiking neurons. It encodes the visual information captured by a standard camera, identifies the target electrodes, and produces optimal codes for stimulation. The model is highly parameterized. A communication scheme based on address-event representation is used to send the output pulses to the implanted microelectrodes. The system runs on a PC for design and simulation, but it can be automatically synthesized into a single chip for portable real-time stimulation.

Results: We successfully developed and tested a real-time retina-like encoder containing a highly flexible retina model, largely independent of lighting conditions and able to deliver to each microelectrode the maximum possible amount of "usable information". The camera input can be configured with irregular distributions of photoreceptors and processed by a set of programmable spatio-temporal filters, including multichannel color-contrast enhancement. This retina model, complemented with the spiking neural module, forms an "artificial retina" that translates image sequences into a continuous stream of short pulses representing spike events. When the system output is compared with multielectrode recordings from biological retinas (human, rabbit, turtle, rat), the artificial coding module generates similar patterns in response to the same types of stimuli.
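The spike-coding block described in Methods can be illustrated with a minimal sketch of a discrete-time leaky integrate-and-fire neuron. This is an assumption-laden toy model, not the authors' actual implementation: the leak factor, threshold, and reset rule are illustrative parameter choices.

```python
# Hedged sketch of a discrete-time leaky integrate-and-fire (LIF) encoder.
# All parameter values (leak, threshold, reset-to-zero) are illustrative
# assumptions, not the parameters used in the actual prosthesis system.
def lif_encode(inputs, leak=0.9, threshold=1.0):
    """Convert a sequence of filtered intensity values into spike times.

    inputs: iterable of non-negative input currents, one per time step.
    Returns the list of time steps at which the neuron fired.
    """
    v = 0.0           # membrane potential
    spikes = []
    for t, x in enumerate(inputs):
        v = leak * v + x      # leaky integration of the input current
        if v >= threshold:    # fire when the potential crosses threshold
            spikes.append(t)
            v = 0.0           # reset after each spike
    return spikes
```

For a constant input of 0.5 per step with no leak, the neuron fires every second step, giving the kind of rate code that can then be routed to a microelectrode.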
Conclusions: Through a set of parameterized filters and functions, we obtain a model that can be automatically translated into portable, autonomous hardware. Although the model is essentially analog, we chose a digital implementation for a more flexible and standard approach, which can be easily customized for each implanted patient through a graphical user interface. The whole system facilitates the diagnosis, design, and testing of visual neuroprostheses.
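The address-event representation mentioned in Methods transmits each spike as the address of the electrode that fired, ordered by firing time. The sketch below shows the idea only; the event format and addressing scheme are illustrative assumptions, not the protocol actually used by the system.

```python
# Hedged sketch of address-event representation (AER): spike trains from
# many electrodes are merged into one time-ordered stream of
# (time, electrode_address) events. The tuple format is an illustrative
# assumption, not the authors' actual wire protocol.
def to_aer(spike_trains):
    """spike_trains: dict mapping electrode address -> list of spike times.

    Returns a single chronologically ordered list of (time, address) events,
    ready to be sent over a shared communication channel.
    """
    events = [(t, addr)
              for addr, times in spike_trains.items()
              for t in times]
    events.sort()  # chronological order; ties broken by address
    return events
```

Because only addresses of firing electrodes are transmitted, the channel bandwidth scales with spike activity rather than with the number of electrodes, which is the usual motivation for AER in neuromorphic systems.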

Keywords: visual impairment: neuro-ophthalmological disease • low vision 