ARVO Annual Meeting Abstract  |  June 2017
Volume 58, Issue 8
Open Access
Simplified detection of ON/OFF receptive fields using a nonlinear input model
Author Affiliations & Notes
  • Hope Shi
    Biology, University of Maryland, College Park, Maryland, United States
  • Alexandra Boukhvalova
    Biology, University of Maryland, College Park, Maryland, United States
  • Daniel Butts
    Biology, University of Maryland, College Park, Maryland, United States
  • Joshua H Singer
    Biology, University of Maryland, College Park, Maryland, United States
  • Footnotes
    Commercial Relationships   Hope Shi, None; Alexandra Boukhvalova, None; Daniel Butts, None; Joshua Singer, None
    Support  University of Maryland, Department of Biology, Brain and Behavior Initiative seed grant
Investigative Ophthalmology & Visual Science June 2017, Vol. 58, 2579.
      Hope Shi, Alexandra Boukhvalova, Daniel Butts, Joshua H Singer; Simplified detection of ON/OFF receptive fields using a nonlinear input model. Invest. Ophthalmol. Vis. Sci. 2017;58(8):2579.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Purpose : Retinal ganglion cells (GCs) often are classified as ON, OFF, or ON/OFF based on their responses to simple stimuli, such as full-field square-wave flashes. Such analysis, however, fails to capture the computations performed within the receptive fields (RFs) of GCs because the stimuli are spatially and temporally uniform. This problem often is addressed by the use of white-noise stimuli, which permit the RFs of GCs to be mapped and their responses to be described with linear-nonlinear (LN) models. LN analysis fails to describe the most numerous GCs in the rodent retina, ON/OFF cells, because it treats the circuitry presynaptic to the GC as a single input. Here, we use a spatially correlated noise stimulus combined with analysis by a novel nonlinear input model (NIM) to 1) demonstrate a methodology for characterizing ON/OFF GCs and 2) extract multiple parallel inputs to GCs. This approach is compared to LN analyses and is demonstrated to offer several advantages.
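
For orientation, the NIM describes a cell's firing rate as a spiking nonlinearity applied to a sum of nonlinear subunit inputs; a schematic form (our notation, not taken from this abstract) is

    r(t) = F\left( \sum_i w_i \, f_i\!\left( \mathbf{k}_i \cdot \mathbf{s}(t) \right) \right)

where \mathbf{s}(t) is the stimulus, \mathbf{k}_i is the filter of subunit i, f_i is its upstream nonlinearity, w_i is a signed weight (positive for excitatory, negative for suppressive subunits), and F is the spiking nonlinearity. The LN model is the special case of a single subunit with a linear f, which is why it cannot represent the rectified ON and OFF inputs that converge onto an ON/OFF cell.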

Methods : Light responses of GCs in the ventral mouse retina were recorded using a 60-channel multi-electrode array mounted on an inverted microscope. UV light stimuli from a modified DLP projector were delivered through the objective. Stimuli included full-field square waves, standard Gaussian white-noise checkerboards, and spatially correlated (“cloud”) noise generated by low-pass filtering the Gaussian checkerboards. RFs of GCs were characterized by a separable NIM, which describes the responses of GCs as reflecting the integrated outputs of excitatory and suppressive input subunits; the performance of the NIM was compared to that of a standard LN model. All parameters were determined using maximum a posteriori optimization.
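
As an illustration of the stimulus design, a minimal sketch of one way to generate spatially correlated ("cloud") noise by spatially low-pass filtering Gaussian white-noise frames is given below. The frame size, filter width, and normalization are illustrative assumptions, not the parameters used in these experiments.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def make_cloud_stimulus(n_frames=1000, n_pix=64, sigma_pix=4.0, seed=0):
        """Generate spatially correlated ('cloud') noise frames.

        Each frame starts as Gaussian white noise and is spatially low-pass
        filtered; the result is rescaled to unit variance. All parameter
        values are illustrative.
        """
        rng = np.random.default_rng(seed)
        frames = rng.standard_normal((n_frames, n_pix, n_pix))
        # Smooth in space only (sigma = 0 along the frame/time axis).
        cloud = gaussian_filter(frames, sigma=(0.0, sigma_pix, sigma_pix))
        # Restore the contrast lost to smoothing.
        cloud /= cloud.std()
        return cloud

    stim = make_cloud_stimulus()   # array of shape (1000, 64, 64)

Because the smoothing is linear, the filtered frames remain Gaussian; the spatial correlation length of the stimulus is set by sigma_pix.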

Results : LN analysis yielded descriptions of ON and OFF RFs but not of ON/OFF RFs. NIM analysis, however, revealed a significant number of ON/OFF GCs. Moreover, we observed a number of instances in which GCs that appeared to be either ON or OFF cells based on their responses to square waves were found by the NIM to be true ON/OFF cells. The ON/OFF nature of GCs revealed by NIM analysis was verified experimentally by examining responses to repeated stimuli.

Conclusions : NIM analysis of responses to a cloud stimulus captures the functional identity of GCs, describing their spike output as arising from the integration of excitatory and suppressive inputs. It thus provides a novel and useful framework for the study of retinal circuits.

This is an abstract that was submitted for the 2017 ARVO Annual Meeting, held in Baltimore, MD, May 7-11, 2017.
