Investigative Ophthalmology & Visual Science
June 2024, Volume 65, Issue 7
Open Access
ARVO Annual Meeting Abstract
Testing and optimizing RGC type selective stimuli
Author Affiliations & Notes
  • Vyom Manish Raval
    Physiology & Biophysics, University of Washington, Seattle, Washington, United States
    Medical Scientist Training Program, University of Washington, Seattle, Washington, United States
  • Michael B Manookin
    Physiology & Biophysics, University of Washington, Seattle, Washington, United States
  • Fred Rieke
    Physiology & Biophysics, University of Washington, Seattle, Washington, United States
  • Footnotes
    Commercial Relationships: Vyom Raval, None; Michael Manookin, None; Fred Rieke, None
    Support: NIH R01-EY027323, NIH R01-EY029247, NIH R01-EY028542
Citation: Vyom Manish Raval, Michael B Manookin, Fred Rieke; Testing and optimizing RGC type selective stimuli. Invest. Ophthalmol. Vis. Sci. 2024;65(7):2468.

© ARVO (1962-2015); The Authors (2016-present)
Abstract

Purpose: Different retinal ganglion cell (RGC) types provide parallel channels from the retina to the cortex, and selective impairment of these channels is implicated in disorders including glaucoma and dyslexia. However, we lack stimuli proven to selectively probe individual RGC types. Such stimuli could improve diagnostic efficacy for these diseases and help isolate the contributions of different retinal pathways to visual cortical responses.

Our study had two goals: (1) to test existing stimuli hypothesized to rely preferentially on specific RGC types, and (2) to use recent models of RGC encoding to design metameric stimulus pairs that selectively modulate one type's population responses.

Methods: We recorded spike responses from ex vivo peripheral primate retina using a multielectrode array and tested the pulsed-pedestal and steady-pedestal psychophysical paradigms (Pokorny and Smith 1997). The pedestal model's contrast-gain signatures were compared to measured midget (M) and parasol (P) white-noise nonlinearities, and M and P population activity was decoded with regression models to predict contrast differences.
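
As a rough illustration of the decoding step, the sketch below fits a cross-validated regressor to population spike counts. The Ridge model, the function name, and the data variables (X_midget, X_parasol, y) are assumptions for illustration, not the authors' exact pipeline.

```python
# Sketch: decode stimulus contrast from RGC population spike counts.
# X is a (n_trials x n_cells) spike-count matrix; y holds per-trial
# contrast labels. Ridge regression is one plausible choice among the
# "regression models" described above.
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

def decode_contrast(X, y, alpha=1.0):
    """Cross-validated R^2 for predicting contrast from population activity."""
    scores = cross_val_score(Ridge(alpha=alpha), X, y, cv=5, scoring="r2")
    return scores.mean(), scores.std()

# Compare decoding performance of midget vs. parasol populations
# (X_midget, X_parasol, y are hypothetical recorded data):
# r2_m, _ = decode_contrast(X_midget, y)
# r2_p, _ = decode_contrast(X_parasol, y)
```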

Next, we used a model of M and P cells (Freedland and Rieke 2022) to simulate population responses to natural images and to optimize image pairs that match the responses of one population while modulating the responses of the other.
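
The metamer search can be pictured as an optimization over one image of the pair: minimize the difference in one population's predicted responses while pushing the other population's responses apart. The sketch below assumes differentiable stand-ins model_M and model_P for the encoding model; all names and the loss/optimizer choices are illustrative.

```python
# Sketch: optimize an image pair to be a midget (M) metamer --
# matched M responses, divergent parasol (P) responses.
import torch

def metamer_loss(img_a, img_b, model_M, model_P, lam=1.0):
    # Match M population responses...
    match = ((model_M(img_a) - model_M(img_b)) ** 2).mean()
    # ...while pushing P population responses apart.
    differ = ((model_P(img_a) - model_P(img_b)) ** 2).mean()
    return match - lam * differ

def optimize_pair(img_a, model_M, model_P, steps=500, lr=1e-2):
    """Start from a copy of a natural image and perturb it into a metamer."""
    img_b = img_a.clone().requires_grad_(True)
    opt = torch.optim.Adam([img_b], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = metamer_loss(img_a, img_b, model_M, model_P)
        loss.backward()
        opt.step()
    return img_b.detach()
```

Swapping the roles of model_M and model_P in the loss would yield P metamers instead.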

Results: The ratio of P to M responses for the pedestal model was 6.21 ± 0.10 over typical stimulus contrasts. However, the ratio of the average Off parasol (n=87) to Off midget (n=254) white-noise nonlinearities over positive filtered contrast values was 1.26 ± 1.72, much lower than predicted by the pedestal model. Further, six regression models trained on parasol and midget population responses to both pedestal stimuli revealed no significant differences in performance.
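
One plausible form of the nonlinearity comparison, assuming per-cell nonlinearity estimates sampled on a shared filtered-contrast axis; the array names and layout are hypothetical, not the authors' analysis code.

```python
# Sketch: ratio of population-averaged parasol to midget white-noise
# nonlinearities over positive filtered-contrast values.
# nl_parasol, nl_midget: (n_cells x n_bins) nonlinearity estimates,
# x: shared (n_bins,) filtered-contrast axis.
import numpy as np

def gain_ratio(nl_parasol, nl_midget, x):
    pos = x > 0                          # positive filtered contrasts only
    p_mean = nl_parasol.mean(axis=0)[pos]
    m_mean = nl_midget.mean(axis=0)[pos]
    ratio = p_mean / m_mean
    return ratio.mean(), ratio.std()
```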

Image optimization for M metamers identified image pairs that were visually distinct (low linear correlation, r = -0.15) yet predicted to elicit nearly identical M responses (r = 1.00) and strongly differing P responses (r = -0.29). Analogous image pairs were identified for P metamers.
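
The three correlations above could be computed as in this minimal sketch (all input arrays hypothetical): pixel similarity of the two images, and similarity of the predicted M and P population responses.

```python
# Sketch: quality metrics for a candidate metamer pair. For an M metamer,
# expect low r_img (distinct images), r_M near 1, and low r_P.
import numpy as np

def pair_metrics(img_a, img_b, resp_M_a, resp_M_b, resp_P_a, resp_P_b):
    r_img = np.corrcoef(img_a.ravel(), img_b.ravel())[0, 1]
    r_M = np.corrcoef(resp_M_a, resp_M_b)[0, 1]
    r_P = np.corrcoef(resp_P_a, resp_P_b)[0, 1]
    return r_img, r_M, r_P
```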

Conclusions: Peripheral recordings of M and P populations suggest that their contrast-gain signatures and population-level predictive power for contrast differences may not be as distinct as hypothesized by the pedestal paradigm. Model-based image optimization offers a promising avenue toward selective population metamers that can be validated in future electrophysiological and psychophysical experiments.

This abstract was presented at the 2024 ARVO Annual Meeting, held in Seattle, WA, May 5-9, 2024.
