June 2023, Volume 64, Issue 8
Open Access
ARVO Annual Meeting Abstract
Object-background segregation in mouse retinal ganglion cells
Author Affiliations & Notes
  • Chenchen Cai
    Institute for Ophthalmic Research, Eberhard Karls Universität Tübingen, Tübingen, Baden-Württemberg, Germany
    Graduate Training Centre of Neuroscience (GTC), Eberhard Karls Universität Tübingen, Tübingen, Baden-Württemberg, Germany
  • Thomas Euler
    Institute for Ophthalmic Research, Eberhard Karls Universität Tübingen, Tübingen, Baden-Württemberg, Germany
    Werner Reichardt Centre for Integrative Neuroscience, Eberhard Karls Universität Tübingen, Tübingen, Baden-Württemberg, Germany
  • Katrin Franke
    Institute for Ophthalmic Research, Eberhard Karls Universität Tübingen, Tübingen, Baden-Württemberg, Germany
    Department of Neuroscience, Baylor College of Medicine, Houston, Texas, United States
  • Footnotes
    Commercial Relationships   Chenchen Cai None; Thomas Euler None; Katrin Franke None
  • Footnotes
    Support  DFG, CRC 1233 “Robust Vision: Inference Principles and Neural Mechanisms”, project number 1215 276693517
Investigative Ophthalmology & Visual Science June 2023, Vol.64, 45. doi:

      Chenchen Cai, Thomas Euler, Katrin Franke; Object-background segregation in mouse retinal ganglion cells. Invest. Ophthalmol. Vis. Sci. 2023;64(8):45.


      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Purpose : Eye movements result in substantial global image shifts on the retina. Nevertheless, the visual world is perceived as stable and visual tasks remain possible. Here, we study how retinal ganglion cell (RGC) types of the mouse retina contribute to this stable visual perception by focusing on object-background segregation – a retinal computation that separates global from local motion.

Methods : We used a visual stimulus with three conditions that mimic local object motion, global background motion, and local object motion on a moving background (“object-background”). Object and background moved along random trajectories and consisted of gratings (n=6/9/746 mice/retinas/RGCs), checkerboards (n=4/6/624), or naturalistic “Perlin” noise in the background with bright (n=2/3/183) or dark objects (n=2/3/182). We also tested a range of ecologically relevant stimulus velocities (7°/s - 58.1°/s) using a local motion stimulus (n=3/6/650). We then recorded the population activity of mouse RGCs in response to these stimuli using two-photon calcium imaging of the ex vivo mouse retina, and used a set of artificial stimuli to assign the recorded RGCs to previously identified functional types. For statistical testing, we used two-sided paired Wilcoxon signed-rank tests with Bonferroni correction.
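The statistical procedure described above (a two-sided paired Wilcoxon signed-rank test with Bonferroni correction) can be sketched as follows. This is a minimal illustration on simulated data, not the authors' analysis code; the response values, sample size, and number of comparisons are hypothetical.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)

# Hypothetical per-cell response amplitudes (arbitrary units) to two
# stimulus conditions; real data would come from calcium imaging.
object_resp = rng.normal(1.0, 0.2, size=100)
background_resp = object_resp - rng.normal(0.3, 0.1, size=100)

# Two-sided paired Wilcoxon signed-rank test across cells.
stat, p = wilcoxon(object_resp, background_resp, alternative="two-sided")

# Bonferroni correction: multiply by the number of comparisons
# (here assumed to be 3, one per stimulus-condition pair).
n_tests = 3
p_corrected = min(p * n_tests, 1.0)
```

The Wilcoxon signed-rank test is the appropriate nonparametric choice here because the two response measurements are paired within each cell.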

Results : Using the grating stimulus, we found that most mouse RGCs prefer object (p = 8.22e-38) and object-background (p = 6.81e-40) over background motion, in line with previous results. This preference was consistent across the different patterns tested, suggesting that this computation does not depend on specific stimulus statistics. We also found that most responsive RGCs (83.4%) preferred slower over faster object velocities. Based on these results, we used a checkerboard stimulus with a mean velocity of ~10°/s (n=4/7/509) to investigate how different RGC types process object motion during eye movements. This revealed that most of the RGC types we recorded (24/31) contain object-background selective cells. We are currently investigating whether this feature is enriched in specific RGC types, which might signal differential local and global motion to the brain.
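A per-cell motion preference of the kind reported above is often quantified with a contrast-style selectivity index; the abstract does not specify the exact index used, so the formula below is a common convention in the field, shown here only as an illustration on hypothetical data.

```python
import numpy as np

def preference_index(r_obj, r_bg):
    """Contrast index in [-1, 1]; positive values indicate a preference
    for object over background motion. This particular form is a common
    convention, not taken from the abstract."""
    r_obj = np.asarray(r_obj, dtype=float)
    r_bg = np.asarray(r_bg, dtype=float)
    # Small epsilon guards against division by zero for unresponsive cells.
    return (r_obj - r_bg) / (r_obj + r_bg + 1e-12)

# Hypothetical mean response amplitudes for three cells.
idx = preference_index([1.0, 0.8, 0.5], [0.2, 0.8, 0.9])

# Fraction of cells preferring object motion (index > 0).
frac_object_preferring = np.mean(idx > 0)
```

Values near +1 indicate cells that respond almost exclusively to object motion, values near -1 the reverse, and values near 0 no preference.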

Conclusions : Our results demonstrate that object-background segregation is a feature shared across many retinal output channels in mice and will reveal how global motion during eye movements is compensated for by the retina in a type-specific manner before transmission to downstream visual areas.

This abstract was presented at the 2023 ARVO Annual Meeting, held in New Orleans, LA, April 23-27, 2023.
