Abstract
Purpose:
Detecting the motion of an object within a visual scene is important for the survival of sighted animals. This task can be particularly difficult when the statistical properties of the object are similar to those of the background. Standard models of visual processing in primates posit that this type of motion is first extracted by cortical or collicular circuits, but work in non-primate species suggests that the retina may contribute substantially to this computation (Kastner & Baccus, 2013 Neuron). Here, we directly evaluated the ability of different populations of primate ganglion cells and displaced amacrine cells to detect the motion of an object relative to a background during simulated camouflage.
Methods:
We recorded the spike responses of populations of ganglion cells and displaced amacrine cells from peripheral primate retina (pigment epithelium attached) using a multielectrode array (Litke et al., 2004 IEEE Trans Nucl Sci). Spikes were detected and sorted using Kilosort (Steinmetz et al., 2021 Science). We employed a correlation-based approach that uses the relative positions and time-dependent firing rates of cells of a given type in the mosaic to determine the direction of motion (Frechette et al., 2005 J Neuroscience).
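To convey the flavor of this correlation-based readout, the sketch below is a minimal illustration, not the authors' analysis code. It assumes binned firing rates and known positions for cells of a single type; the lag at the peak of the pairwise cross-correlation, combined with the spatial offset of the pair along the motion axis, gives the inferred direction. The function names, the majority-vote pooling, and all parameters are illustrative assumptions.

```python
import numpy as np
from scipy.signal import correlate, correlation_lags

def pairwise_direction(rate_a, rate_b, x_a, x_b, dt):
    """Direction estimate from one cell pair (illustrative sketch).

    rate_a, rate_b : binned firing rates of two cells of the same type (bin width dt, seconds)
    x_a, x_b       : cell positions along the axis of motion
    Returns +1 if motion appears to go from cell a toward cell b, -1 for the opposite.
    """
    a = rate_a - rate_a.mean()
    b = rate_b - rate_b.mean()
    xc = correlate(b, a, mode="full")                      # cross-correlate the two rate traces
    lags = correlation_lags(len(b), len(a), mode="full")
    best_lag = lags[np.argmax(xc)] * dt                    # lag (s) at the correlation peak
    # If cell b responds later than cell a and lies further along the motion axis,
    # the object moved from a toward b; the sign encodes the direction.
    return np.sign(best_lag * (x_b - x_a))

def population_direction(rates, positions, dt):
    """Combine pairwise estimates across the mosaic by a simple majority vote."""
    votes = [pairwise_direction(rates[i], rates[j], positions[i], positions[j], dt)
             for i in range(len(rates)) for j in range(i + 1, len(rates))
             if positions[i] != positions[j]]
    return np.sign(np.sum(votes))
```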
Results:
Several cell types accurately distinguished between rightward and leftward motion of the object relative to the background, including On/Off parasol ganglion cells, On/Off smooth monostratified ganglion cells, small bistratified ganglion cells, broad thorny ganglion cells, and A1 amacrine cells (p < 0.05; Wilcoxon signed rank test). All of these cell types responded strongly when the object moved differentially relative to the background. Furthermore, after the object resumed its correlated motion with the background, population activity persisted near the location where the object had blended into the background.
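A hedged sketch of how such a direction-discrimination test could be run is shown below; the trial structure, pairing of trials, and placeholder data are assumptions, not the authors' analysis, and only the form of the Wilcoxon signed-rank comparison follows the abstract.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical per-trial population direction estimates for one cell type,
# signed so that positive values indicate rightward motion (placeholder data).
rng = np.random.default_rng(0)
est_rightward = rng.normal(loc=0.6, scale=0.5, size=30)
est_leftward = rng.normal(loc=-0.6, scale=0.5, size=30)

# Paired Wilcoxon signed-rank test on the trial-by-trial difference between
# estimates obtained under rightward and leftward object motion.
stat, p = wilcoxon(est_rightward, est_leftward)
print(f"Wilcoxon signed-rank: W = {stat:.1f}, p = {p:.3g}")
```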
Conclusions:
Our results demonstrate that several of the parallel pathways in the primate retina accurately detect object motion relative to the background, even under challenging conditions such as simulated camouflage. These findings further indicate that this key visual computation begins much earlier in the primate visual stream than classical models suggest.
This abstract was presented at the 2024 ARVO Annual Meeting, held in Seattle, WA, May 5-9, 2024.