Abstract
Purpose:
We previously showed that maintained spike rates of many neurons in the intercalated (koniocellular, K) layers of the lateral geniculate nucleus (LGN) are inversely related to delta frequency-band power of local field potentials in primary visual cortex [Cheong et al., PNAS 108, 14659-14663, 2011]. Here we asked whether this brain-state-induced variation changes the neurometric detection probability of K cells. We hypothesised that high maintained spike rates would decrease the signal-to-noise ratio and so reduce stimulus detectability.
Methods:
Extracellular spike activity of K cells (n=48) was recorded in sufentanil-anaesthetised marmosets (Callithrix jacchus). Visual stimuli (200 ms high-contrast (50-80%) cone-isolating pulses and/or 5 Hz variable-contrast drifting gratings) were presented on a uniform grey background (~50 cd/m^2). Receiver operating characteristic (ROC) analysis was applied to spike activity before and during the stimulus.
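The ROC analysis above treats pre-stimulus spike counts as the "noise" distribution and stimulus-period counts as the "signal" distribution; the area under the ROC curve (AUC) then equals the probability that a randomly drawn stimulus-period count exceeds a randomly drawn baseline count. A minimal sketch of that computation, using simulated Poisson counts as stand-ins for the recorded data (the distributions, rates, and trial counts here are illustrative assumptions, not the authors' data or code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-trial spike counts; the real distributions come from
# recorded K-cell activity, not a Poisson model.
baseline = rng.poisson(lam=4, size=200)   # counts before stimulus onset
driven = rng.poisson(lam=12, size=200)    # counts during the 200 ms stimulus

def roc_auc(noise, signal):
    """AUC via the rank-sum identity: the probability that a randomly
    chosen signal count exceeds a randomly chosen noise count,
    with ties counted as one half."""
    noise = np.asarray(noise)[:, None]
    signal = np.asarray(signal)[None, :]
    greater = (signal > noise).mean()
    ties = (signal == noise).mean()
    return greater + 0.5 * ties

auc = roc_auc(baseline, driven)  # 0.5 = chance detection, 1.0 = perfect
```

An AUC of 0.5 means the stimulus is undetectable from spike counts; values near 1.0 correspond to the near-perfect detection reported for high-contrast stimuli.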
Results:
ROC analysis of responses to pulses and high-contrast gratings showed no statistically significant difference in area under the curve (AUC) for preferred stimuli between trials where the maintained spike rate before stimulus onset was high (above 20 spikes/s) and trials where it was low (e.g., for pulsed stimuli, AUC 0.94±0.08 vs 0.91±0.11, p=0.25, Wilcoxon rank-sum test). For low-contrast (<14%) gratings, the AUC was greater when the spike rate was high during the 350 ms before stimulus onset (high: 0.77, low: 0.64; data pooled across 36 cells).
Conclusions:
Contrary to our hypothesis, we found that high maintained spike rates can improve detectability of low-contrast stimuli. This improvement may arise because small modulations around high spike rates are not susceptible to rectifying distortions. Further, high maintained spike rates do not reduce the detectability of high-contrast visual stimuli by K cells.
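The rectification argument can be illustrated numerically: a firing rate cannot fall below zero, so a stimulus-driven modulation around a low maintained rate is clipped at its troughs, while the same modulation around a high maintained rate is transmitted undistorted. A toy sketch (the mean rates and modulation amplitude are arbitrary illustrative values, not fitted to the data):

```python
import numpy as np

def clipping_distortion(mean_rate, modulation_amp, n=1000):
    """Mean rate added by rectification over one cycle of a sinusoidal
    modulation, when firing rate is clipped at zero (illustrative only)."""
    t = np.linspace(0.0, 1.0, n, endpoint=False)
    intended = mean_rate + modulation_amp * np.sin(2 * np.pi * t)
    observed = np.clip(intended, 0.0, None)  # rates cannot be negative
    return np.mean(observed - intended)      # zero if no clipping occurred

low = clipping_distortion(mean_rate=2.0, modulation_amp=10.0)   # troughs clipped
high = clipping_distortion(mean_rate=25.0, modulation_amp=10.0) # no clipping
print(low > 0, high == 0)  # → True True
```

At the low maintained rate the modulation is distorted (the clipped waveform no longer matches the stimulus), whereas at the high maintained rate the full modulation survives, consistent with the improved low-contrast detectability reported above.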
This is an abstract that was submitted for the 2016 ARVO Annual Meeting, held in Seattle, Wash., May 1-5, 2016.