This study demonstrates that 30 minutes of defocus-induced blur adaptation resulted in a reduction of CS at the lowest tested spatial frequency and enhancements at high spatial frequencies. Although the magnitude differed among the subjects, the CS changes demonstrated in the six subjects suggest that the effects can be generalized to the majority of the population with normal vision.17
The observation of consistent enhancement in CS in the high-frequency range after blur adaptation is interesting and is in accordance with the improved letter and grating acuity measures found in other studies (Rosenfield M, et al. IOVS 2003;44:E-Abstract 4315; Rosenfield M, et al. IOVS 2002;43:E-Abstract 1902).1,2–7 This enhancement effect eluded detection in other contrast sensitivity studies,4,9 possibly because of critical differences in the methodology introduced in this study. Although Mon-Williams et al.4 reported improvements in letter acuity after 30 minutes of blur adaptation, potential CS improvements might not have been captured because top-up images, which prevent the decay of adaptation during lengthy CS measurements, were not used; such top-up is a common but not universal practice.9,12,18,19 Indeed, when we measured CS without blur-adaptation top-up in our laboratory, enhancements were very difficult to prove (see Fig. 3 and the legend for details). This idea is further supported by the concurrent finding of improved letter acuity,4 which is quick to measure, so adaptation decay effects do not arise. Although Webster9 used calibrated natural images to top up adaptation during CS measurements, subjects in that study were exposed to blur for only 5 minutes, which might not have been sufficient to accrue detectable high spatial frequency enhancement effects. The observation by Cufflin et al.6 of significant improvements in letter acuity after 30 minutes, but not after 10 or 20 minutes, of blur exposure with +1 D and +3 D defocus lenses adds further weight to this hypothesis.
In addition to the beneficial CS effects shown here resulting specifically from adaptation to blur, there have been reports of facilitation or enhancement of grating detection18,19 and orientation discrimination19,20 after adaptation to specific spatial frequencies and orientations. Performance improvements in these studies were noted only when the adapting and test targets were distinct along the dimensions of spatial frequency or orientation. For example, De Valois18 noted a reduction in CS around the adapted frequency and enhanced CS approximately 3 octaves away from the adapted frequency; the effect fell to zero by approximately 1 octave from the adapted frequency, and there was little or no effect on CS between 1 and 2 octaves away. De Valois18 attributed these effects to a general “inhibition plus fatigue” model, which would seem to predict the results found here. However, this study differs from those older adaptation studies because, unlike gratings of a particular frequency or orientation, the natural images used here during adaptation contain a broad range of frequencies and orientations with an inverse spatial frequency amplitude spectrum fall-off.21 Even though the defocus lens used in this study would further reduce the relative amplitude of high spatial frequency detail, the adapting stimuli nevertheless remain broadband. Furthermore, recent results from physiological experiments in awake primates have shown that the responses of cells in the primary visual cortex to broadband stimuli are not simply predictable from their responses to individual gratings.22 Nevertheless, adaptive spatial frequency selective enhancements of neural responses, at least at low spatial frequencies, have recently been shown with different types of broadband stimuli (white noise and natural images) in simple cells of the feline visual cortex.23
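To make the spatial frequency relations referred to above concrete, the standard octave-distance definition and the approximate amplitude spectrum of natural scenes can be summarized in a brief illustrative sketch (the 4 cpd adapting frequency below is a hypothetical example, not a value from this study):

\[
\Delta_{\mathrm{oct}} = \left|\log_{2}\!\frac{f_{\mathrm{test}}}{f_{\mathrm{adapt}}}\right|,
\qquad
A(f) \propto \frac{1}{f}.
\]

Under these definitions, a hypothetical adapting frequency of 4 cpd places a test grating 3 octaves higher at \(2^{3}\times 4 = 32\) cpd, and each doubling of spatial frequency roughly halves the amplitude available in a natural image even before defocus is applied, so the blurred adapting images remain broadband despite their attenuated high spatial frequency content.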
Although it is tempting to speculate about differential effects between subjects of varying refractive status, the primary aim of this article is to document the presence of enhancement effects after blur adaptation regardless of refractive condition. A number of previous studies3,5,6 demonstrated improvements in letter acuity after blur adaptation but failed to find differential effects between people with myopia and emmetropia. If improvements in letter acuity reflect enhanced CS at high frequencies, one might also expect no postadaptation differences in CS between refractive conditions. A recent study24 that reported positive contrast-adaptation effects (as measured by interocular contrast matching of a suprathreshold 3.22 cpd grating) also failed to find significant differences between people with myopia and emmetropia. An interesting related finding of that study is the suggestion that contrast-adaptation effects were induced only by positive, not by negative, defocus. Although, in contrast to our study, measurements were made without defocus lenses, no significant changes in detection threshold were found at 3.22 cpd (the only spatial frequency tested) for people with either myopia or emmetropia,24 thus concurring with our own study. On the other hand, people with myopia (but not emmetropia) displayed a greater improvement in grating acuity at low contrast levels after blur adaptation.2,3 In a different study,5 subjects with early-onset myopia displayed greater blur thresholds (a mean difference of ∼0.25 D) after blur adaptation than did subjects with emmetropia or late-onset myopia. In another blur sensitivity study, not specifically addressing blur adaptation,25 people with myopia displayed a greater blur detection threshold (mean difference, 0.08 D) than people with emmetropia. Exactly how differential blur sensitivity might affect post-blur-adaptation CS is unknown, but the magnitude of defocus-induced blur in this study (+2 D) is substantially greater than the differences in subjective blur sensitivity noted between refractive groups, and we have no reason to expect different CS enhancements among people with myopia and emmetropia. Nevertheless, especially in light of the grating acuity differences between people with myopia and emmetropia,2,3 it remains an intriguing idea to test more fully in a future study.
It is interesting to consider our results alongside what is known about blur adaptation. As stated earlier, previous studies (Rosenfield M, et al. IOVS 2002;43:E-Abstract 1902)1,2,4,6,7 have ruled out optical contributions toward improved letter or grating acuity after blur adaptation. Pupil size might be an important parameter regarding adaptation effects, but an analysis of our results failed to identify it as a significant factor. Similarly, perceptual learning has been implicitly ruled out by other investigators,1,4 who found no improvements when blurred tests were performed after exposure to nonblurred adapting scenes, a result repeated in our own laboratory (data not shown). The observation here of a CS reduction, not improvement, at 0.5 cpd adds further weight to the argument that the results are not caused by simple learning effects. Having ruled out optical and learning contributions, the only remaining plausible explanation is that neural mechanisms are responsible for the phenomenon of blur adaptation; this is supported by the observation of interocular transfer of blur adaptation.4 This study's findings of reduced CS at the lowest tested frequency and enhancements in the high-frequency range confirm that neural mechanisms play an active compensatory role in blur adaptation.
Supported by an Endeavour International Postgraduate Research Scholarship (NR), a Science Faculty Scholarship from the University of Melbourne (NR), and a student allowance from the Department of Optometry and Vision Sciences of the University of Melbourne.
The authors thank Phillip Bedggood for his constructive comments on a previous version of this manuscript.