Abstract
Purpose:
It has long been known that perimetric test-retest variability increases with the level of visual field damage. Recent evidence suggests that increasing contrast all the way to 0dB is excessive, since response rates approach an asymptotic maximum well before this level. Censoring low sensitivities at some cutoff higher than the current limit of 0dB may therefore reduce variability, without compromising the ability to monitor progression. This study examines whether raising the floor for pointwise sensitivities affects the ability of global indices to detect change.
Methods:
Longitudinal data from 106 eyes of 53 individuals with progressive glaucoma were used. Only eyes with a significant negative MD rate of change were included. Pointwise sensitivities were censored at different dB minima, and Mean Deviation (MD) was recalculated from these censored sensitivities, termed censored MD (CMD). The longitudinal signal-to-noise ratio (LSNR) was calculated for both MD and CMD, and these LSNRs were compared using Wilcoxon matched-pairs tests.
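The sketch below (not the authors' code) illustrates the general analysis pipeline described above: censoring pointwise sensitivities at a dB floor, recomputing an MD-like index, forming an LSNR as the regression slope divided by the residual standard deviation, and comparing indices with a Wilcoxon matched-pairs test. The normative values, unweighted averaging, and exact LSNR definition are assumptions made for illustration only.

```python
# Hedged sketch of the censoring/LSNR analysis, assuming:
# - an MD-like index computed as an unweighted mean of total deviations
#   (true MD uses age-corrected norms and variance weighting)
# - LSNR defined as linear-regression slope / SD of residuals
import numpy as np
from scipy import stats

def censored_md(sensitivities_db, normals_db, floor_db):
    """MD-like index after censoring pointwise sensitivities at floor_db.

    sensitivities_db : (n_visits, n_locations) measured sensitivities in dB
    normals_db       : (n_locations,) age-corrected normal sensitivities in dB
    floor_db         : censoring floor; values below it are raised to it
    """
    censored = np.maximum(sensitivities_db, floor_db)
    # Unweighted mean of total deviations across locations, per visit.
    return (censored - normals_db).mean(axis=1)

def lsnr(years, index):
    """Longitudinal SNR: regression slope divided by residual SD."""
    slope, intercept, *_ = stats.linregress(years, index)
    residuals = index - (slope * years + intercept)
    return slope / residuals.std(ddof=2)

# Hypothetical per-eye comparison of censored vs. uncensored MD:
# lsnr_md  = [lsnr(t, censored_md(s, n, 0))  for t, s, n in eyes]
# lsnr_cmd = [lsnr(t, censored_md(s, n, 18)) for t, s, n in eyes]
# stat, p = stats.wilcoxon(lsnr_md, lsnr_cmd)
```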
Results:
The average MD was -6.2dB (range: -29.92 to +5.12dB). The average MD rate of change was -0.48 dB/yr (range: -3.74 to -0.08 dB/yr). Censored MD appeared to provide significantly better LSNRs than uncensored MD when using any pointwise sensitivity cutoff between 16dB and 20dB (Table 1).
Conclusions:
This study suggests that 16-20dB could be a more suitable endpoint for perimetric testing algorithms than the current limit of 0dB. Pointwise visual field sensitivities below 16dB did not appear to improve the ability of MD to monitor change in glaucomatous visual fields.
This abstract was submitted for the 2016 ARVO Annual Meeting, held in Seattle, Wash., May 1-5, 2016.