ARVO Annual Meeting Abstract  |  September 2016
Volume 57, Issue 12
Open Access
Reducing variability of perimetric global indices from eyes with progressive glaucoma by censoring unreliable sensitivity data
Author Affiliations & Notes
  • Manoj Pathak
    Mathematics and Statistics, Murray State University, Murray, Kentucky, United States
  • Shaban Demirel
Legacy Research Institute, Portland, Oregon, United States
  • Stuart Keith Gardiner
    Legacy Research Institute, Portland, Oregon, United States
  • Footnotes
    Commercial Relationships   Manoj Pathak, None; Shaban Demirel, None; Stuart Gardiner, None
  • Footnotes
    Support  None
Investigative Ophthalmology & Visual Science September 2016, Vol.57, 3920. doi:

      Manoj Pathak, Shaban Demirel, Stuart Keith Gardiner; Reducing variability of perimetric global indices from eyes with progressive glaucoma by censoring unreliable sensitivity data. Invest. Ophthalmol. Vis. Sci. 2016;57(12):3920.


      © ARVO (1962-2015); The Authors (2016-present)


Purpose : It has long been known that perimetric test-retest variability increases with the level of visual field damage. Recent evidence suggests that increasing contrast all the way to 0dB is excessive, since response rates approach an asymptotic maximum well before this level. Censoring low sensitivities at some cutoff higher than the current limit of 0dB may therefore reduce variability, without compromising the ability to monitor progression. This study examines whether raising the floor for pointwise sensitivities affects the ability of global indices to detect change.

Methods : Longitudinal data from 106 eyes of 53 individuals with progressive glaucoma were used. Only eyes with a significant negative rate of MD change were included. Pointwise sensitivities were censored at different dB minima, and Mean Deviation (MD) was recalculated from these censored sensitivities, yielding a censored MD (CMD). The longitudinal signal-to-noise ratios (LSNRs) for MD and CMD were calculated and compared using Wilcoxon matched-pairs tests.
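The two operations described above, censoring pointwise sensitivities at a dB floor and computing a longitudinal signal-to-noise ratio as the linear rate of change divided by the variability about that trend, can be sketched as follows. This is a minimal illustration, not the authors' code: the function names are hypothetical, the MD recalculation from normative data is omitted, and the LSNR is assumed here to be the regression slope divided by the residual standard deviation.

```python
import numpy as np

def censor_sensitivities(sens_db, floor_db=18.0):
    """Clamp pointwise sensitivities (dB) to a minimum floor.

    Values below floor_db are raised to floor_db; values at or
    above it are unchanged. With floor_db=0 this is a no-op for
    conventional perimetric data.
    """
    return np.maximum(np.asarray(sens_db, dtype=float), floor_db)

def lsnr(years, index_values):
    """Longitudinal signal-to-noise ratio for a global index series.

    Assumed definition: slope of the ordinary least-squares fit of
    the index against time, divided by the standard deviation of
    the residuals about that fit (ddof=2 for the two fitted
    parameters). A more negative LSNR indicates progression that is
    larger relative to the series' variability.
    """
    years = np.asarray(years, dtype=float)
    vals = np.asarray(index_values, dtype=float)
    slope, intercept = np.polyfit(years, vals, 1)
    residuals = vals - (slope * years + intercept)
    return slope / residuals.std(ddof=2)
```

For example, censoring a field's sensitivities at 16 dB before recomputing the index would leave a 25 dB point untouched while raising 2 dB and 10 dB points to the floor; the study's comparison is then between `lsnr` of the uncensored and censored index series from the same eye.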

Results : The average MD was -6.2 dB (range: -29.92 to +5.12 dB). The average MD rate of change was -0.48 dB/yr (range: -3.74 to -0.08 dB/yr). Censored MD appeared to provide significantly better LSNRs than uncensored MD when using any pointwise sensitivity cutoff between 16dB and 20dB (Table 1).

Conclusions : This study suggests that 16-20dB could be a more suitable endpoint for perimetric testing algorithms than continuing testing down to 0dB. Pointwise visual field sensitivities below 16dB did not appear to improve the ability of MD to monitor change in glaucomatous visual fields.

This is an abstract that was submitted for the 2016 ARVO Annual Meeting, held in Seattle, Wash., May 1-5, 2016.


