Abstract
Purpose:
To describe the rates of retinal nerve fiber layer (RNFL) loss in a large population of glaucoma patients and to understand how the accuracy of detecting a given rate of RNFL loss varies with the frequency and temporal spacing of OCT exams.
Methods:
Measurements of RNFL thickness were obtained with Zeiss OCT at a tertiary glaucoma practice from glaucoma and glaucoma suspect patients aged 18 years or older between April 1, 2013 and July 30, 2021. Only eyes with at least 5 measurements with superior and inferior RNFL thickness ≥ 50 μm and signal strength (SS) ≥ 6 were included in the analysis.
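To make the inclusion criteria concrete, here is a minimal pandas sketch of the filtering step; the file name and column names (eye_id, exam_date, rnfl_sup, rnfl_inf, ss) are hypothetical, since the abstract does not specify the data layout:

    import pandas as pd

    # Hypothetical long-format table: one row per OCT exam per eye.
    scans = pd.read_csv("oct_scans.csv")  # eye_id, exam_date, rnfl_sup, rnfl_inf, ss

    # Quality criteria from the abstract: superior and inferior RNFL >= 50 um
    # and signal strength (SS) >= 6.
    reliable = scans[
        (scans["rnfl_sup"] >= 50)
        & (scans["rnfl_inf"] >= 50)
        & (scans["ss"] >= 6)
    ]

    # Inclusion criterion: eyes with at least 5 qualifying measurements.
    n_scans = reliable.groupby("eye_id")["exam_date"].transform("count")
    included = reliable[n_scans >= 5]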
Linear regression was performed on the data from each eye, and population-level rates of RNFL change were computed. Simulated data were then generated from the regression slopes and residuals. The i-th simulated data point from the k-th eye had the form y_{i,k} = β_k t_{i,k} + ε_{i,k}, where t_{i,k} is the time of the measurement and ε_{i,k} was drawn at random from the residuals of all eyes whose rates of worsening were within 0.25 μm/yr of β_k. For each eye, M data points were generated within a time period T under two conditions: in the "evenly spread" condition, simulated data were spread evenly over time; in the "clustered" condition, they were clustered (half and half) at the endpoints of the period. The percentage of cases in which the true slope was at or below the observed slope was calculated for different values of M and T.
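A minimal numpy sketch of one simulation run under this model follows; the function name, the choice of t = 0 as the first endpoint, and the inputs true_slopes and pools are assumptions for illustration, not the study's actual code:

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_observed_slope(beta_k, residual_pool, M, T, clustered=False):
        """Simulate M noisy measurements for an eye with true rate beta_k
        (um/yr) over T years and return the slope observed by OLS.
        residual_pool holds residuals from all eyes whose fitted rates lie
        within 0.25 um/yr of beta_k."""
        if clustered:
            # Half of the measurements at t = 0, half at t = T.
            t = np.concatenate([np.zeros(M // 2), np.full(M - M // 2, float(T))])
        else:
            # Measurements evenly spread over [0, T].
            t = np.linspace(0.0, T, M)
        eps = rng.choice(residual_pool, size=M)  # resample residuals with replacement
        y = beta_k * t + eps                     # simulated RNFL values
        slope, _ = np.polyfit(t, y, 1)           # observed rate of change
        return slope

    # Accuracy of one (M, T) design: fraction of eyes whose true slope is at
    # or below (i.e., at least as severe as) the observed slope. true_slopes
    # and pools would be built from the fitted per-eye slopes and residuals.
    # observed = np.array([simulate_observed_slope(b, p, M=13, T=2, clustered=True)
    #                      for b, p in zip(true_slopes, pools)])
    # accuracy = np.mean(true_slopes <= observed)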
Results:
A total of 12,150 eyes from 7,392 patients satisfied the inclusion criteria. The mean (SD) time between measurements was 403 (180) days. Figure 1 shows the rates of worsening, SS, and baseline RNFL thickness of our sample. The median, 75th percentile, and 90th percentile rates of worsening were –0.39, –1.09, and –2.35 μm/yr; the mean (SD) SS was 8.0 (1.0); and the mean (SD) baseline RNFL thickness was 87.50 (32.08) μm. Figure 2 shows the percentage of cases (y-axis) in which an observed rate of worsening (x-axis) came from an eye whose actual rate of worsening was at least as severe as the observed rate. Clustering measurements at the endpoints and lengthening the time period over which they were taken improved accuracy, which was lowest for observed rates in the [–3, –2] μm/yr range.
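One presentational note: because faster worsening corresponds to a more negative slope, the 75th and 90th percentile rates of worsening reported above are lower-tail percentiles of the signed slope distribution, e.g. (the sample values below are made up for illustration):

    import numpy as np

    # Per-eye OLS slopes in um/yr (negative = worsening); a made-up sample
    # stands in for the 12,150 fitted values.
    slopes = np.array([-0.1, -0.4, -0.8, -1.1, -2.4, 0.2, -0.3])

    # 75th/90th percentile *of worsening* = 25th/10th percentile of the
    # signed slopes.
    median, p75_worsening, p90_worsening = np.percentile(slopes, [50, 25, 10])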
Conclusions:
Our analysis shows that detecting rapid worsening (the 90th percentile rate) on OCT with 70% accuracy requires approximately 13 measurements clustered at the endpoints of a 2-year period, far more than the 1-2 OCT scans per year of current clinical practice.
This abstract was presented at the 2022 ARVO Annual Meeting, held in Denver, CO, May 1-4, 2022, and virtually.