Lisha Deng, Shaban Demirel, Stuart K. Gardiner; Reducing Variability in Visual Field Assessment for Glaucoma Through Filtering That Combines Structural and Functional Information. Invest. Ophthalmol. Vis. Sci. 2014;55(7):4593-4602. doi: 10.1167/iovs.13-13813.
Purpose: To reduce variability and improve measurement of the true change signal in visual field (VF) assessments through the use of filters that combine functional and structural test results.
Methods: Humphrey VF data (Swedish Interactive Thresholding Algorithm [SITA] Standard, 24-2) and confocal scanning laser ophthalmoscopy (Heidelberg Retina Tomograph [HRT]) data from 1057 eyes of 637 participants were used to derive a filter. A separate dataset, consisting of VF and HRT data from 112 eyes of 62 participants, each with ≥5 visits, was used to test the filter. At each VF location in each eye, the trend over time was modeled with both a linear model (LM) and a nonlinear model (NLM), using filtered or unfiltered data, with the last visit excluded. The SD of residuals from the trends and the prediction error (PE) for the last visit were compared between filtered and unfiltered data. The filter was then reconstructed and the analyses repeated after truncating the VF data, replacing thresholds < 19 dB with 19 dB to reduce noise.
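The per-location analysis described above can be sketched as follows. This is a minimal illustration, not the study's code: the visit dates and threshold values are invented, and the function name `lm_residual_sd_and_pe` is hypothetical. It fits an ordinary least-squares line to all but the last visit, then reports the SD of the fit residuals and the prediction error for the held-out final visit, with the 19 dB truncation applied as a preprocessing step.

```python
import numpy as np

# Illustrative data for one VF location: visit times (years since baseline)
# and sensitivity thresholds (dB). These values are invented for the sketch.
dates = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])
thresholds = np.array([28.0, 27.0, 15.0, 26.0, 24.0, 23.0])

# Truncation step: replace thresholds below 19 dB with 19 dB.
truncated = np.maximum(thresholds, 19.0)

def lm_residual_sd_and_pe(t, y):
    """Fit an OLS line to all visits except the last, then return the SD of
    the residuals about the trend and the prediction error (PE) for the
    held-out final visit."""
    slope, intercept = np.polyfit(t[:-1], y[:-1], 1)
    fitted = slope * t[:-1] + intercept
    residual_sd = np.std(y[:-1] - fitted, ddof=2)  # 2 fitted parameters
    prediction_error = y[-1] - (slope * t[-1] + intercept)
    return residual_sd, prediction_error

sd_raw, pe_raw = lm_residual_sd_and_pe(dates, thresholds)
sd_trunc, pe_trunc = lm_residual_sd_and_pe(dates, truncated)
```

With these made-up values, truncating the low outlier shrinks the residual SD, illustrating the noise-reduction rationale; the abstract's nonlinear model (NLM) would replace the straight-line fit with a nonlinear trend but use the same held-out-visit scheme.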
Results: The SD of the residuals at all 52 VF locations was reduced by filtering in all analyses (P < 0.001). The PE was reduced by filtering at 43 and 47 VF locations (P < 0.05) for the LM analyses on observed and truncated data, respectively, and at all 52 VF locations (P < 0.05) for both NLM analyses. Truncating data before filtering reduced variability (P < 0.01) at 41 and 40 VF locations for the LM and NLM analyses, respectively.
Conclusions: Filtering can reduce variability about trends in longitudinal sequences of VF data and improves the accuracy of predicting the next test result.