David J. Ramsey, Janet S. Sunness, Poorva Malviya, Gregory Hager, Carol Applegate, James T. Handa; Automated Image Alignment and Segmentation to Follow Progression of Geographic Atrophy in AMD. Invest. Ophthalmol. Vis. Sci. 2012;53(14):4086.
Progression of geographic atrophy (GA) associated with dry age-related macular degeneration is often monitored using color fundus and fundus autofluorescence imaging. Quantification of atrophy by serial analysis of photographs differs between graders and between imaging modalities. The purpose of this investigation was to develop a computer-based image segmentation method to standardize the quantification of GA in color and autofluorescence fundus photographs.
A longitudinal series of 30-degree color fundus photographs matched with autofluorescence images was used for analysis. An expert grader performed computerized planimetry to identify the area of GA in each photograph, and these maps served as ground truth for software development. Each pair of images was then registered with i2k Align Retina, a commercially available alignment program, and the identical transformation was applied to the corresponding GA maps. The results of manual grading were then compared with GA detected in each image by a custom analysis program using Fuzzy C-Means (FCM), a class-based soft segmentation method.
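The abstract does not describe the FCM implementation details, but the core of Fuzzy C-Means on pixel intensities can be sketched as follows. This is a minimal illustration, not the authors' code: it soft-clusters 1-D intensities into two classes and then hard-labels the brighter class as "atrophy" (the number of clusters, fuzziness exponent m, and thresholding rule are all assumptions for the sketch).

```python
import numpy as np

def fuzzy_c_means(pixels, n_clusters=2, m=2.0, n_iter=50, seed=0):
    """Soft-cluster 1-D pixel intensities with Fuzzy C-Means.

    Returns (centers, memberships); memberships has shape
    (n_clusters, n_pixels) and each column sums to 1.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(pixels, dtype=float).ravel()
    # Random initial memberships, normalized so each pixel's sum to 1.
    u = rng.random((n_clusters, x.size))
    u /= u.sum(axis=0)
    for _ in range(n_iter):
        um = u ** m
        # Cluster centers are membership-weighted means.
        centers = um @ x / um.sum(axis=1)
        dist = np.abs(x[None, :] - centers[:, None]) + 1e-12
        # Standard FCM update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        inv = dist ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=0)
    return centers, u

# Toy example: bright "atrophic" pixels against a darker background.
img = np.array([0.10, 0.15, 0.12, 0.85, 0.90, 0.88])
centers, u = fuzzy_c_means(img)
atrophy = u[np.argmax(centers)] > 0.5  # hard-label by the brighter class
```

In practice a 2-D fundus image would be flattened to a vector, clustered, and the membership map reshaped back to image dimensions before comparison with the planimetry ground truth.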
The amount of atrophy judged by manual grading of fundus autofluorescence images was 12±5% less than the atrophic area in matched color fundus photographs after image registration. Automated FCM-based segmentation of atrophic lesions showed excellent agreement with the expert grader for autofluorescence images but performed less well on color fundus photographs. Automated segmentation of autofluorescence images identified 96% of atrophy correctly with a 12% false-positive rate. In contrast, results for color fundus photographs varied greatly, with only 55% of atrophy correctly identified and a 14% false-positive rate.
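Agreement figures of this kind can be computed per pixel from the automated mask and the grader's ground-truth mask. The sketch below assumes one common convention (sensitivity = detected fraction of true atrophy; false-positive rate = false detections as a fraction of non-atrophic pixels); the abstract does not state which denominator was used.

```python
import numpy as np

def agreement(auto_mask, truth_mask):
    """Per-pixel agreement between an automated GA mask and ground truth.

    Returns (sensitivity, false_positive_rate) using the conventions
    sensitivity = TP / (TP + FN) and FPR = FP / (FP + TN).
    """
    auto_mask = np.asarray(auto_mask, dtype=bool)
    truth_mask = np.asarray(truth_mask, dtype=bool)
    tp = np.sum(auto_mask & truth_mask)    # atrophy found by both
    fp = np.sum(auto_mask & ~truth_mask)   # atrophy only in automated mask
    fn = np.sum(~auto_mask & truth_mask)   # atrophy missed by automation
    tn = np.sum(~auto_mask & ~truth_mask)  # background agreed by both
    return tp / (tp + fn), fp / (fp + tn)

# Toy masks: automation finds 2 of 3 true atrophic pixels, no false hits.
sens, fpr = agreement([1, 1, 0, 0], [1, 1, 1, 0])
# sens = 2/3, fpr = 0.0
```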
Our findings suggest that color fundus photographs may overestimate the amount of retinal pigment epithelial atrophy compared to fundus autofluorescence imaging, in certain cases by a significant amount. This could have an impact on the calculated rate of GA progression. The novel automated segmentation method that we present is fast, reproducible, and achieves a level of accuracy in autofluorescence images that rivals expert graders. These methods could aid the planning of future treatment trials for GA by helping to standardize the detection of GA. Additional optimization of the automated software is needed to confidently identify GA in color fundus images.