May 2006
Volume 47, Issue 13
ARVO Annual Meeting Abstract  |   May 2006
Automatic Evaluation of Fundus Image Quality
Author Affiliations & Notes
  • G.H. Halldorsson
    University of Iceland, Reykjavik, Iceland
    Electrical and Computer Engineering,
  • S.R. Joelsson
    University of Iceland, Reykjavik, Iceland
    Electrical and Computer Engineering,
  • R.A. Karlsson
    University of Iceland, Reykjavik, Iceland
    Electrical and Computer Engineering,
  • J.A. Benediktsson
    University of Iceland, Reykjavik, Iceland
    Electrical and Computer Engineering,
  • S.H. Hardarson
    University of Iceland, Reykjavik, Iceland
    Ophthalmology,
  • T. Eysteinsson
    University of Iceland, Reykjavik, Iceland
    Ophthalmology,
  • J.M. Beach
    Institute for Technology Development, Stennis Space Center, MA
  • A. Harris
    School of Medicine, Indiana University, Indianapolis, IN
  • E. Stefansson
    University of Iceland, Reykjavik, Iceland
    Ophthalmology,
  • Footnotes
    Commercial Relationships  G.H. Halldorsson, Oxymap ehf., E; S.R. Joelsson, Oxymap ehf., E; R.A. Karlsson, Oxymap ehf., E; J.A. Benediktsson, Oxymap ehf., I; Oxymap ehf., E; S.H. Hardarson, Oxymap ehf., E; T. Eysteinsson, Oxymap ehf., I; Oxymap ehf., E; J.M. Beach, Oxymap ehf., I; A. Harris, None; E. Stefansson, Oxymap ehf., I; Oxymap ehf., E.
  • Footnotes
    Support  Icelandic Research Council
Investigative Ophthalmology & Visual Science May 2006, Vol.47, 5651. doi:

G.H. Halldorsson, S.R. Joelsson, R.A. Karlsson, J.A. Benediktsson, S.H. Hardarson, T. Eysteinsson, J.M. Beach, A. Harris, E. Stefansson; Automatic Evaluation of Fundus Image Quality. Invest. Ophthalmol. Vis. Sci. 2006;47(13):5651.

Abstract
 
Purpose:
 

To develop a computer procedure that automatically estimates the quality of retinal images. The quality indicators would be used to accept or reject images for automatic analysis. A high-quality image is expected to yield more reliable and accurate results from measurements based on the image.

 
Methods:
 

The structural similarity between a degraded version and the original image is used to derive a quality measure. Degradation is performed by quantizing the wavelet coefficients of the image (i.e., lossy compression); a lower similarity therefore indicates a higher-quality original, since a sharp, well-focused image loses more structural detail under compression. The loss of similarity between the original and degraded image is measured using the structural similarity (SSIM) index, which takes into account changes in luminance, contrast and structure. The SSIM index has been shown to agree with human grading of image degradation [Wang et al., IEEE Transactions on Image Processing, vol. 13, no. 4, 2004]. 300 datasets were extracted from a set of 56 retinal images graded by human observers. These datasets were evaluated by classifying the images into low- and high-quality categories and comparing the results to the quality grades assigned by the human observers.
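A minimal sketch of this quality measure is shown below, assuming a hard-thresholding interpretation of the wavelet-coefficient quantization (the exact quantization scheme is not specified in the abstract) and using the Haar wavelet and the 0.075 threshold fraction reported under Results. The PyWavelets and scikit-image libraries supply the wavelet transform and the SSIM map; the function and parameter names (degrade_by_wavelet_quantization, ssim_quality_features, n_bins) are illustrative, not the authors'.

```python
import numpy as np
import pywt
from skimage.metrics import structural_similarity


def degrade_by_wavelet_quantization(image, wavelet="haar", frac=0.075):
    """Degrade an image by zeroing small wavelet coefficients (crude lossy compression)."""
    coeffs = pywt.wavedec2(image, wavelet)
    arr, slices = pywt.coeffs_to_array(coeffs)            # all sub-bands in one array
    threshold = frac * np.abs(arr).max()                  # 0.075 of the largest coefficient
    arr[np.abs(arr) < threshold] = 0.0                    # hard-threshold "quantization"
    degraded = pywt.waverec2(
        pywt.array_to_coeffs(arr, slices, output_format="wavedec2"), wavelet
    )
    return degraded[: image.shape[0], : image.shape[1]]   # waverec2 may pad odd-sized images


def ssim_quality_features(image, n_bins=32):
    """Return the mean SSIM and a histogram of the SSIM map (lower SSIM = sharper original)."""
    image = np.asarray(image, dtype=np.float64)
    degraded = degrade_by_wavelet_quantization(image)
    mean_ssim, ssim_map = structural_similarity(
        image, degraded, data_range=image.max() - image.min(), full=True
    )
    hist, _ = np.histogram(ssim_map, bins=n_bins, range=(-1.0, 1.0), density=True)
    return mean_ssim, hist
```

The histogram of the SSIM map would then serve as the feature vector for the classification step described under Results.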

 
Results:
 

Using the derived quality measure, or quality-related features, the 56 images could be classified (by regression trees and thresholding) into low- and high-quality classes. The classification agrees with the human observers, with a classification accuracy of 87–98% (leave-one-out estimation). The best results were obtained from a histogram of the SSIM map derived using a Haar wavelet and a quantization threshold of 0.075 times the highest value in wavelet space. The figure shows the agreement of the automatic method with human observers.
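A minimal sketch of such a leave-one-out evaluation is given below, assuming scikit-learn's regression trees stand in for the (unspecified) tree implementation; `features` would hold one SSIM-histogram row per image and `grades` the binary human quality labels. The tree depth and the 0.5 decision threshold are illustrative assumptions, not values from the study.

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.tree import DecisionTreeRegressor


def leave_one_out_accuracy(features, grades, threshold=0.5):
    """Leave-one-out accuracy of a regression tree whose output is thresholded into two classes."""
    features = np.asarray(features)
    grades = np.asarray(grades, dtype=float)               # 0 = low quality, 1 = high quality
    correct = 0
    for train_idx, test_idx in LeaveOneOut().split(features):
        tree = DecisionTreeRegressor(max_depth=3)          # shallow tree; depth is a guess
        tree.fit(features[train_idx], grades[train_idx])
        predicted_high = tree.predict(features[test_idx])[0] >= threshold
        correct += int(predicted_high == bool(grades[test_idx][0]))
    return correct / len(grades)
```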

 
Conclusions:
 

We have developed an automatic procedure that can be used to measure retinal image quality. The results are in good agreement with human graders.

 
Keywords: image processing • imaging/image analysis: clinical • retina 