Abstract
Purpose:
Fundus images obtained using handheld cameras have been observed to vary widely in image quality. Since the images are acquired by operators with different levels of experience and under vastly different operating conditions, the image quality produced by the same hardware can differ greatly. We therefore aimed to develop an automated, objective image quality metric that grades input images in real time, allowing the operator to rescan the patient when image quality is poor. Here we present such a tool, built by training a set of classifiers to grade a fundus image using relevant features extracted from the image. The trained classifier then automatically classifies an input fundus image as clinically gradable or not.
Methods:
We used 1888 color fundus images taken with the VISUSCOUT® 100 (ZEISS, Jena, Germany) handheld fundus camera as the training set. These images were manually classified by experts as gradable/non-gradable. Standard data augmentation techniques such as rotation and mirroring were used to increase the sample size. We used a separate dataset of 2407 gradable and 130 non-gradable images as the test set.
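The rotation and mirroring augmentation mentioned above can be sketched as follows. The abstract does not specify the rotation angles or flip axes used, so 90-degree rotations and a horizontal mirror are shown purely as an assumption:

```python
import numpy as np

def augment(image):
    """Generate rotated and mirrored variants of a fundus image array.

    A minimal sketch of rotation & mirroring augmentation; the actual
    angles used in the study are not specified, so 90/180/270-degree
    rotations and a horizontal flip are assumptions for illustration.
    """
    variants = [image]
    variants.extend(np.rot90(image, k) for k in (1, 2, 3))  # 90/180/270 deg
    variants.append(np.fliplr(image))                        # horizontal mirror
    return variants
```

Each original image thus yields five training samples; in practice, small random rotations are also common for fundus images since the retina has no single canonical orientation.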
Some of the relevant criteria that define image quality are image sharpness/focus, signal-to-noise ratio, localizability of the optic disc, and the presence of non-fundus input. Using the training dataset, six features were extracted from each fundus image: optic disc present/absent, extent of upper vessel coverage, extent of lower vessel coverage, light leakage present/absent, brightness, and illumination variance.
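The abstract does not define how the two intensity-based features (brightness and illumination variance) are computed. One plausible sketch, taking brightness as the mean pixel intensity and illumination variance as the variance of per-tile mean intensities over a 4×4 grid, is shown below; both definitions are assumptions for illustration only:

```python
import numpy as np

def intensity_features(gray):
    """Hypothetical definitions of the two intensity features:
    brightness = mean pixel intensity of the grayscale image;
    illumination variance = variance of mean intensities over a
    4x4 grid of tiles (captures uneven illumination)."""
    brightness = float(gray.mean())
    h, w = gray.shape
    tiles = [gray[i * h // 4:(i + 1) * h // 4,
                  j * w // 4:(j + 1) * w // 4].mean()
             for i in range(4) for j in range(4)]
    illumination_variance = float(np.var(tiles))
    return brightness, illumination_variance
```

A uniformly lit image yields zero illumination variance, while vignetting or light leakage inflates it, which is why such a feature can separate gradable from non-gradable images.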
This set of six image features, determined for each training image, was then used to train four classifiers: Random Forest, Decision Tree, K-Nearest Neighbors, and SVM. Testing was then performed on the test dataset with known image quality classification, and sensitivity/specificity values were calculated for each of the four classifiers.
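The sensitivity/specificity evaluation described above reduces to counting the confusion matrix entries over the test set. A minimal sketch (with "gradable" assumed as the positive class; classifier training itself is omitted) is:

```python
def sensitivity_specificity(y_true, y_pred):
    """Compute sensitivity and specificity from true and predicted labels.

    Assumes label 1 = gradable (positive class), 0 = non-gradable.
    Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)
```

With the heavily imbalanced test set here (2407 gradable vs. 130 non-gradable), reporting sensitivity and specificity separately, rather than overall accuracy, is the appropriate choice, since a trivial "always gradable" classifier would already score about 95% accuracy.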
Results:
The sensitivity/specificity results for the four classifiers are given in Table 1.
Conclusions:
The presented classifiers, trained on the relevant fundus image features, showed good accuracy in classifying VISUSCOUT 100 images as gradable/non-gradable in an automated manner. They could be used as a pre-filter for a fundus-image-based remote screening program, allowing for better clinical outcomes.
This is an abstract that was submitted for the 2017 ARVO Annual Meeting, held in Baltimore, MD, May 7-11, 2017.