Abstract
Purpose:
To describe the development of a deep convolutional neural network (CNN) for automated assessment of optic disc photograph quality.
Methods:
Using a code-free deep learning platform, a total of 2377 optic disc photographs were used to develop a deep CNN capable of determining optic disc photograph quality. Of these, 1002 were of good quality, 609 of acceptable quality, and 766 of poor quality, as determined by expert graders. The dataset was split 80/10/10 into training, validation, and test sets. Both a ternary classification model (good, acceptable, and poor quality) and a binary classification model (usable vs. unusable quality) were developed. The main outcome measures were overall accuracy and area under the receiver operating characteristic curve (AUC), which for the ternary model was calculated with a one-vs-rest approach.
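The abstract does not include implementation details (the model was built on a code-free platform), but as a rough illustration of the evaluation approach described above, the Python sketch below shows an 80/10/10 stratified split and the computation of overall accuracy and one-vs-rest AUC with scikit-learn. All arrays, label encodings, and probability values are hypothetical placeholders, not the study's data or code.

```python
# Minimal sketch (not the study's code): an 80/10/10 split and one-vs-rest AUC
# for a three-class (good / acceptable / poor) quality classifier.
# All arrays below are illustrative placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(0)

# Placeholder image identifiers and expert-assigned labels
# (0 = good, 1 = acceptable, 2 = poor).
image_ids = np.arange(2377)
labels = rng.integers(0, 3, size=2377)

# 80/10/10 stratified split into training, validation, and test sets.
train_ids, holdout_ids, y_train, y_holdout = train_test_split(
    image_ids, labels, test_size=0.20, stratify=labels, random_state=0)
val_ids, test_ids, y_val, y_test = train_test_split(
    holdout_ids, y_holdout, test_size=0.50, stratify=y_holdout, random_state=0)

# Placeholder predicted class probabilities for the test set; in practice
# these would come from the trained CNN.
probs = rng.dirichlet(np.ones(3), size=len(test_ids))
preds = probs.argmax(axis=1)

# Overall accuracy and one-vs-rest AUC (macro-averaged over the three classes).
print("accuracy:", accuracy_score(y_test, preds))
print("one-vs-rest AUC:", roc_auc_score(y_test, probs, multi_class="ovr"))

# Binary collapse: good/acceptable -> usable (1), poor -> unusable (0).
y_bin = (y_test != 2).astype(int)
p_usable = probs[:, 0] + probs[:, 1]
print("binary AUC:", roc_auc_score(y_bin, p_usable))
```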
Results:
In the ternary classification system, the model achieved an accuracy of 91% and an AUC of 0.98. Predictive accuracy was higher for images of good (93%) and poor (96%) quality than for images of acceptable quality (91%). The binary model achieved an accuracy of 98% and an AUC of 0.99. When validated on 292 images not included in the original training/validation/test dataset, the model's accuracy was 85% on the ternary classification task and 97% on the binary classification task.
Conclusions:
The proposed system for automated image quality assessment of optic disc photographs achieves high accuracy with both ternary and binary classification schemes. Our model highlights what can be achieved with a code-free platform, which can substantially reduce the in-house computational resources and deep-learning expertise typically required to develop a neural network. Such a model has wide clinical and research potential, with applications ranging from integration into fundus camera software to provide real-time feedback, to pre-screening large databases before clinical evaluation or use in further research.
This abstract was presented at the 2023 ARVO Annual Meeting, held in New Orleans, LA, April 23-27, 2023.