Investigative Ophthalmology & Visual Science
July 2019
Volume 60, Issue 9
Open Access
ARVO Annual Meeting Abstract  |   July 2019
Microaneurysm detection in retinal fundus images using deep convolutional U-net with focal loss objective function.
Author Affiliations & Notes
  • Jakob Holm Andersen
    The Maersk Mc-Kinney Moller Institute, Odense M, Denmark
    Steno Diabetes Center Odense, Odense, Denmark
  • Jakob Grauslund
    Department of Ophthalmology, Odense University Hospital, Odense, Denmark
    Steno Diabetes Center Odense, Odense, Denmark
  • Thiusius Rajeeth Savarimuthu
    The Maersk Mc-Kinney Moller Institute, Odense M, Denmark
  • Footnotes
    Commercial Relationships   Jakob Andersen, None; Jakob Grauslund, None; Thiusius Savarimuthu, None
    Support  SDCOs forskningspuljer (SDCO research funds) A2665
Investigative Ophthalmology & Visual Science July 2019, Vol. 60, 1520.
Abstract

Purpose : Given that microaneurysms (MAs), the earliest lesions in diabetic retinopathy (DR), independently predict sight-threatening stages of DR, early diagnosis is important for identifying patients at risk. Recently, artificial intelligence and deep neural network (DNN) algorithms have been studied extensively for computer-assisted diagnosis of diabetes-related eye diseases. While these algorithms seem able to recognize macroscopic signs of disease, they often overlook more subtle, microscopic indications such as MAs. This yields good performance in detecting more severe cases, while performance for early-stage disease detection remains relatively low. The purpose of this study was to investigate the U-net DNN, equipped with a specialized objective function, for detection of retinal MAs in DR.
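
The abstract does not specify the network's depth, channel widths, or software framework. As a point of reference only, the sketch below shows the characteristic U-net structure (contracting path, bottleneck, expanding path with a skip connection) that produces the per-pixel output needed for MA detection; PyTorch, the layer sizes, and all names here are assumptions, not the authors' implementation.

import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU: the standard U-net building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    # Minimal two-level U-net ending in a 1-channel logit map (hypothetical sizes).
    def __init__(self):
        super().__init__()
        self.enc = conv_block(3, 32)                       # contracting path
        self.pool = nn.MaxPool2d(2)
        self.mid = conv_block(32, 64)                      # bottleneck
        self.up = nn.ConvTranspose2d(64, 32, 2, stride=2)  # upsampling
        self.dec = conv_block(64, 32)                      # expanding path
        self.head = nn.Conv2d(32, 1, 1)                    # per-pixel MA logits

    def forward(self, x):
        e = self.enc(x)
        m = self.mid(self.pool(e))
        # Skip connection: concatenate encoder features with upsampled ones.
        d = self.dec(torch.cat([self.up(m), e], dim=1))
        return self.head(d)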

Methods : We trained the DNN on single-field 45-degree retinal images (n=109) from a public database with pixel-level annotations of MAs. Images were preprocessed, and the dataset was enlarged using common data augmentation methods. We used an objective function known as the focal loss, which encourages the algorithm to focus on hard-to-detect features such as MAs (a sketch follows below). The DNN was validated on test images from the same database (n=40), and another test set (n=50) was obtained from a second public database. Performance was assessed using the free-response receiver operating characteristic (FROC) score, calculated as the mean sensitivity at seven average false positives per image (FPI) values: 0.125, 0.250, 0.500, 1, 2, 4 and 8.
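
The focal loss (Lin et al., 2017) rescales the per-pixel cross-entropy as FL(p_t) = -alpha_t (1 - p_t)^gamma log(p_t), so that well-classified background pixels are down-weighted and the rare, hard lesion pixels dominate the gradient. A minimal sketch of a binary per-pixel version, assuming PyTorch and the common defaults alpha = 0.25 and gamma = 2 (the abstract does not report the hyperparameters used):

import torch
import torch.nn.functional as F

def focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    # Unreduced per-pixel binary cross-entropy, reweighted below.
    bce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p = torch.sigmoid(logits)
    # p_t is the predicted probability of the true class of each pixel.
    p_t = targets * p + (1 - targets) * (1 - p)
    alpha_t = targets * alpha + (1 - targets) * (1 - alpha)
    # (1 - p_t)**gamma shrinks the loss on easy pixels toward zero.
    return (alpha_t * (1 - p_t) ** gamma * bce).mean()

The loss would be applied to the U-net's logit map and the pixel-level MA annotations in place of plain cross-entropy.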

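A minimal sketch of the FROC score as defined above: sweep the detection threshold to obtain (average FPI, sensitivity) operating points, then average the sensitivities at the seven required FPI values. Linear interpolation between measured points is an assumption; the abstract does not state how intermediate FPI values were handled.

import numpy as np

def froc_score(sensitivities, avg_fpi,
               fpi_points=(0.125, 0.25, 0.5, 1.0, 2.0, 4.0, 8.0)):
    # Sort the curve by FPI so np.interp sees increasing x-values.
    order = np.argsort(avg_fpi)
    fpi = np.asarray(avg_fpi, dtype=float)[order]
    sens = np.asarray(sensitivities, dtype=float)[order]
    # Mean sensitivity at the seven average-FPI values.
    return float(np.interp(fpi_points, fpi, sens).mean())
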
Results : The DNN achieved an FROC score of 0.482 on the first set of test images, with a sensitivity of 0.550 at a suggested operating point of 1.08 FPI. On the second test set, the network achieved an FROC score of 0.228 with a sensitivity of 0.221 at 1.08 FPI, both of which are higher than previously reported results for DNN algorithms on that dataset.

Conclusions : We show that a DNN trained with the focal loss objective function achieves results comparable to other automatic methods for detection of MAs. Additionally, the results indicate that our method generalizes well when evaluated on images from databases other than the one used for training. This is important to consider, as generalizability affects the applicability of DNNs to tasks such as MA detection in real-life settings.

This abstract was presented at the 2019 ARVO Annual Meeting, held in Vancouver, Canada, April 28 - May 2, 2019.
