Investigative Ophthalmology & Visual Science
June 2024
Volume 65, Issue 7
Open Access
ARVO Annual Meeting Abstract | June 2024
Automated Classification of Choroidal Nevi using Data Augmentation and a Patch-Based Deep Learning Approach with Fundus Images
Author Affiliations & Notes
  • Mehregan Biglarbeiki
    Electrical and Software Eng, University of Calgary, Calgary, Alberta, Canada
  • Ezekiel Weis
    University of Alberta Faculty of Medicine & Dentistry, Edmonton, Alberta, Canada
  • Emad Mohammed
    Thompson Rivers University Faculty of Science, Kamloops, British Columbia, Canada
  • Roberto Souza
    University of Calgary, Calgary, Alberta, Canada
  • Behrouz Far
    University of Calgary, Calgary, Alberta, Canada
  • Trafford Crump
    University of Calgary Cumming School of Medicine, Calgary, Alberta, Canada
  • Footnotes
    Commercial Relationships   Mehregan Biglarbeiki None; Ezekiel Weis None; Emad Mohammed None; Roberto Souza None; Behrouz Far None; Trafford Crump None
  • Footnotes
    Support  None
Investigative Ophthalmology & Visual Science June 2024, Vol.65, OD57. doi:
Abstract

Purpose : Using deep learning to automate the classification of colour fundus images based on the presence of choroidal nevi is challenging due to the limited availability of labelled data. To overcome this limitation, this study evaluates the performance of a patch-based deep learning model on this task.

Methods : This study uses 580 fundus images collected and labelled by the Alberta Ocular Brachytherapy Program as part of routine clinical practice. Half of the images were labelled as “Lesion” and the other half as “Normal”. A YOLOv8 classification model (40 training epochs, a batch size of 16, and initial learning rates of 0.001 for AdamW and 0.01 for SGD) was used in three experiments. In Experiment 1, the original full-size images (3918 x 3916 pixels) were resized to 600 x 600 pixels. Random augmentations were applied during training to improve the model’s generalizability. In Experiment 2, full-size images were resized to 3000 x 3000 pixels and then uniformly divided into 25 patches. Each patch was relabelled based on the presence of any portion of a nevus. To address the class imbalance introduced by patching, data augmentations, including changes in hue and brightness along with random rotation and flipping, were applied to patches labelled as “Lesion”. The augmentations from Experiment 1 were also applied during training. In Experiment 3, we applied the same augmentation methods used in Experiment 2 to selected images with noise, and we also randomly reduced the contrast of selected images. The same augmentations used in Experiments 1 and 2 were applied during training. In each experiment, performance was measured using accuracy, precision, and recall.
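
The abstract does not include code, but the uniform 5 x 5 patching of a 3000 x 3000 image and the patch-relabelling rule from Experiment 2 could be sketched as follows. The function names and the use of a binary lesion mask to decide patch labels are illustrative assumptions, not details taken from the study:

```python
import numpy as np

def split_into_patches(image, grid=5):
    """Uniformly divide a square array into grid x grid equal patches
    (25 patches of 600 x 600 for a 3000 x 3000 input)."""
    h, w = image.shape[:2]
    ph, pw = h // grid, w // grid
    patches = []
    for r in range(grid):
        for c in range(grid):
            patches.append(image[r * ph:(r + 1) * ph, c * pw:(c + 1) * pw])
    return patches

def relabel_patch(patch_mask):
    """A patch is relabelled 'Lesion' if any portion of a nevus
    (any nonzero pixel of a hypothetical lesion mask) falls inside it."""
    return "Lesion" if patch_mask.any() else "Normal"

# Toy example: a 3000 x 3000 lesion mask with one small nevus near the centre.
mask = np.zeros((3000, 3000), dtype=bool)
mask[1450:1550, 1450:1550] = True
labels = [relabel_patch(p) for p in split_into_patches(mask)]
# Only the single central patch receives the 'Lesion' label, illustrating the
# class imbalance that motivated the patch-specific augmentations.
```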

Results : Experiment 1 resulted in an accuracy of 85.2%, a precision of 83.0%, and a recall of 87.0%. Experiment 2 resulted in an accuracy of 90.3%, a precision of 91.4%, and a recall of 88.1%. Experiment 3 resulted in an accuracy of 92.6%, a precision of 93.8%, and a recall of 90.1%.

Conclusions : The YOLOv8 model with patch-specific augmentation targeting noise and contrast issues generated the best results. This study demonstrates the adaptability of the YOLOv8 model for improved accuracy in challenging fundus image classification scenarios with limited labelled data.

This abstract was presented at the 2024 ARVO Annual Meeting, held in Seattle, WA, May 5-9, 2024.
