Open Access
ARVO Annual Meeting Abstract  |   July 2019
Extraction of Patient Specific Information from Fundus Images in the Wild
Author Affiliations & Notes
  • Marion Ronit Munk
    Inselspital, University Hospital Bern, Bern, Switzerland
    Ophthalmology, Northwestern University, Feinberg School of Medicine, Chicago, Illinois, United States
  • Thomas Kurmann
    ARTORG, University Clinic Bern, Switzerland
  • Pablo Marquez-Neila
    ARTORG, University Clinic Bern, Switzerland
  • Martin Sebastian Zinkernagel
    Inselspital, University Hospital Bern, Bern, Switzerland
  • Sebastian Wolf
    Inselspital, University Hospital Bern, Bern, Switzerland
  • Raphael Sznitman
    ARTORG, University Clinic Bern, Switzerland
  • Footnotes
    Commercial Relationships   Marion Munk, Bayer (F), Gensight (C), Lumithera (C), Zeiss (C); Thomas Kurmann, None; Pablo Marquez-Neila, None; Martin Zinkernagel, Bayer (F), Heidelberg (C), Novartis (C); Sebastian Wolf, Bayer (C), Heidelberg (C), Novartis (C), Zeiss (C); Raphael Sznitman, None
  • Footnotes
    Support  None
Investigative Ophthalmology & Visual Science July 2019, Vol.60, 4803. doi:
      Marion Ronit Munk, Thomas Kurmann, Pablo Marquez-Neila, Martin Sebastian Zinkernagel, Sebastian Wolf, Raphael Sznitman; Extraction of Patient Specific Information from Fundus Images in the Wild. Invest. Ophthalmol. Vis. Sci. 2019;60(9):4803.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Purpose:
Fundus imaging is used for the diagnosis and treatment planning of a broad range of retinal pathologies. Among these is diabetic retinopathy, for which previous studies have shown that fundus images contain patient-specific information such as gender, age and cardiovascular risk factors. Given the large spectrum of diseases typically found in a clinical setting, we hypothesize that a machine-learning-based model can extract the gender of a patient from fundus images acquired over a large and heterogeneous clinical population.

Methods: N=16196 patients (8180 female, 8016 male; mean age 57.87 years, range 45 to 74) with a total of N=135690 fundus images were included. All images were included irrespective of image quality, number of channels or device manufacturer. The dataset was divided into a training set of N=121960 images and a test set of N=13730 images. We trained an ensemble of 10 deep learning classifiers to classify every fundus image as female or male. Each classifier is based on a dilated residual network architecture pre-trained on ImageNet. We minimized the cross-entropy loss with L2 regularization using stochastic gradient descent with early stopping. The area under the receiver operating characteristic curve (AUC), specificity, sensitivity and accuracy were assessed.
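As an illustration of this training setup, the following is a minimal sketch of one ensemble member in Python/PyTorch. The abstract does not specify the backbone, hyperparameters or schedule, so the ResNet-50 with dilated final stages, the learning rate, weight decay, batch size and early-stopping patience are all assumptions, and the data loaders use synthetic tensors in place of real fundus images.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from torchvision import models

def build_gender_classifier():
    # ImageNet-pretrained ResNet-50; dilating the last two stages stands in for
    # the "dilated residual network architecture" named above (an assumption).
    net = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1,
                          replace_stride_with_dilation=[False, True, True])
    net.fc = nn.Linear(net.fc.in_features, 2)  # two logits: female / male
    return net

def toy_loader(n):
    # Synthetic stand-in for preprocessed fundus images and gender labels.
    x, y = torch.randn(n, 3, 224, 224), torch.randint(0, 2, (n,))
    return DataLoader(TensorDataset(x, y), batch_size=4)

def evaluate(model, loader):
    # Mean cross-entropy on a held-out set, used for early stopping.
    model.eval()
    with torch.no_grad():
        return sum(criterion(model(x), y).item() for x, y in loader) / len(loader)

model = build_gender_classifier()
criterion = nn.CrossEntropyLoss()  # cross-entropy loss, as in Methods
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9,
                            weight_decay=1e-4)  # weight decay acts as L2 regularization
train_loader, val_loader = toy_loader(8), toy_loader(4)

best_val, patience, bad_epochs = float("inf"), 2, 0
for epoch in range(10):
    model.train()
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    val_loss = evaluate(model, val_loader)
    if val_loss < best_val:  # early stopping on held-out loss
        best_val, bad_epochs = val_loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            break

At test time, the predictions of the 10 trained members would typically be combined, for example by averaging their softmax probabilities; the abstract does not state the exact aggregation rule.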

Results: Our method achieved an AUC of 0.829, with a sensitivity of 0.687, a specificity of 0.802 and an accuracy of 0.74. We randomly sampled 260 predictions and manually binned them into four classes: 145 correct predictions with a visible fovea and optic disc; 61 incorrect predictions with a visible fovea and optic disc; 25 correct predictions with a non-visible fovea and optic disc; and 29 incorrect predictions with a non-visible fovea and optic disc. Accuracy was thus 70% (145/206) when the fovea and optic disc were visible but near chance (25/54, 46%) when they were not, suggesting that better-than-chance gender prediction is only possible when the fovea and optic disc are visible on fundus images.
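For reference, the reported quantities follow the standard definitions in the scikit-learn sketch below; the labels and scores are toy values, and both the 0.5 decision threshold and the female-positive label encoding are assumptions.

import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

# Toy ground truth (1 = female, 0 = male; encoding assumed) and ensemble scores.
y_true  = np.array([1, 0, 1, 1, 0, 0, 1, 0])
y_score = np.array([0.9, 0.2, 0.7, 0.4, 0.6, 0.1, 0.8, 0.3])

auc = roc_auc_score(y_true, y_score)            # reported above: 0.829
y_pred = (y_score >= 0.5).astype(int)           # 0.5 threshold is an assumption
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)                    # reported: 0.687
specificity = tn / (tn + fp)                    # reported: 0.802
accuracy = (tp + tn) / (tp + tn + fp + fn)      # reported: 0.74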

Conclusions:
Deep learning methods can classify the gender of a patient from fundus images over a broad spectrum of patients and pathologies. This appears to hold even when images are included irrespective of quality, occlusion or other degradation. However, our results also highlight the difficulties of using fundus images acquired in daily clinical practice.

This abstract was presented at the 2019 ARVO Annual Meeting, held in Vancouver, Canada, April 28 - May 2, 2019.
