Open Access
ARVO Annual Meeting Abstract  |   June 2023
Training ophthalmologists on novel retinal characteristics uncovered using artificial intelligence
Author Affiliations & Notes
  • Gulcenur Ozturan
    Ophthalmology & Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada
  • Lei Yuan
    Pharmacology, University of British Columbia, Vancouver, British Columbia, Canada
  • Ipek Oruc
    Ophthalmology & Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada
    Neuroscience, University of British Columbia, Vancouver, British Columbia, Canada
  • Footnotes
    Commercial Relationships: Gulcenur Ozturan, None; Lei Yuan, None; Ipek Oruc, None
    Support: Natural Sciences and Engineering Research Council of Canada Discovery Grant RGPIN-2019-05554
Investigative Ophthalmology & Visual Science June 2023, Vol. 64, 4994.
Gulcenur Ozturan, Lei Yuan, Ipek Oruc; Training ophthalmologists on novel retinal characteristics uncovered using artificial intelligence. Invest. Ophthalmol. Vis. Sci. 2023;64(8):4994.

Abstract

Purpose: Retinal images are commonly used to diagnose and manage ocular diseases. The retina can also reveal clues about cardiovascular, neurological, and systemic health, though many of these signs are overlooked or perhaps presently unknown. In recent work, we proposed a methodology for extracting novel retinal characteristics from a deep-learning model trained to classify fundus images. Using this approach, we uncovered previously unidentified retinal features that differ between females and males, a patient trait that ophthalmologists do not currently recognize in fundus images. Here, we examine whether human observers can learn to recognize patient sex in fundoscopic images.
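
The abstract does not detail the underlying deep-learning pipeline. As a rough illustration only, a fundus-image sex classifier of this general kind can be built by fine-tuning a pretrained convolutional network; the sketch below is a minimal, hypothetical example in PyTorch, where the dataset layout, architecture choice, and hyperparameters are all assumptions, not the authors' implementation.

```python
# Hypothetical sketch: fine-tune an ImageNet-pretrained CNN to classify
# patient sex from fundus photographs. All paths and hyperparameters are
# illustrative assumptions, not the setup used in the abstract.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],   # ImageNet statistics
                         std=[0.229, 0.224, 0.225]),
])

# Assumes fundus images sorted into 'female/' and 'male/' subfolders.
dataset = datasets.ImageFolder("fundus_images/", transform=transform)
loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, 2)  # two classes: female, male

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:  # one illustrative training epoch
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```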

Methods: We developed a training paradigm consisting of didactic and practical components. In the didactic component, participants viewed a short presentation describing retinal characteristics found to differ between males and females (a brighter peripapillary region in females; greater vascular prominence in the superior temporal quadrant in males), along with visuals of how these may present in fundoscopic images. In the practical component, participants completed up to three 50-trial blocks of a sex-recognition task in which they chose the male image among male-female pairs in a two-alternative forced-choice (2-AFC) paradigm, with feedback highlighting the correct choice. A separate block of 200 2-AFC trials without feedback, containing no images seen during practice, was used to assess sex-recognition performance. Participants also completed a novel object memory test (NOMT) to assess general object-recognition ability. Fifty-four participants completed the study (mean age 34.2 years; 32 females; 23 ophthalmologists).
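
For concreteness, the scoring logic of a 2-AFC block can be expressed in a few lines. The sketch below is hypothetical; the trial records, image names, and feedback format are assumptions, but the structure (pick the male image from each randomized male-female pair, with chance accuracy at 0.5) follows the paradigm described above.

```python
# Hypothetical sketch of scoring a two-alternative forced-choice (2-AFC)
# sex-recognition block, following the paradigm in the Methods.
import random

def run_2afc_block(pairs, choose_fn, give_feedback=False):
    """pairs: list of (male_image, female_image); choose_fn picks one image."""
    correct = 0
    for male_img, female_img in pairs:
        options = [male_img, female_img]
        random.shuffle(options)        # randomize presentation order
        choice = choose_fn(options)    # observer tries to pick the male image
        if choice == male_img:
            correct += 1
        if give_feedback:
            print("Correct" if choice == male_img
                  else f"Incorrect; the male image was {male_img}")
    return correct / len(pairs)        # proportion correct; chance = 0.5

# A simulated observer guessing at random scores ~0.5, matching the
# pre-training accuracy reported in the Results.
pairs = [(f"male_{i}.png", f"female_{i}.png") for i in range(50)]
accuracy = run_2afc_block(pairs, choose_fn=random.choice)
```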

Results: Pre-training sex-recognition accuracy was M = 0.52, consistent with chance-level performance, and did not differ between ophthalmologists and the non-expert group (p > 0.9). Post-training performance was significantly higher, M = 0.66 (d = 1.89, p << 0.01; ophthalmologists: M = 0.66, d = 2.38, p << 0.01). Performance on the NOMT was not related to improvement in fundus classification.
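
The abstract does not name the statistical tests used. One plausible reading is a paired comparison of pre- versus post-training accuracy with Cohen's d for paired samples, plus an exact binomial test for the chance-level claim; the sketch below illustrates those computations on made-up per-participant accuracies, since the abstract reports only group means.

```python
# Hypothetical illustration of the reported statistics; the accuracies
# below are invented, as the abstract gives only group-level means.
import numpy as np
from scipy import stats

pre = np.array([0.49, 0.56, 0.50, 0.47, 0.58])   # hypothetical pre-training
post = np.array([0.60, 0.72, 0.63, 0.66, 0.69])  # hypothetical post-training

# Cohen's d for paired samples: mean improvement over SD of improvements.
diff = post - pre
d = diff.mean() / diff.std(ddof=1)
t, p = stats.ttest_rel(post, pre)

# Chance-level check: 52% correct over a 200-trial 2-AFC block (104/200).
p_chance = stats.binomtest(104, n=200, p=0.5).pvalue

print(f"d = {d:.2f}, paired t-test p = {p:.3g}, chance-test p = {p_chance:.2f}")
```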

Conclusions: These results show that ophthalmologists do not spontaneously recognize patient sex in fundus images, yet they can be trained to do so. Future work can extend this approach to discover novel retinal signs of systemic and neurodegenerative disease and add them to the toolkit of ophthalmologists.

This abstract was presented at the 2023 ARVO Annual Meeting, held in New Orleans, LA, April 23-27, 2023.
