Volume 62, Issue 8
Open Access
ARVO Annual Meeting Abstract  |   June 2021
Analysis of agreement of clinical experts to B-scan of interest algorithm
Author Affiliations & Notes
  • Mahsa Darvishzadeh Varcheie
    Carl Zeiss Meditec Inc, Dublin, California, United States
  • Katherine Makedonsky
    Carl Zeiss Meditec Inc, Dublin, California, United States
  • Gary C Lee
    Carl Zeiss Meditec Inc, Dublin, California, United States
  • Vidya Kulkarni
    Carl Zeiss Meditec Inc, Dublin, California, United States
  • Hugang Ren
    Carl Zeiss Meditec Inc, Dublin, California, United States
  • Aaron Lech
    ClearVue Eye Care, Roseville, California, United States
  • Laura Tracewell
    Carl Zeiss Meditec Inc, Dublin, California, United States
  • Rishi P Singh
    Center for Ophthalmic Bioinformatics, Cleveland Clinic Cole Eye Institute, Cleveland, Ohio, United States
  • Katherine Talcott
    Center for Ophthalmic Bioinformatics, Cleveland Clinic Cole Eye Institute, Cleveland, Ohio, United States
  • Niranchana Manivannan
    Carl Zeiss Meditec Inc, Dublin, California, United States
  • Mary K Durbin
    Carl Zeiss Meditec Inc, Dublin, California, United States
  • Footnotes
    Commercial Relationships   Mahsa Darvishzadeh Varcheie, Carl Zeiss Meditec (E); Katherine Makedonsky, Carl Zeiss Meditec (E); Gary Lee, Carl Zeiss Meditec (E); Vidya Kulkarni, Carl Zeiss Meditec (E); Hugang Ren, Carl Zeiss Meditec (E); Aaron Lech, Carl Zeiss Meditec (C), Carl Zeiss Meditec (F); Laura Tracewell, Carl Zeiss Meditec (C); Rishi Singh, Aerie (F), Alcon (C), Apellis (F), Bausch and Lomb (C), Genentech (C), Graybug (F), Gyroscope (C), Novartis (C), Regeneron (C); Katherine Talcott, Carl Zeiss Meditec (F), Genentech (C), Roche (C); Niranchana Manivannan, Carl Zeiss Meditec (E); Mary Durbin, Carl Zeiss Meditec (E)
  • Footnotes
    Support  None
Investigative Ophthalmology & Visual Science June 2021, Vol. 62, 2450.
Mahsa Darvishzadeh Varcheie, Katherine Makedonsky, Gary C Lee, Vidya Kulkarni, Hugang Ren, Aaron Lech, Laura Tracewell, Rishi P Singh, Katherine Talcott, Niranchana Manivannan, Mary K Durbin; Analysis of agreement of clinical experts to B-scan of interest algorithm. Invest. Ophthalmol. Vis. Sci. 2021;62(8):2450.

© ARVO (1962-2015); The Authors (2016-present)
Abstract

Purpose : Optical coherence tomography (OCT) scans are important for clinical evaluation of the retina, but manual evaluation of each B-scan is time-consuming and requires significant expertise. To improve the workflow, a B-scan of interest (BSOI) tool was created using deep learning to indicate B-scans of interest as well as B-scans of poor image quality (IQ). The purpose of this study was to determine whether the BSOI tool agreed with doctors' clinical judgement.

Methods : Macular cube 512x128 scans were acquired using CIRRUS™ 6000 (ZEISS, Dublin, CA) on 47 subjects with various pathologies, including macular degeneration, diabetic eye disease, epiretinal membranes, macular hole, and central serous retinopathy. The BSOI tool first runs an IQ algorithm that flags B-scans with poor IQ, then further flags good-IQ B-scans in which it finds one of the retinal pathologies referenced in [1]. The prototype tool was integrated into the macular thickness analysis report, flagging B-scans of interest with a red indicator and poor-IQ B-scans with a yellow indicator.
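
For illustration only, a minimal Python sketch of such a two-stage flagging pass; the model names, call signatures, thresholds, and flag labels here are assumptions, not the shipped implementation:

# Hypothetical two-stage flagging pass over a macular cube.
# `iq_model` and `pathology_model` stand in for the deep-learning models;
# their names, signatures, and thresholds are assumed for illustration.
IQ_THRESHOLD = 0.5         # assumed cutoff separating poor from good IQ
PATHOLOGY_THRESHOLD = 0.5  # assumed cutoff for a B-scan of interest

def flag_cube(bscans, iq_model, pathology_model):
    """Return one flag per B-scan in a 512x128 macular cube."""
    flags = []
    for bscan in bscans:  # 128 B-scans per cube
        if iq_model(bscan) < IQ_THRESHOLD:
            flags.append("yellow")   # poor image quality indicator
        elif pathology_model(bscan) > PATHOLOGY_THRESHOLD:
            flags.append("red")      # B-scan of interest indicator
        else:
            flags.append(None)       # unflagged
    return flags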
Five independent expert graders (two retina specialists (RS) and three optometrists (OD)) evaluated the tool's results and used a feedback feature to indicate whether they agreed (thumbs up) or disagreed (thumbs down) (Fig 1). The intraclass correlation coefficient (ICC) and pairwise Cohen's kappa between graders were used to assess inter-grader consensus. Graders also answered a survey rating the clinical utility and loading time of the tool.
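
As a reference for the statistics used, a short Python sketch computing an ICC and pairwise Cohen's kappas from a grader-by-case table of thumbs-up/down feedback; the table values below are toy placeholders, not study data:

from itertools import combinations

import pandas as pd
import pingouin as pg
from sklearn.metrics import cohen_kappa_score

# rows = flagged cases, columns = graders; 1 = thumbs up, 0 = thumbs down
# (toy values for illustration only, not the study's ratings)
grades = pd.DataFrame({
    "RS1": [1, 0, 1, 1, 0],
    "RS2": [1, 1, 1, 0, 0],
    "OD1": [1, 0, 1, 1, 1],
    "OD2": [0, 0, 1, 1, 0],
    "OD3": [1, 0, 0, 1, 0],
})

# ICC expects long format: one row per (case, grader) rating
long_form = grades.reset_index().melt(
    id_vars="index", var_name="grader", value_name="score")
icc = pg.intraclass_corr(data=long_form, targets="index",
                         raters="grader", ratings="score")
print(icc[["Type", "ICC", "CI95%"]])

# Pairwise Cohen's kappa between every pair of graders
for a, b in combinations(grades.columns, 2):
    print(a, b, round(cohen_kappa_score(grades[a], grades[b]), 2))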

Results : Mean agreement of an individual grader with the tool was 79%, and majority-vote agreement between graders and the tool was 81%. Table 1 shows the agreement (with 95% confidence interval) of each grader and the survey answers. The ICC was 0.44 (95% CI: 0.32, 0.60), indicating poor reliability of agreement across all doctors. Pairwise kappas showed fair to moderate agreement between doctors.
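
For context, one standard way to attach a 95% confidence interval to an agreement proportion such as the 79% above is a Wilson score interval; the counts in this sketch are placeholders, not the study's actual numbers:

from statsmodels.stats.proportion import proportion_confint

# Hypothetical counts: 79 agreements out of 100 graded flags
agreements, total = 79, 100
low, high = proportion_confint(agreements, total, alpha=0.05, method="wilson")
print(f"agreement = {agreements / total:.0%}, 95% CI = ({low:.2f}, {high:.2f})")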

Conclusions : This tool successfully flags B-scans that might otherwise have been missed by doctors. When evaluating agreement between graders on individual cases, differences between clinicians' grades were observed, even between retina specialists. The mean agreement between the B-scan of interest algorithm and the graders was considered clinically acceptable. All graders agreed that the tool increases efficiency with an appropriate loading time, can be useful for diagnosis, and can be used by technicians as well as clinicians.
References:
[1] Yu et al. IOVS 2020; 61(9): Abstract PB0085.

This is a 2021 ARVO Annual Meeting abstract.
