June 2015
Volume 56, Issue 7
ARVO Annual Meeting Abstract  |   June 2015
Automated Construction of Arterial and Venous Trees on Retinal Images Using Topological and Intensity Information
Author Affiliations & Notes
  • Qiao Hu
    Electrical and Computer Engineering, Univ of Iowa, Iowa City, IA
  • Mona K Garvin
    Electrical and Computer Engineering, Univ of Iowa, Iowa City, IA
    Center for the Prevention and Treatment of Visual Loss, Veterans Administration Hospital, Iowa City, IA
  • Michael David Abramoff
    Electrical and Computer Engineering, Univ of Iowa, Iowa City, IA
    Ophthalmology & Visual Sciences, The University of Iowa Hospitals & Clinics, Iowa City, IA
  • Footnotes
    Commercial Relationships Qiao Hu, No. PCT/US2014/028055 (P); Mona Garvin, No. PCT/US2014/028055 (P); Michael Abramoff, IDx LLC (I), No. PCT/US2014/028055 (P)
  • Footnotes
    Support None
Investigative Ophthalmology & Visual Science, June 2015, Vol. 56, 5264.
Abstract
 
Purpose
 

Automated retinal vessel analysis from fundus images, including estimation of the central retinal arterial and venous equivalents, requires automated labeling of arteries and veins. We previously developed a graph-based framework to separate overlapping arterial-venous (A/V) trees (Hu et al., MICCAI 2013). Here we present an extended version that labels arteries and veins in both overlapping and non-overlapping vessels by incorporating a pixel classification algorithm.

 
Methods
 

An expert annotated the vessel pixels of the public DRIVE dataset (40 images from 40 subjects, equally divided into a training set and a test set) as artery or vein (Fig. 1). Our approach first uses the topology of the vasculature to separate overlapping vessels into A/V trees with a graph-based algorithm. A support-vector-machine (SVM) classifier, trained on the training set using 19 local intensity features, then classifies the remaining independent vessels as artery or vein. The approach is validated on the test set with both manual and automatic vesselness maps as inputs. Evaluation uses the coverage rate (the ratio of classified vessel pixels to all vessel pixels defined in the A/V tree reference standard) and the accuracy (the ratio of correctly classified vessel pixels to all classified vessel pixels).
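The two evaluation metrics defined above can be sketched directly from their definitions. This is a minimal illustration, not the authors' implementation; the function name, the label encoding (0 = background/unclassified, 1 = artery, 2 = vein), and the toy arrays are assumptions for the example.

```python
import numpy as np

def coverage_and_accuracy(reference, predicted, vessel_mask):
    """Coverage rate and accuracy as defined in the abstract.

    reference   : per-pixel A/V labels from the reference standard
                  (0 = background, 1 = artery, 2 = vein)
    predicted   : per-pixel labels from the method (0 = unclassified)
    vessel_mask : boolean mask of vessel pixels in the reference standard
    """
    # Pixels the method assigned an A/V label, restricted to reference vessels
    classified = (predicted != 0) & vessel_mask
    # Coverage rate: classified vessel pixels over all reference vessel pixels
    coverage = classified.sum() / vessel_mask.sum()
    # Accuracy: correctly classified pixels over all classified vessel pixels
    accuracy = (predicted[classified] == reference[classified]).mean()
    return coverage, accuracy

# Toy 1-D "image" (hypothetical values, not study data)
ref = np.array([1, 1, 2, 2, 1, 0])
pred = np.array([1, 0, 2, 1, 1, 0])
cov, acc = coverage_and_accuracy(ref, pred, ref != 0)
# 4 of 5 vessel pixels classified; 3 of those 4 correct
```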

 
Results
 

An example result is shown in Fig. 2. The mean accuracy/coverage rate with 95% confidence interval (CI) was 88.2% (84.3, 92.1) / 88.5% (86.8, 90.2) for the manual segmentation, and 82.5% (77.4, 87.6) / 85.2% (83.7, 86.7) for the automatic segmentation. Using our previously developed approach, the mean accuracy/coverage rate with 95% CI was 89.1% (84.9, 93.3) / 82.0% (80.0, 84.1) for the manual segmentation, and 83.1% (77.6, 88.5) / 78.6% (77.0, 80.2) for the automatic segmentation. Thus the proposed approach improves the coverage rate significantly while maintaining similar accuracy for both inputs.
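A mean with 95% CI of the kind reported above can be computed per image and summarized as follows. This is a sketch under a normal approximation; the abstract does not state the exact CI method used, and the sample accuracies below are hypothetical, not from the study.

```python
import math

def mean_ci95(values):
    """Mean with a normal-approximation 95% confidence interval
    (1.96 * standard error); a sketch, not the study's exact method."""
    n = len(values)
    mean = sum(values) / n
    # Sample standard deviation (n - 1 denominator)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    half = 1.96 * sd / math.sqrt(n)
    return mean, mean - half, mean + half

# Hypothetical per-image accuracies in percent
accs = [86.0, 90.0, 88.0, 89.0, 88.0]
m, lo, hi = mean_ci95(accs)
```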

 
Conclusions
 

Here we present a method to automatically construct the A/V trees in retinal images given a vessel segmentation. Evaluation on a publicly available dataset and comparison with our previous method demonstrate improved performance.

 
Fig. 1 (a) A fundus image from DRIVE. (b) Reference A/V trees (red = artery, blue = vein, green = overlapping, white = uncertain).
 
Fig. 2 (a) Vesselness maps. (b) Results using the prior method. (c) Results using the proposed method.

 