April 2009
Volume 50, Issue 13
ARVO Annual Meeting Abstract  |   April 2009
Cluster Computation of Frequency Domain Optical Coherence Tomography and Retinal Segmentation for Measurement of Intrinsic Optic Signal
Author Affiliations & Notes
  • A. R. Tumlinson
    School of Optometry and Vision Sciences, Cardiff University, Cardiff, United Kingdom
  • B. Hermann
    School of Optometry and Vision Sciences, Cardiff University, Cardiff, United Kingdom
  • B. Hofer
    School of Optometry and Vision Sciences, Cardiff University, Cardiff, United Kingdom
  • J. Osborne
    School of Optometry and Vision Sciences, Cardiff University, Cardiff, United Kingdom
  • B. Povazay
    School of Optometry and Vision Sciences, Cardiff University, Cardiff, United Kingdom
  • W. Drexler
    School of Optometry and Vision Sciences, Cardiff University, Cardiff, United Kingdom
  • Footnotes
    Commercial Relationships  A.R. Tumlinson, Zeiss Meditec, C; B. Hermann, None; B. Hofer, None; J. Osborne, None; B. Povazay, None; W. Drexler, Zeiss Meditec, C.
  • Footnotes
    Support  public (Cardiff University, FP6-IST-NMP-2 NANOUB (017128), Action Medical Research (AP1110), DTI (1544C), FP7 FunOCT) and commercial sources (FEMTOLASERS GmbH, Carl Zeiss Meditec Inc., Multiwave)
Investigative Ophthalmology & Visual Science April 2009, Vol.50, 1387.

      A. R. Tumlinson, B. Hermann, B. Hofer, J. Osborne, B. Povazay, W. Drexler; Cluster Computation of Frequency Domain Optical Coherence Tomography and Retinal Segmentation for Measurement of Intrinsic Optic Signal. Invest. Ophthalmol. Vis. Sci. 2009;50(13):1387.



      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Purpose: Experiments to measure intrinsic optical signals with optical coherence tomography generate large volumes of data that require many hours of post-processing. Developing post-processing algorithms demands a flexible and comfortable programming interface. We retain our existing Matlab programming environment and port the computation to a cluster of under-utilized office computers.

Methods: To measure slow changes during dark adaptation, 40 high-density (512 × 256 points) volume tomograms were recorded before a bleaching stimulus, and 70 similar tomograms were recorded during the following 30-minute recovery period. Each test generates ~30 GB of spectral data that must be translated into structural information in a post-processing step. Segmentation of intraretinal layers, which isolates stimulus-induced changes to a particular tissue type, takes a similar amount of time. Processing each volume requires about 30 minutes on a typical high-end desktop computer, and serially processing the entire dataset requires nearly 24 hours. Two methods of exploiting existing desktop computing power to process multiple volumes in parallel were examined. The first system used only Microsoft file sharing to distribute and collect the workload on a small office network: compiled Matlab code was distributed to the workers so that each processor on each computer independently processed a file generated by the acquisition computer. A total of 10 processors on one quad-core and three dual-core machines were used. The second system used Matlab code to interface with and monitor a Condor cluster job-management system, which connected 45 worker computers to job profiles created by the local Matlab script.
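
The first system's coordination strategy, in which each worker independently claims files dropped on a network share, can be sketched in a few lines. This is an illustrative Python sketch only, not the authors' compiled Matlab code: the folder names, the file extension, and the placeholder processing step are all hypothetical, but the claim-by-atomic-rename idea is the standard way to make such a scheme safe without a central job manager.

```python
from pathlib import Path

# Hypothetical shared-folder layout: the acquisition computer drops raw
# spectral volumes into QUEUE; a worker claims a file by atomically renaming
# it into CLAIMED, so no two workers ever process the same volume.
QUEUE = Path("shared/queue")
CLAIMED = Path("shared/claimed")
DONE = Path("shared/done")

def process_volume(path: Path) -> bytes:
    # Placeholder for the real work (FD-OCT reconstruction + segmentation).
    return path.read_bytes()[::-1]

def worker(worker_id: str) -> int:
    """Claim and process every unclaimed volume; return how many were done."""
    processed = 0
    for raw in sorted(QUEUE.glob("*.spec")):
        claim = CLAIMED / f"{raw.stem}.{worker_id}"
        try:
            raw.rename(claim)   # atomic on one filesystem: only one worker wins
        except OSError:
            continue            # another worker claimed this file first
        result = process_volume(claim)
        (DONE / f"{raw.stem}.out").write_bytes(result)
        claim.unlink()
        processed += 1
    return processed
```

Because the rename either succeeds for exactly one worker or raises, the scheme needs no locks or server process, which matches the abstract's observation that the homemade system carries very little overhead.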

Results: Calculation time for the small network system was reduced by a factor of ~10, while the Condor system reduced computation time by a factor of ~25. The small, homemade management system had the advantage of lower overhead on the workers, but was less robust.
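
The quoted factors can be turned into wall-clock times and parallel efficiencies with simple arithmetic. The serial time (~24 hours) and speedups (~10× on 10 processors, ~25× on 45 workers) come from the abstract; the efficiency metric (speedup divided by worker count) is a standard measure added here for illustration, not something the authors report.

```python
# Numbers quoted in the abstract.
serial_hours = 24.0

def parallel_hours(speedup: float) -> float:
    # Wall-clock time after applying the measured speedup factor.
    return serial_hours / speedup

def efficiency(speedup: float, workers: int) -> float:
    # Fraction of ideal linear scaling achieved (1.0 = perfect).
    return speedup / workers
```

On these figures the 10-processor file-share system runs at essentially perfect efficiency (10/10), while the 45-worker Condor system achieves about 56% (25/45), consistent with the extra overhead of a general job manager.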

Conclusions: Cluster computing provides a low-cost option with dramatic impact. Ideally, the time to compute the entire dataset shrinks to little more than the time required to compute a single volume; in our experience the improvement reduced compute times from days to hours, rendering previously daunting tasks manageable. Matlab may be slower than some lower-level development environments, but it provides a platform of understandable code that members of the entire lab can contribute to. Compiling that code and distributing it to worker computers in a cluster topology is a very low-cost way to implement parallel computing.

Keywords: image processing • imaging methods (CT, FA, ICG, MRI, OCT, RTA, SLO, ultrasound) • retina 