April 2010
Volume 51, Issue 13
ARVO Annual Meeting Abstract
An Artificial Neural Network for Classifying Artifacts in ERG Recordings
Author Affiliations & Notes
  • T. Strasser
    Institute for Ophthalmic Research,
    Centre for Ophthalmology, Tuebingen, Germany
  • R. Wilke
    Institute for Ophthalmic Research,
    Centre for Ophthalmology, Tuebingen, Germany
    Biomedical Engineering, University of New South Wales, Sydney, Australia
  • T. Peters
    Institute for Ophthalmic Research,
    Centre for Ophthalmology, Tuebingen, Germany
  • E. Zrenner
    Institute for Ophthalmic Research,
    Centre for Ophthalmology, Tuebingen, Germany
  • Footnotes
    Commercial Relationships  T. Strasser, None; R. Wilke, None; T. Peters, None; E. Zrenner, None.
  • Footnotes
    Support  Tistou und Charlotte Kerstan Foundation - Vision 2000
Investigative Ophthalmology & Visual Science April 2010, Vol.51, 1494.
T. Strasser, R. Wilke, T. Peters, E. Zrenner; An Artificial Neural Network for Classifying Artifacts in ERG Recordings. Invest. Ophthalmol. Vis. Sci. 2010;51(13):1494.

Purpose: The precise interpretation of electrophysiological recordings may suffer from interference by artifacts from various sources. Removing artifacts is not only time consuming, particularly in clinical trials, but also prone to introducing observer bias, even when carried out in a reading center. We therefore propose an automated method for artifact removal based on a decision support system employing an artificial neural network (ANN).

Methods: For training of the ANN, sweeps of flicker ERG (9 Hz, 3 scot. cd/m², 2 kHz sampling frequency, 444 ms) recorded during routine clinical examinations (8 patients, 38 visits) were classified for artifacts by an experienced technician. 6012 individual sweeps obtained from one eye and classified as either contaminated by artifacts (2570) or acceptable (3442) were used. Recordings were downsampled and normalised to reduce the number of input neurons. The ANN was implemented as a feed-forward multilayer perceptron (222/60/100/1 neurons) with a backpropagation learning algorithm and sigmoid activation function, using the Java Neural Network Environment (JOONE). The performance of the ANN was assessed using recordings from the partner eye. Additionally, a Fourier transform was applied to the averaged sweeps, and the significance of the amplitude of the 9 Hz component was evaluated using the method described by Meigen and Bach (Doc Ophthalmol. 2000;98:207-232). The ratio of significant to non-significant responses (significance level 0.1%) was compared between unclassified, manually classified, and ANN-classified sweeps.
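The preprocessing step (downsampling and normalising each sweep to match the 222 input neurons) could be sketched as below. This is a minimal illustration in Python/NumPy, not the original implementation, which used JOONE (Java); the block-averaging scheme, the normalisation to [-1, 1], and the `preprocess` helper are assumptions. A 444 ms sweep sampled at 2 kHz yields 888 samples, which divides evenly into 222 blocks of 4.

```python
import numpy as np

def preprocess(sweep, n_out=222):
    """Downsample a raw ERG sweep by block averaging and normalise it.

    Hypothetical mapping of the abstract's 'downsampled and normalised'
    step onto the ANN's 222 input neurons; the exact scheme used in the
    study is not specified.
    """
    # Block-average downsampling: split the sweep into n_out chunks
    # and replace each chunk by its mean value.
    chunks = np.array_split(np.asarray(sweep, dtype=float), n_out)
    down = np.array([c.mean() for c in chunks])
    # Normalise to [-1, 1] so sigmoid units are driven in a useful range.
    peak = np.max(np.abs(down))
    return down / peak if peak > 0 else down

# Synthetic 9 Hz sweep: 444 ms at 2 kHz gives 888 samples.
t = np.linspace(0, 0.444, 888, endpoint=False)
x = preprocess(np.sin(2 * np.pi * 9 * t))
```

After this step each sweep is a fixed-length vector of 222 values, regardless of the raw sampling, which is what allows a fixed-size input layer.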

Results: Training of the ANN was stopped after 1000 cycles, at which point the root mean squared error (RMSE) of the ANN had converged around 0.28. Sweeps of the partner eye were then presented to the ANN for classification. Compared to manual classification, the ANN yielded 3708 correct, 1154 false-positive, and 1150 false-negative classifications. The ratios of significant 9 Hz responses were 0.19 (unclassified), 0.27 (manually classified), and 0.27 (ANN).
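For reference, the overall agreement with the manual classification follows directly from the reported counts (the abstract itself does not state an accuracy figure):

```python
# Confusion counts reported for the partner-eye test set.
correct, false_pos, false_neg = 3708, 1154, 1150
total = correct + false_pos + false_neg   # 6012 sweeps in total
accuracy = correct / total
print(f"accuracy = {accuracy:.3f}")       # ~0.617 agreement with manual labels
```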

Conclusions: Although the classification results are not yet perfect, the ANN proved a great help for pre-classifying recordings in an automated way and saves considerable time. We will continue to optimize the structure and parameters of the ANN to increase the classification rate. The presented approach can easily be adapted for use with other stimuli, including tests with drafted ISCEV protocols for clinical trials.

Keywords: electroretinography: non-clinical • electroretinography: clinical 
