April 2014
Volume 55, Issue 13
ARVO Annual Meeting Abstract  |   April 2014
How could a fixation target be perceptually stabilized in the presence of fixational eye movements?
Author Affiliations & Notes
  • Frank Schaeffel
    Section Neurobiology of the Eye, Ophthalmic Research Institute, Tuebingen, Germany
  • Thomas Euler
    Centre for Integrative Neuroscience, Ophthalmic Research Institute, Tuebingen, Germany
  • Ziad Hafed
    Centre for Integrative Neuroscience, University of Tuebingen, Tuebingen, Germany
  • Footnotes
    Commercial Relationships Frank Schaeffel, None; Thomas Euler, None; Ziad Hafed, None
  • Footnotes
    Support None
Investigative Ophthalmology & Visual Science April 2014, Vol.55, 778. doi:

      Frank Schaeffel, Thomas Euler, Ziad Hafed; How could a fixation target be perceptually stabilized in the presence of fixational eye movements? Invest. Ophthalmol. Vis. Sci. 2014;55(13):778.


Purpose: During fixation of a target, miniature eye movements cause continuous spatial jitter of the retinal image, yet this jitter is not perceived. How the retina or the brain compensates for these movements is not known; one possibility is that jitter of the background is “subtracted” from jitter of the fixated target. This hypothesis was tested.

Methods: Miniature eye movements were recorded in the right eyes of eight young adult subjects (left eye covered), using a custom-built video eye tracker that sampled at 90 Hz with an angular resolution of about 2 arcmin. Subjects were asked to fixate a target on a computer screen (a red cross subtending 1 deg of visual angle) under four stimulation conditions: (1) stationary fixation target and stationary background image; (2) stationary fixation target, but background image compensated for fixational eye movements; (3) both fixation target and background compensated for fixational eye movements; (4) fixation target compensated for fixational eye movements, but background stationary. The background was filtered with a circular low-pass-filtered aperture subtending 58.8 deg of visual angle to remove sharp edges that might provide references for stabilization. Standard deviations of the angular positions of the fixation axis, recorded over 3.5 s, were analyzed.
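The four gaze-contingent conditions can be sketched as follows. This is a hypothetical illustration, not the authors' display code; the 90 Hz sampling and the four conditions come from the Methods, but every function and variable name here is an assumption.

```python
# Hypothetical sketch of the four stimulation conditions (not the authors' code).
# On each eye-tracker sample (90 Hz), the gaze offset from the fixation point
# (in deg) is known; an element "compensated for fixational eye movements" is
# shifted on screen by that offset so it stays at a fixed retinal location.

def frame_offsets(gaze_dx, gaze_dy, condition):
    """Return (target_shift, background_shift) in deg for one frame.

    condition 1: target and background both stationary on the screen
    condition 2: background moves with the eye, target stationary
    condition 3: both target and background move with the eye
    condition 4: target moves with the eye, background stationary
    """
    eye = (gaze_dx, gaze_dy)
    still = (0.0, 0.0)
    if condition == 1:
        return still, still
    if condition == 2:
        return still, eye
    if condition == 3:
        return eye, eye
    if condition == 4:
        return eye, still
    raise ValueError("condition must be 1-4")
```

In this reading, conditions (3) and (4) retinally stabilize the fixation target itself, which is what opens the fixation feedback loop in the Results.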

Results: (1) With both fixation target and background stationary, standard deviations ranged from 6.7 to 15.2 arcmin across subjects (average 12.1 arcmin). (2) When the background moved with the fixational eye movements, standard deviations of the angular fixation positions remained similar, ranging from 6.3 to 17.1 arcmin (average 10.6 arcmin; n.s.). Conditions (3) and (4) caused large drifts in fixation, with no difference between them (119.2 vs. 133.4 arcmin, n.s.), but their standard deviations differed highly significantly from those in conditions (1) and (2).
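The fixation-stability measure can be reproduced from raw samples roughly as follows; a minimal sketch, assuming gaze positions in arcmin sampled at 90 Hz over 3.5 s (about 315 samples per trial) and taking "standard deviation of angular position" as the SD of the radial distance from the mean gaze position. The function name and this exact definition are assumptions, not taken from the abstract.

```python
import math

def fixation_sd(x_arcmin, y_arcmin):
    """SD of angular gaze position over one trial (one plausible definition).

    x_arcmin, y_arcmin: equal-length sequences of horizontal and vertical
    gaze positions in arcmin (e.g. ~315 samples for 3.5 s at 90 Hz).
    Returns the population SD of each sample's radial distance from the
    mean fixation position.
    """
    n = len(x_arcmin)
    mx = sum(x_arcmin) / n
    my = sum(y_arcmin) / n
    # radial deviation of each sample from the mean fixation position
    r = [math.hypot(x - mx, y - my) for x, y in zip(x_arcmin, y_arcmin)]
    mr = sum(r) / n
    return math.sqrt(sum((ri - mr) ** 2 for ri in r) / n)
```

Per-axis SDs (computed separately for x and y) would be an equally plausible reading of the measure.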

Conclusions: The observed “diameters of the distributions of gaze during fixation” were similar to those reported by Cherici et al. (J Vision 2012) when the fixation target was stationary (average vector length 18 vs. 20 arcmin). Moving the fixation target with the eye, either alone or together with the background, opened the feedback loop for stable fixation and caused large drifts. Since conditions (1) and (2) did not differ, it is unlikely that jitter of the background is subtracted from jitter of the fixation target to stabilize the image.

Keywords: 522 eye movements • 713 shape, form, contour, object perception • 719 scene perception  
