ARVO Annual Meeting Abstract | June 2022
Volume 63, Issue 7 | Open Access
Difference between fixation target and fixation point during convergence and divergence eye movements recorded using eye tracking
Author Affiliations & Notes
  • Shunya Tatara
Department of Orthoptics and Visual Sciences, Faculty of Medical Technology, Niigata University of Health and Welfare, Japan
    Department of Vision Science, Faculty of Sensory and Motor Control, Kitasato University Graduate School of Medical Science, Japan
  • Tomoya Handa
    Department of Vision Science, Faculty of Sensory and Motor Control, Kitasato University Graduate School of Medical Science, Japan
    Department of Rehabilitation, Orthoptics and Visual Science Course, School of Allied Health Sciences, Kitasato University, Japan
  • Fumiatsu Maeda
Department of Orthoptics and Visual Sciences, Faculty of Medical Technology, Niigata University of Health and Welfare, Japan
  • Footnotes
    Commercial Relationships   Shunya Tatara, nac Image Technology, Code C (Consultant/Contractor); Tomoya Handa, nac Image Technology, Code C (Consultant/Contractor); Fumiatsu Maeda, None
    Support  JSPS KAKENHI
Investigative Ophthalmology & Visual Science June 2022, Vol.63, 2778 – A0313.

Shunya Tatara, Tomoya Handa, Fumiatsu Maeda; Difference between fixation target and fixation point during convergence and divergence eye movements recorded using eye tracking. Invest. Ophthalmol. Vis. Sci. 2022;63(7):2778 – A0313.

© ARVO (1962-2015); The Authors (2016-present)

Abstract

Purpose: When convergence and divergence eye movements are combined with saccadic eye movements, the movements of the two eyes become asymmetric. The deviation of the fixation point from the fixation target during convergence and divergence eye movements can be quantified using eye tracking. In this study, we aimed to investigate how the difference between the fixation target and the fixation point changes with the movement speed of the fixation target.

Methods: We recruited nine university students without abnormalities as subjects. Measurements were performed with a novel eye-tracking device (nac Image Technology, Tokyo, Japan). The subjects looked into the device with both eyes while the fixation target was presented so that it appeared to move closer and farther away (simulated movement in depth). The subjects repeated convergence and divergence eye movements for 20 s by following the fixation target, which moved between viewing distances of 0.3 m and 1 m. The sampling rate of the device was 30 Hz, and the movement speeds of the fixation target were 2, 4, 6, 8, and 10 deg/s. We measured the fixation point and calculated its difference from the fixation target. The differences between the fixation point and the fixation target at the five movement speeds were compared using the Friedman test; P < 0.05 was considered statistically significant.
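
To illustrate the analysis described above: a minimal Python sketch of how per-subject fixation-point errors could be summarized (median [25th–75th percentiles]) and compared across the five target speeds with the Friedman test. The synthetic data, variable names, and effect sizes are hypothetical assumptions for illustration, not the authors' recordings or analysis pipeline.

    # Minimal sketch with synthetic (hypothetical) data: summarize the
    # fixation point-target difference per speed and run a Friedman test.
    import numpy as np
    from scipy.stats import friedmanchisquare

    rng = np.random.default_rng(0)
    speeds = [2, 4, 6, 8, 10]   # target movement speeds, deg/s
    n_subjects = 9              # nine subjects, as in the study

    # errors[i][j]: difference (mm) between fixation point and fixation
    # target for subject j at speeds[i]; synthetic values only.
    errors = [rng.normal(100 + 10 * s, 15, n_subjects) for s in speeds]

    # Per-speed summary: median and 25th-75th percentiles, as reported.
    for s, e in zip(speeds, errors):
        q25, med, q75 = np.percentile(e, [25, 50, 75])
        print(f"{s:2d} deg/s: {med:.1f} [{q25:.1f}-{q75:.1f}] mm")

    # Friedman test: repeated-measures comparison across the five speeds.
    stat, p = friedmanchisquare(*errors)
    print(f"Friedman statistic = {stat:.2f}, P = {p:.4g}")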

Results: The differences between the fixation point and the fixation target (median [25th–75th percentiles]) were 119.2 [70.6–191.7], 135.8 [101.2–147.6], 147.2 [135.1–173.7], 179.6 [175.4–209.0], and 203.3 [198.7–216.8] mm at 2, 4, 6, 8, and 10 deg/s, respectively. The Friedman test showed a significant effect of movement speed, with the difference between the fixation point and the fixation target increasing as the movement speed of the fixation target increased (P < 0.001).

Conclusions: As the movement speed of the fixation target increased, the difference between the fixation point and the fixation target during convergence and divergence eye movements increased, suggesting that the asymmetry between the two eyes also increased.

This abstract was presented at the 2022 ARVO Annual Meeting, held in Denver, CO, May 1-4, 2022, and virtually.
