ARVO Annual Meeting Abstract  |   June 2020
The use of a handheld marker to calibrate a head-mounted eye tracker for visual prostheses
Author Affiliations & Notes
  • Avi Caspi
    Jerusalem College of Technology, Jerusalem, Israel
    Second Sight Medical Products, Inc., Sylmar, California, United States
  • Arup Roy
    Second Sight Medical Products, Inc., Sylmar, California, United States
  • Michael P Barry
    Second Sight Medical Products, Inc., Sylmar, California, United States
  • Roksana Sadeghi
    Wilmer Eye Institute, Maryland, United States
  • Arathy Kartha
    Wilmer Eye Institute, Maryland, United States
  • Gislin Dagnelie
    Wilmer Eye Institute, Maryland, United States
  • Footnotes
    Commercial Relationships   Avi Caspi, Second Sight Medical Products (C), Second Sight Medical Products (P); Arup Roy, Second Sight Medical Products (E), Second Sight Medical Products (P); Michael Barry, Second Sight Medical Products (E); Roksana Sadeghi, None; Arathy Kartha, None; Gislin Dagnelie, Second Sight Medical Products (C), Second Sight Medical Products (P)
  • Footnotes
    Support  NA
Investigative Ophthalmology & Visual Science, June 2020, Vol. 61, 926.
      Avi Caspi, Arup Roy, Michael P Barry, Roksana Sadeghi, Arathy Kartha, Gislin Dagnelie; The use of a handheld marker to calibrate a head-mounted eye tracker for visual prostheses. Invest. Ophthalmol. Vis. Sci. 2020;61(7):926.

Abstract

Purpose : The Argus II retinal prosthesis is an approved treatment to restore sight and has been implanted in blind patients worldwide. Previously, we showed that a scanning mode based on the patient's eye movements improves pointing accuracy and reduces the need for head movements. Here we tested the feasibility of calibrating the eye tracker using pupil position and the percept location reported by the subject with a handheld marker.

Methods : Pupil positions were extracted using custom image processing in a field-programmable gate array (FPGA) built into a glasses-mounted eye tracker. In the calibration process, electrodes were stimulated directly and the subject (implant recipient) reported the location of the percept using a handheld marker. The glasses-mounted scene camera captured the location of the handheld marker, i.e., the percept's location. Subjects shifted their gaze in different directions to record the perceived location at various eye positions. Linear regression was used to extract the transfer function from pupil position to gaze in the coordinates of the scene camera. The transfer function was then used to shift, in real time, the region of interest (ROI) sent to the implant within the wide field of view (FOV) of the scene camera. Shifting the scene camera by moving the head and shifting the ROI within the camera FOV by moving the eyes enabled combined eye-head scanning. Three Argus II implantees participated in the study.
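
As a minimal sketch of the calibration and ROI-shift steps described above (not the Argus II / FPGA implementation; the NumPy routines, data values, and function names are illustrative assumptions), the transfer function can be fit by least squares as an affine map from pupil coordinates to scene-camera gaze coordinates and then used to position the ROI within the camera frame:

    import numpy as np

    # Hypothetical calibration samples: each row pairs a pupil position
    # (eye-camera pixels) with the handheld-marker location reported for the
    # same stimulated electrodes (scene-camera pixels). Values are made up.
    pupil_xy  = np.array([[312, 240], [355, 238], [298, 270], [330, 205], [344, 262]], float)
    marker_xy = np.array([[412, 310], [505, 305], [380, 378], [450, 232], [482, 360]], float)

    def fit_pupil_to_gaze(pupil_xy, marker_xy):
        """Least-squares fit of an affine transfer function gaze = A @ pupil + b."""
        design = np.hstack([pupil_xy, np.ones((len(pupil_xy), 1))])   # (n, 3) design matrix
        coeffs, *_ = np.linalg.lstsq(design, marker_xy, rcond=None)   # (3, 2) solution
        return coeffs.T                                               # (2, 3) = [A | b]

    def pupil_to_gaze(transfer, pupil):
        """Map one pupil sample to gaze coordinates in the scene camera."""
        return transfer @ np.array([pupil[0], pupil[1], 1.0])

    def roi_origin(gaze, roi_size, frame_size):
        """Center the ROI on the gaze point, clamped to the scene-camera frame,
        so that eye movements shift the ROI within the camera FOV."""
        origin = np.asarray(gaze) - np.asarray(roi_size) / 2.0
        return np.clip(origin, 0, np.asarray(frame_size) - roi_size).astype(int)

    transfer = fit_pupil_to_gaze(pupil_xy, marker_xy)
    gaze = pupil_to_gaze(transfer, (325, 250))
    print("gaze estimate (scene-camera px):", gaze)
    print("ROI origin (px):", roi_origin(gaze, roi_size=(160, 160), frame_size=(640, 480)))

In a deployed system this fit would be recomputed per subject during calibration and the ROI shift applied to every video frame; the clamping and frame dimensions here stand in for whatever the real scene-camera geometry requires.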

Results : In the combined eye-head scanning mode, all subjects demonstrated better precision (p < 0.01) on a localization task and a reduction in head movements. This mode was based on the linear transformation from pupil location to gaze coordinates extracted from the position of the handheld marker.

Conclusions : Pupil position was used to enable an eye-head scanning mode that improved functionality of the Argus II system. This methodology will be used to implement efficient calibration in a mobile head-mounted eye tracker for retinal and cortical visual prostheses.

This is a 2020 ARVO Annual Meeting abstract.

 

Figure 1. Percept location captured by the scene camera as a function of pupil position while stimulating the same retinal electrodes. Linear regression was used to transform pupil location to gaze, i.e., to calibrate the eye tracker.

 

Figure 2. Pointing location during the localization task for a target on a touch monitor. Left: eye-head scanning mode based on the linear transformation from pupil locations to gaze coordinates. Right: head-only scanning mode.
