Avi Caspi, Arup Roy, Varalakshmi Wuyyuru, Paul E. Rosendall, Jason W. Harper, Kapil D. Katyal, Michael P. Barry, Gislin Dagnelie, Robert J. Greenberg; Eye Movement Control in the Argus II Retinal-Prosthesis Enables Reduced Head Movement and Better Localization Precision. Invest. Ophthalmol. Vis. Sci. 2018;59(2):792-802. doi: 10.1167/iovs.17-22377.
Sighted individuals scan a visual scene with combined eye and head movements. In contrast, scanning with the Argus II is done solely by head movement, because eye movements can introduce localization errors. Here, we tested whether a scanning mode that utilizes eye movements increases visual stability and reduces head movement in Argus II users.
Eye positions were measured in real time and used to shift the region of interest (ROI) sent to the implant within the wide field of view (FOV) of the scene camera. Participants could thus use combined eye-head scanning: shifting the camera by moving the head and shifting the ROI within the FOV by moving the eyes. Eight blind individuals implanted with the Argus II retinal prosthesis participated in the study. A white target appeared on a touchscreen monitor, and participants were instructed to report its location by touching the monitor. We compared the spread of the responses, the time to complete the task, and the amount of head movement between combined eye-head and head-only scanning.
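The gaze-contingent cropping described above can be illustrated with a minimal sketch: the measured eye position shifts the ROI that is cropped from the wide-field scene-camera frame, clamped so it never leaves the camera's FOV. All names, dimensions, and the gaze-to-pixel scale here are illustrative assumptions, not the Argus II implementation.

```python
def select_roi(gaze_deg, fov_px=(640, 480), roi_px=(160, 120), px_per_deg=8.0):
    """Return the (left, top) corner of the ROI inside the camera frame.

    gaze_deg: (horizontal, vertical) eye position in degrees, where
              (0, 0) means the eye is looking straight ahead.
    Dimensions and the degrees-to-pixels scale are hypothetical.
    """
    fw, fh = fov_px
    rw, rh = roi_px
    # Center the ROI on the gaze direction, mapped from degrees to pixels.
    cx = fw / 2 + gaze_deg[0] * px_per_deg
    cy = fh / 2 + gaze_deg[1] * px_per_deg
    left = int(round(cx - rw / 2))
    top = int(round(cy - rh / 2))
    # Clamp so the ROI stays fully inside the camera's field of view.
    left = max(0, min(left, fw - rw))
    top = max(0, min(top, fh - rh))
    return left, top
```

With straight-ahead gaze the ROI is centered in the frame (`select_roi((0, 0))` gives `(240, 180)` under these assumed dimensions); a horizontal eye movement shifts the crop toward the frame edge, where clamping takes over.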
All participants benefited from the combined eye-head scanning mode. Better precision (i.e., a narrower spread of the perceived locations) was observed in six of eight participants. Seven of eight participants adopted a scanning strategy that enabled them to perform the task with significantly less head movement.
Integrating an eye tracker into the Argus II is feasible, reduces head movements in a seated localization task, and improves pointing precision.