Emily Sarah Charlson, Zonglin Guo, Ian Harris, Jeremiah Tao; A Digital Prosthetic Eye Featuring Deep Neural Network Pupil Tracking. Invest. Ophthalmol. Vis. Sci. 2018;59(9):4132.
Current ocular prostheses incorporate photographic technology to render an image of a patient's healthy eye onto an acrylic or glass shell. In primary gaze, such prostheses replicate eye appearance well. In side gaze, however, the prosthesis remains largely fixed while the healthy eye rotates to a new position, producing an appearance of strabismus and often low self-esteem among patients. Here, we hypothesize that digital microscreen technology coupled with novel deep neural network pupil-tracking algorithms can create a dynamic, more realistic ocular prosthesis.
A microcamera embedded within glasses tracks the healthy eye, and a deep neural network-based pupil-tracking algorithm identifies pupil location. Three separate algorithms each extract one feature of pupil position: a blob feature locating the dark pupil/light reflex, an edge feature fitting an ellipse to the pupil edge, and a motion feature refining pupil location from two adjacent frames. Each algorithm is run, generating three candidate pupil locations, which are then presented to a convolutional neural network. The network evaluates which candidate has the best quality and uses it as the final output. This location is wirelessly transmitted to a corresponding receiver housed within the digital prosthesis for display on a miniature organic light-emitting diode (OLED) screen.
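The candidate-generation and selection pipeline described above can be sketched as follows. This is a minimal illustration only: the three detectors and the quality scorer below are simplified stand-ins (intensity heuristics rather than the authors' trained models, and a local-darkness score in place of the convolutional neural network), but the control flow matches the abstract: three feature-based candidates, one quality evaluation, one winner.

```python
import numpy as np

def blob_candidate(frame):
    # Stand-in blob feature: darkest pixel as a crude dark-pupil estimate.
    return np.unravel_index(np.argmin(frame), frame.shape)

def edge_candidate(frame):
    # Stand-in edge feature: centroid of below-mean pixels
    # (a placeholder for edge detection with ellipse fitting).
    ys, xs = np.where(frame < frame.mean())
    return (int(ys.mean()), int(xs.mean()))

def motion_candidate(prev_frame, frame):
    # Stand-in motion feature: centroid of the strongest
    # inter-frame intensity changes between two adjacent frames.
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    ys, xs = np.where(diff >= np.percentile(diff, 99))
    return (int(ys.mean()), int(xs.mean()))

def quality_score(frame, candidate):
    # Stand-in for the CNN quality evaluation: a darker local
    # patch around the candidate scores higher.
    y, x = candidate
    patch = frame[max(0, y - 2):y + 3, max(0, x - 2):x + 3]
    return -float(patch.mean())

def select_pupil(prev_frame, frame):
    # Run all three detectors, then keep the highest-quality candidate.
    candidates = [
        blob_candidate(frame),
        edge_candidate(frame),
        motion_candidate(prev_frame, frame),
    ]
    return max(candidates, key=lambda c: quality_score(frame, c))
```

In the real system, the selected location is the value transmitted to the prosthesis each frame; the selection step is cheap because only three candidates are ever scored.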
Our system captured and wirelessly transmitted healthy eye movements to a miniature digital prosthesis. Tested against a database of pupil images, the deep neural network pupil-tracking algorithm identified pupil location with a 4.87% increase in accuracy at a 5-pixel error threshold and a 6.21% increase at a 10-pixel error threshold over current state-of-the-art approaches. The network outperformed all others even in sub-ideal conditions, including poor pupil illumination, extreme eye rotation, and eyelid/lash occlusion of the pupil. A corresponding digital image of an eye was displayed on a mini OLED screen and moved in sync with the healthy eye, mimicking conjugate eye movements. The digital prosthetic eye (DPE) was created within size and weight restrictions compatible with a wearable device.
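The pixel-error accuracy figures above are conventionally computed as the fraction of test frames whose predicted pupil centre lies within the given distance of the ground-truth centre. A minimal sketch of that metric (an assumed evaluation convention, not code from the paper):

```python
import numpy as np

def accuracy_within(pred, truth, max_error_px):
    # Fraction of predictions whose Euclidean distance to the
    # ground-truth pupil centre is at most max_error_px
    # (e.g. "accuracy at 5-pixel error").
    pred = np.asarray(pred, dtype=float)
    truth = np.asarray(truth, dtype=float)
    dist = np.linalg.norm(pred - truth, axis=1)
    return float(np.mean(dist <= max_error_px))
```

Reporting the metric at both 5 and 10 pixels, as the abstract does, shows how tolerance affects the comparison: a looser threshold counts near-misses as hits, so rankings between methods can shift with it.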
The use of microcameras, novel pupil-detection algorithms, and a mini OLED screen allowed the creation of the first digital prosthetic eye feasible for use in unilateral anophthalmia. Further miniaturization of prosthesis size and refinement of shape are in development.
This is an abstract that was submitted for the 2018 ARVO Annual Meeting, held in Honolulu, Hawaii, April 29 - May 3, 2018.