ARVO Annual Meeting Abstract  |   July 2018
Volume 59, Issue 9  |   Open Access
A Digital Prosthetic Eye Featuring Deep Neural Network Pupil Tracking
Author Affiliations & Notes
  • Emily Sarah Charlson
    Gavin Herbert Eye Institute, University of California Irvine, Costa Mesa, California, United States
  • Zonglin Guo
    Donald Bren School of Information and Computer Sciences, University of California Irvine, Irvine, California, United States
  • Ian Harris
    Donald Bren School of Information and Computer Sciences, University of California Irvine, Irvine, California, United States
  • Jeremiah Tao
    Gavin Herbert Eye Institute, University of California Irvine, Costa Mesa, California, United States
  • Footnotes
    Commercial Relationships   Emily Charlson, None; Zonglin Guo, None; Ian Harris, None; Jeremiah Tao, None
  • Footnotes
    Support  UC Irvine Institute for Clinical and Translational Science (ICTS) is funded by the National Institutes of Health (NIH) under the Clinical and Translational Sciences Award (CTSA) program
Investigative Ophthalmology & Visual Science July 2018, Vol.59, 4132. doi:
Abstract

Purpose : Current ocular prosthetics incorporate photographic technology to render an image of a patient’s healthy eye onto an acrylic or glass shell. In forward gaze, ocular prostheses replicate eye appearance well. On side gaze, however, the prosthesis remains largely fixed while the healthy eye rotates to a new position, creating an appearance of strabismus and often contributing to low self-esteem among patients. Here, we hypothesize that digital microscreen technology coupled with novel deep neural network pupil tracking algorithms can create a dynamic, more realistic ocular prosthesis.

Methods : A microcamera embedded within glasses tracks the healthy eye, and a deep neural network-based pupil tracking algorithm identifies pupil location. Three separate algorithms extract features of pupil position: a blob feature to locate the dark pupil and light reflex, an edge feature to fit an ellipse to the pupil edge, and a motion feature to refine pupil location using data from two adjacent frames. Each algorithm is run, generating three candidate pupil locations, which are then presented to a convolutional neural network. The network evaluates which candidate has the best quality and uses it as the final output. This information is wirelessly transmitted to a corresponding receiver housed within the digital prosthesis for display on a miniature organic light-emitting diode (OLED) screen.
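The candidate-fusion step described above can be sketched as follows. This is a minimal illustrative sketch only: all function names are hypothetical stand-ins, the three detectors and the CNN scorer are stubbed with toy logic, and the abstract does not publish the actual implementation.

```python
# Hypothetical sketch of the three-detector pupil-tracking pipeline:
# three candidate (x, y) pupil locations are generated, then a quality
# scorer (a trained CNN in the real system) selects the best one.

def blob_candidate(frame):
    """Stand-in for the dark-pupil/light-reflex blob detector."""
    return (42, 57)  # toy (x, y) pixel coordinates

def edge_candidate(frame):
    """Stand-in for the pupil-edge detector with ellipse fitting."""
    return (43, 56)

def motion_candidate(prev_frame, frame):
    """Stand-in for the motion-based refinement over two adjacent frames."""
    return (42, 56)

def cnn_quality(frame, candidate):
    """Stand-in for the convolutional network that scores candidate quality.
    Here a toy distance heuristic; the real system uses a trained CNN."""
    x, y = candidate
    return -abs(x - 42) - abs(y - 56)

def track_pupil(prev_frame, frame):
    """Run all three detectors and keep the highest-scoring candidate."""
    candidates = [
        blob_candidate(frame),
        edge_candidate(frame),
        motion_candidate(prev_frame, frame),
    ]
    return max(candidates, key=lambda c: cnn_quality(frame, c))
```

In this sketch, `track_pupil` returns the candidate the scorer ranks highest; that final location is what would be transmitted to the prosthesis for display.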

Results : Our system captured and wirelessly transmitted healthy eye movements to a miniature digital prosthetic. The deep neural network pupil tracking algorithm identified pupil location with a 4.87% increase in accuracy at a 5-pixel error tolerance and a 6.21% increase at a 10-pixel error tolerance over current state-of-the-art approaches when tested against a database of pupil images. The network outperformed all others even in sub-ideal conditions, including poor pupil illumination, extreme eye rotation, and eyelids or lashes partially covering the pupil. A corresponding digital image of an eye displayed on a mini OLED screen moved in sync with the healthy eye, mimicking conjugate eye movements. The digital prosthetic eye (DPE) was created within size and weight restrictions compatible with a wearable device.

Conclusions : The use of microcameras, novel pupil detection algorithms, and a mini OLED screen allowed for the creation of the first digital prosthetic eye feasible for use in unilateral anophthalmia. Further miniaturization of prosthetic size and refinement of shape are in development.

This is an abstract that was submitted for the 2018 ARVO Annual Meeting, held in Honolulu, Hawaii, April 29 - May 3, 2018.
