September 2016
Volume 57, Issue 12
Open Access
ARVO Annual Meeting Abstract
Smart Specs: Electronic vision enhancement in real-life scenarios
Author Affiliations & Notes
  • Joram Jacob Van Rheede
    Clinical Neurosciences, University of Oxford, Oxford, OXON, United Kingdom
  • Iain Robert Wilson
    Clinical Neurosciences, University of Oxford, Oxford, OXON, United Kingdom
  • Lori Di Bon-Conyers
    Royal National Institute for Blind People, London, United Kingdom
  • Sabine Croxford
    Royal National Institute for Blind People, London, United Kingdom
  • Robert E MacLaren
    Nuffield Laboratory of Ophthalmology, University of Oxford, Oxford, United Kingdom
  • Stephen Lloyd Hicks
    Clinical Neurosciences, University of Oxford, Oxford, OXON, United Kingdom
  • Footnotes
    Commercial Relationships   Joram Van Rheede, None; Iain Wilson, None; Lori Di Bon-Conyers, None; Sabine Croxford, None; Robert MacLaren, None; Stephen Hicks, None
  • Footnotes
    Support  Google Impact Challenge UK 2014
Investigative Ophthalmology & Visual Science September 2016, Vol.57, 5165. doi:

      Joram Jacob Van Rheede, Iain Robert Wilson, Lori Di Bon-Conyers, Sabine Croxford, Robert E MacLaren, Stephen Lloyd Hicks; Smart Specs: Electronic vision enhancement in real-life scenarios. Invest. Ophthalmol. Vis. Sci. 2016;57(12):5165.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Purpose : With the advent of portable computational power, real-time processing of the visual image to provide sight enhancement for visually impaired people has become feasible. We have developed a set of electronic glasses with displays and cameras for this purpose, and here report on a study assessing their utility in a set of realistic scenarios (first stage) as well as over a month of everyday use (second stage). We also sought to investigate whether this approach was more suitable for certain sight conditions than others.

Methods : In collaboration with the Royal National Institute for Blind People (RNIB), over 300 participants were recruited after an initial screening process that excluded those with no light perception and those whose vision exceeded our threshold; 105 participants have now completed the first stage of testing. Participants' sight condition, visual acuity and visual field characteristics were recorded, and observers scored participants as they completed 1) an object recognition and manipulation task and 2) a navigation task, both with and without the electronic glasses. Participants could use 5 different modes of enhancement, ranging from a distance-to-brightness mapping to a contrast boost.
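The abstract names only two of the five enhancement modes and does not describe the device's actual processing pipeline. As an illustrative sketch only (the function names, the 4 m depth cut-off and the contrast gain of 2 are hypothetical assumptions, not parameters from the study), a distance-to-brightness mapping and a simple contrast boost could be implemented along these lines:

```python
import numpy as np

def depth_to_brightness(depth_m, max_depth_m=4.0):
    """Map nearer objects to brighter pixels: brightness falls off linearly
    with distance, and anything beyond max_depth_m renders as black.
    max_depth_m is an assumed cut-off, not a value from the study."""
    d = np.clip(depth_m, 0.0, max_depth_m)
    return ((1.0 - d / max_depth_m) * 255).astype(np.uint8)

def contrast_boost(gray, gain=2.0):
    """Stretch 8-bit intensities about the mid-grey point (128) by an
    assumed gain factor, then clip back into the displayable range."""
    boosted = (gray.astype(np.float32) - 128.0) * gain + 128.0
    return np.clip(boosted, 0, 255).astype(np.uint8)
```

In a wearable of this kind, a mapping like the first would typically consume a per-pixel depth image from a stereo or depth camera, while the second operates on an ordinary greyscale camera frame.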

Results : Of the 105 participants who have completed stage 1 of the study to date, 27 were judged by observers to benefit from the glasses, and 43 self-reported finding them useful. The glasses were particularly beneficial for participants with retinitis pigmentosa (RP; 14/29), with some success for people with cataracts (5/16). Participants with glaucoma (2/16) or macular degeneration (1/16) were unlikely to benefit from the glasses in their current form. Spared central vision in RP may underlie the promising results in this patient group, as participants with predominantly central (‘tunnel’) vision were in general the most likely to benefit (15/32). Over 30 participants were deemed suitable to take the glasses home for stage 2 of the study.

Conclusions : We found that our real-time electronic image enhancement has the potential to benefit a significant number of people with sight loss in real-life scenarios, in particular those with spared central vision. This work will be extended to include more people with different visual conditions, to investigate the utility of the glasses over a month of home use, and to characterise which enhancement modes are most useful for different tasks.

This is an abstract that was submitted for the 2016 ARVO Annual Meeting, held in Seattle, Wash., May 1-5, 2016.
