June 2015
Volume 56, Issue 7
ARVO Annual Meeting Abstract  |   June 2015
A Vibrotactile Sensory Substitution Guided Feedback System For Object Localization
Author Affiliations & Notes
  • Nii Tete Mante
    Biomedical Engineering, University of Southern California, Doheny Eye Inst, Los Angeles, CA
    Computer Science, University of Southern California, Los Angeles, CA
  • Gerard Medioni
    Computer Science, University of Southern California, Los Angeles, CA
  • Armand Tanguay
    Biomedical Engineering, University of Southern California, Doheny Eye Inst, Los Angeles, CA
    Electrical Engineering, University of Southern California, Los Angeles, CA
  • James D Weiland
    Biomedical Engineering, University of Southern California, Doheny Eye Inst, Los Angeles, CA
    Ophthalmology, University of Southern California, Los Angeles, CA
  • Footnotes
    Commercial Relationships Nii Tete Mante, None; Gerard Medioni, None; Armand Tanguay, University of Southern California (P); James Weiland, Second Sight Medical Products, Inc. (F)
  • Footnotes
    Support None
Investigative Ophthalmology & Visual Science June 2015, Vol.56, 4781. doi:
Nii Tete Mante, Gerard Medioni, Armand Tanguay, James D Weiland; A Vibrotactile Sensory Substitution Guided Feedback System For Object Localization. Invest. Ophthalmol. Vis. Sci. 2015;56(7):4781.



      © ARVO (1962-2015); The Authors (2016-present)

Abstract
 
Purpose
 

The purpose of this study was to test our 'Object Localization and Tracking System' (OLTS), which assists visually impaired people with reaching for and grasping objects. The main goal was to explore how varying the visual angle of a computer-generated feedback mechanism alters the accuracy of reaching and grasping in object localization tasks.

 
Methods
 

The OLTS utilized a wide-angle (~100 degree) monocular camera (Tanguay, Sahin), a central processing unit (CPU), and cranially positioned vibration motors. Computer vision algorithms (Context Tracker, Dinh & Medioni) running on the CPU parsed and processed the camera input to determine object position. The vibration motors produced haptic feedback "codes" for the blind test subject based on object position: objects in the periphery of the camera's field of view caused the vibration motors to fire, while objects within the "central region" of the field of view caused the computer to stop vibration. Once the object was centralized within the camera's field of view, test subjects were asked to reach out and touch it. Eight blind test subjects evaluated the device. Five different "central region" visual angles of the feedback algorithm were used (7.8, 15.6, 23.4, 31.2, and 39 degrees); 10 experiments were conducted per angle, each consisting of localizing and grasping an object.
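The feedback rule described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the pinhole camera model, the image width, the function names, and the left/right motor mapping are all assumptions; only the ~100 degree field of view and the 15.6 degree central-region angle come from the abstract.

```python
import math

# Assumed camera parameters (illustrative; only FOV_DEG ~100 degrees
# and the 15.6 degree central-region angle come from the abstract).
FOV_DEG = 100.0            # assumed horizontal field of view of the camera
IMAGE_WIDTH = 640          # assumed image width in pixels
CENTRAL_REGION_DEG = 15.6  # one of the five tested central-region angles

def pixel_to_angle(x_px: float) -> float:
    """Map a horizontal pixel coordinate to a visual angle (degrees)
    relative to the optical axis, assuming a pinhole camera model."""
    # Focal length in pixels derived from the assumed field of view.
    f_px = (IMAGE_WIDTH / 2) / math.tan(math.radians(FOV_DEG / 2))
    return math.degrees(math.atan((x_px - IMAGE_WIDTH / 2) / f_px))

def feedback(x_px: float) -> str:
    """Return the haptic cue for a tracked object at pixel x_px:
    vibrate toward the object while it lies outside the central
    region; stop vibrating once it is centered."""
    angle = pixel_to_angle(x_px)
    if abs(angle) <= CENTRAL_REGION_DEG / 2:
        return "stop"  # object centered: cue the subject to reach out
    return "left" if angle < 0 else "right"

print(feedback(320))  # object at image center -> "stop"
print(feedback(620))  # object far right -> "right"
```

Varying `CENTRAL_REGION_DEG` across the five tested values changes how precisely the subject must center the object before the vibration stops, which is the variable the experiments manipulated.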

 
Results
 

The metrics gathered were the average time to first grasp, average time to final grasp, and average number of reaches for each of the five "central region" visual angles. The average times to first grasp for angles of 7.8, 15.6, 23.4, 31.2, and 39 degrees were 15.87, 11.89, 12.70, 11.94, and 13.60 seconds, respectively. The corresponding average times to final grasp were 19.28, 13.74, 18.98, 18.12, and 20.29 seconds, respectively. The average numbers of reaches for the five angles were 1.38, 1.46, 1.61, 1.53, and 1.61, respectively.
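As a quick sanity check on the figures above (the numbers are copied from the reported results; treating "seconds" as the unit is an assumption), the angle minimizing each metric can be read off directly:

```python
# Reported averages, indexed by central-region visual angle (degrees).
angles      = [7.8, 15.6, 23.4, 31.2, 39.0]
first_grasp = [15.87, 11.89, 12.70, 11.94, 13.60]
final_grasp = [19.28, 13.74, 18.98, 18.12, 20.29]
reaches     = [1.38, 1.46, 1.61, 1.53, 1.61]

# Angle achieving the minimum of each metric.
best_first = angles[first_grasp.index(min(first_grasp))]
best_final = angles[final_grasp.index(min(final_grasp))]
best_reach = angles[reaches.index(min(reaches))]

print(best_first, best_final, best_reach)  # 15.6 15.6 7.8
```

The 15.6 degree angle minimizes both grasp times; the smallest angle (7.8 degrees) has marginally fewer reaches (1.38 vs. 1.46) but the slowest first-grasp time.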

 
Conclusions
 

The experiments conducted indicate that subjects can accurately reach for and grasp objects using the OLTS. Furthermore, the experiments indicate that a central-region visual angle of 15.6 degrees yields fast localization and grasping times, as well as a low number of reach attempts per localization task.

 