Abstract
Purpose:
To develop a list of everyday items, based on the Common Objects in Context (COCO) dataset, that are highly useful to people with native or artificial ultra-low vision (ULV), so that effective computer vision algorithms can be developed for object identification.
Methods:
10 participants with ULV (VA 20/1600) and 5 Argus II users were presented with 150 items from the COCO dataset and asked to rate each object as very useful, useful, neutral, not useful, or definitely not useful, scored 2, 1, 0, -1, and -2, respectively. They were also asked about their preferences for system notifications (visual, speech, sound, and haptic) in passive mode (general scanning) and active mode (scanning for a specific object).
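The rating scheme above maps each Likert label to an integer score, so an object's overall usefulness can be summarized by averaging its scores across participants. A minimal sketch of that aggregation (the function name and example ratings are hypothetical, not from the study data):

```python
# Map each rating label to its integer score, as described in Methods.
RATING_SCORES = {
    "very useful": 2,
    "useful": 1,
    "neutral": 0,
    "not useful": -1,
    "definitely not useful": -2,
}

def mean_usefulness(ratings):
    """Average the integer scores for one object's list of rating labels."""
    scores = [RATING_SCORES[r] for r in ratings]
    return sum(scores) / len(scores)

# Hypothetical example: four participants rate one COCO object.
print(mean_usefulness(["very useful", "useful", "useful", "neutral"]))  # → 1.0
```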
Results:
Participants with native ULV and Argus II users rated more than 50% of the items in the dataset as useful or very useful. The objects reported as most useful were similar in both groups; the top 5 objects that would benefit from an object finder device were phone, empty seat, remote control, person, and toilet. The highest-rated category was person, and the lowest-rated category was animal (Fig 1). All participants preferred speech and sound notifications over visual or haptic notifications for both passive and active modes, irrespective of their level of residual vision (1.4 to 3.5 logMAR).
Conclusions:
We developed a priority list of useful everyday objects, based on the COCO dataset, for people with native ULV and Argus II users. This list will be used to develop an object finder system using machine learning algorithms for people with ULV. Using the subjective ratings from this study, performance measures will be developed to determine the effect of an object finder system on functional performance and accessibility in people with ULV and visual prostheses. Future studies will compare the effect of different notification modalities on functional performance in people with ULV and visual prostheses.
This is a 2021 ARVO Annual Meeting abstract.