Abstract
Purpose:
As a foundation for surgical guidance tools, we sought to develop an agent capable of autonomously identifying the steps and phases of phacoemulsification cataract surgery in real time, together with pupil segmentation, such that the generated output can inform surgical decision making.
Methods:
Heterogeneous surgical videos were annotated by ophthalmic surgeons in order to achieve robust pupil detection, phase identification, and tissue segmentation by the algorithm.
The application acquires video frames in real time from a surgical microscope-based video capture device. A Region-Based Convolutional Neural Network (R-CNN) then performs the following functions for each analyzed frame (Figure 1; an illustrative sketch of this per-frame loop follows the list):
i) pupil location and area
ii) surgical phase identification according to the instruments in use
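For illustration only, the sketch below outlines what such a per-frame loop might look like, assuming a torchvision Mask R-CNN fine-tuned on a pupil class and a few instrument classes; the class names, instrument-to-phase mapping, weights file, and capture-device index are hypothetical placeholders, not the implementation described in this abstract.

```python
# Illustrative per-frame inference loop (not the authors' implementation).
import cv2
import torch
import torchvision

# Hypothetical class list and instrument-to-phase mapping.
CLASSES = ["background", "pupil", "cystotome", "phaco_handpiece", "ia_handpiece"]
INSTRUMENT_TO_PHASE = {
    "cystotome": "capsulorhexis",
    "phaco_handpiece": "phacoemulsification",
    "ia_handpiece": "cortex removal",
}

model = torchvision.models.detection.maskrcnn_resnet50_fpn(num_classes=len(CLASSES))
model.load_state_dict(torch.load("phaco_rcnn.pth"))  # hypothetical fine-tuned weights
model.eval()

cap = cv2.VideoCapture(0)  # microscope-based capture device (index is illustrative)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    tensor = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
    with torch.no_grad():
        pred = model([tensor])[0]

    pupil_area_px, phase = None, "idle"
    for label, score, mask in zip(pred["labels"], pred["scores"], pred["masks"]):
        if score < 0.5:
            continue
        name = CLASSES[int(label)]
        if name == "pupil":
            pupil_area_px = int((mask[0] > 0.5).sum())  # i) pupil location and area
        elif name in INSTRUMENT_TO_PHASE:
            phase = INSTRUMENT_TO_PHASE[name]           # ii) phase from instrument in use
    # pupil_area_px and phase would feed downstream surgical guidance logic here
```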
Results:
1. We evaluated the performance of the R-CNN by comparing its output with annotations of surgical videos made by ophthalmic surgeons (Table 1). We achieved high accuracy, precision, and sensitivity across each of the four phases (idle, capsulorhexis, phacoemulsification, and cortex removal), leading to F1-scores above 90%.
2. There was also strong agreement between the graders’ assessment of pupil size and the pupil area detected by the algorithm, yielding a precision, sensitivity, and intersection over union (IoU) of 82.07%, 87.19%, and 95.14%, respectively.
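For reference, one common set of pixel-level definitions for the precision, sensitivity, F1-score, and IoU values quoted above is sketched below; this is a generic illustration of the metrics, not the authors' evaluation code, and the same precision/sensitivity/F1 relations apply at the frame level for the per-phase results.

```python
import numpy as np

def pupil_mask_metrics(pred_mask: np.ndarray, grader_mask: np.ndarray):
    """Pixel-level precision, sensitivity (recall), F1-score, and IoU between a
    predicted binary pupil mask and a grader-annotated binary mask."""
    pred = pred_mask.astype(bool)
    truth = grader_mask.astype(bool)
    tp = np.logical_and(pred, truth).sum()    # pixels both mark as pupil
    fp = np.logical_and(pred, ~truth).sum()   # predicted pupil, grader disagrees
    fn = np.logical_and(~pred, truth).sum()   # grader pupil, prediction missed
    precision = tp / (tp + fp)
    sensitivity = tp / (tp + fn)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    iou = tp / (tp + fp + fn)                 # intersection over union
    return precision, sensitivity, f1, iou
```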
3. The algorithm executed these tasks at an average processing speed of 82±20 frames per second (FPS), well above the 60 FPS at which most contemporary surgical visualization systems display images.
Conclusions:
To our knowledge, no current machine learning solution combines phase identification with pupil tracking. We have developed a platform that provides the foundation for a real-time surgical guidance tool for phacoemulsification cataract surgery, using object detection and classification to identify the surgical phase. Future machine learning-based tools can build on these capabilities to create novel surgical guidance and feedback mechanisms.
This is a 2021 ARVO Annual Meeting abstract.