Abstract
Purpose:
To segment geographic atrophy (GA) lesions on spectral-domain optical coherence tomography (SD-OCT) images using deep learning trained with weak labels in the form of atrophy lesion size, and to compare the proposed approach with existing fully supervised pixel-level methods.
Methods:
We constructed a convolutional neural network (CNN) for GA segmentation from 3D OCT volumes. The task of the network is to regress the GA area, expressed as the percentage of the field of view occupied by the lesion. The CNN follows the U-Net architecture and consists of residual blocks. The final output layer has a sigmoid activation followed by a mean operation. The network learns to highlight GA patterns in the image by finding the relationship between the provided weak label and the desired segmentation mask. We compared the performance of our method against a fully supervised version of our algorithm and against a semi-supervised deep-voting model.
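The mechanism can be illustrated with a minimal sketch (written in PyTorch, which the abstract does not specify; the TinyUNet backbone, 2D inputs, and all hyperparameters below are illustrative stand-ins, not the authors' architecture): the network produces a per-pixel sigmoid map whose spatial mean is regressed against the known lesion size, so the segmentation map itself is never directly supervised.

```python
# Weak-supervision sketch: a U-Net-style network outputs a per-pixel
# sigmoid map; its spatial mean is trained to match the lesion size
# (fraction of the field of view). 2D for brevity; the abstract
# describes 3D OCT volumes.
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """Toy encoder-decoder; the real model uses residual blocks."""
    def __init__(self, in_ch=1):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.dec = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),  # 1-channel logit map
        )

    def forward(self, x):
        logits = self.dec(self.enc(x))
        prob_map = torch.sigmoid(logits)        # per-pixel "GA-ness" in [0, 1]
        area_frac = prob_map.mean(dim=(2, 3))   # fraction of FOV covered by GA
        return prob_map, area_frac

model = TinyUNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# One training step on dummy data: images plus scalar lesion-size labels.
images = torch.randn(4, 1, 64, 64)       # stand-in for OCT slices
lesion_frac = torch.rand(4, 1)           # weak label: GA area / FOV area

prob_map, pred_frac = model(images)
loss = loss_fn(pred_frac, lesion_frac)   # regression on area only
opt.zero_grad()
loss.backward()
opt.step()
# At inference, thresholding prob_map yields the GA segmentation mask.
```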
Results:
We analyzed 392 SD-OCT scans acquired at different time points during the follow-up of 37 patients (70 eyes in total) with GA at baseline. Patients were split into training, validation, and test sets at a ratio of 0.68/0.12/0.20, stratified by study center and lesion size. GA was successfully segmented using only the provided lesion size as a label. The proposed method achieved a Dice similarity coefficient (DSC) of 0.62 on the validation dataset, while the fully supervised version achieved a DSC of 0.78 and the semi-supervised version a DSC of 0.50.
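For reference, the DSC between a predicted mask A and a ground-truth mask B is 2|A ∩ B| / (|A| + |B|). A small sketch of how it might be computed on binary masks (the function name and eps smoothing term are our own, not from the abstract):

```python
# Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|) on binary masks.
import numpy as np

def dice(pred: np.ndarray, target: np.ndarray, eps: float = 1e-8) -> float:
    """DSC between two binary masks; 1.0 means perfect overlap."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    return (2.0 * inter) / (pred.sum() + target.sum() + eps)
```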
Conclusions:
We proposed a weakly supervised geographic atrophy segmentation algorithm operating on SD-OCT that utilizes lesion size information during training, and compared this approach to existing ones. The algorithm is able to learn the concept of GA from the regression task alone, without human-annotated pixel masks. The employed method is expected to enable more interpretable deep learning approaches in medicine.
This is a 2020 ARVO Annual Meeting abstract.