Automated vitreous haze grading using ultrawide-field fundus photographs and deep learning
Purpose
To evaluate the performance of a deep learning algorithm for the automated grading of vitritis on ultra-wide field imaging.
Methods
Ultra-wide field (UWF) fundus photographs (Optos®, Dunfermline, United Kingdom) of patients followed for intermediate, posterior, or panuveitis were used. Vitreous haze was graded on each UWF image by two blinded experts according to the 6-step Nussenblatt scale. Included images were automatically classified using the Inception V3 convolutional neural network (Google, California, USA). Images were split into training (70%), validation (15%), and testing (15%) sets, and performance was assessed on the held-out testing set.
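The 70/15/15 split described above can be sketched as follows; this is an illustrative example only (the function name, seed, and the use of image indices are assumptions, not the authors' actual pipeline):

```python
import random

def split_dataset(items, seed=0):
    """Shuffle and split items into 70% training / 15% validation / 15% testing,
    mirroring the proportions described in the Methods (illustrative sketch)."""
    items = list(items)
    random.Random(seed).shuffle(items)
    n = len(items)
    n_train = int(0.70 * n)
    n_val = int(0.15 * n)
    return (items[:n_train],                     # training set
            items[n_train:n_train + n_val],      # validation set
            items[n_train + n_val:])             # held-out testing set

# With the 1181 images of this study, the split would be 826 / 177 / 178.
train, val, test = split_dataset(range(1181))
print(len(train), len(val), len(test))  # 826 177 178
```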
Results
A total of 1181 images from 443 patients were included. The model detected vitritis with good performance: sensitivity was 96% (95% CI: 89-98%) and specificity was 87% (95% CI: 78-89%). The area under the ROC curve was 0.92 (95% CI: 0.78-0.99). For the classification of vitritis into 6 grades, the overall accuracy of the model was 64%, rising to 85% (95% CI: 76-94%) when accepting a maximum error margin of one grade.
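The two accuracy figures above (exact match vs. within one grade) can be computed as sketched below; the function name and the example labels are hypothetical and serve only to illustrate the metric:

```python
def grading_accuracy(y_true, y_pred, margin=0):
    """Fraction of predictions within `margin` grades of the expert label
    on the 6-step Nussenblatt scale (margin=0 gives exact-match accuracy)."""
    hits = sum(abs(t - p) <= margin for t, p in zip(y_true, y_pred))
    return hits / len(y_true)

# Hypothetical expert and model grades on the 0-5 scale, for illustration only.
truth = [0, 1, 2, 3, 4, 5, 2, 1]
pred  = [0, 2, 2, 3, 5, 5, 0, 1]
print(grading_accuracy(truth, pred))            # exact-match accuracy: 0.625
print(grading_accuracy(truth, pred, margin=1))  # within one grade: 0.875
```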
Conclusion
We describe a new deep learning model based on UWF fundus imaging that provides an automated tool for the detection and grading of vitreous haze with good performance.
Conflict of interest
No
Authors 1
Last name
Touhami
Initials of first name(s)
S
Department
Ophthalmology
City
Paris
Country
France
Authors 2
Last name
Mhibik
Initials of first name(s)
B
Department
Ophthalmology
City
Paris
Country
France
Authors 3
Last name
Bchir
Initials of first name(s)
C
Department
Biostatistics
City
Paris
Country
France
Authors 4
Last name
Gulic
Initials of first name(s)
K
Department
Ophthalmology
City
Paris
Country
France
Authors 5
Last name
Bodaghi
Initials of first name(s)
B
Department
Ophthalmology
City
Paris
Country
France