We have developed a novel technique to explain the results of an AI system and demonstrate its ability to explain the EyeArt AI eye screening system for diabetic retinopathy (DR). Besides providing clinically consistent and accurate results, AI systems in healthcare must be explainable (able to provide reasons for their decisions); this is paramount to making them trustworthy. The EyeArt system is intended for use by healthcare providers to screen for referable DR using color fundus photographs from diabetic patients aged 18 years or older who do not have persistent visual impairment. It is designed to provide screening results in under a minute, enabling DR screening at the point of care in endocrinology and primary care clinics. The figure shows that the explainability technique highlights the image regions that contribute most significantly to the EyeArt system's results; these highlighted regions correspond to lesions marked by an expert human grader. In a previous study of 107,001 consecutive patient visits, the EyeArt system achieved a sensitivity of 91.3% (95% CI: 90.9%-91.7%), specificity of 91.1% (95% CI: 90.9%-91.3%), NPV of 97.6% (95% CI: 97.5%-97.7%), and PPV of 72.5% (95% CI: 71.9%-73.0%). The work presented here provides evidence that the EyeArt system is explainable and has been designed to report DR screening results based on lesions that the ophthalmology community considers important for DR severity grading.
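The abstract does not describe the internals of the explainability technique, which is proprietary. As a purely illustrative sketch of the general idea of highlighting image regions that contribute most to a model's score, the snippet below implements occlusion sensitivity, a common model-agnostic approach: each image patch is masked in turn, and the drop in the model's score is recorded as that region's importance. The `toy_score` model and all parameters here are hypothetical stand-ins, not the EyeArt system's actual method.

```python
import numpy as np

def occlusion_saliency(image, score_fn, patch=16, stride=16):
    """Heatmap of how much masking each patch reduces the model score.

    image: 2-D grayscale array; score_fn: callable mapping image -> scalar.
    Higher heatmap values mean the region contributes more to the score.
    """
    base = score_fn(image)
    h, w = image.shape
    heat = np.zeros(((h - patch) // stride + 1, (w - patch) // stride + 1))
    for i, y in enumerate(range(0, h - patch + 1, stride)):
        for j, x in enumerate(range(0, w - patch + 1, stride)):
            occluded = image.copy()
            # Replace the patch with the global mean intensity ("gray it out").
            occluded[y:y + patch, x:x + patch] = image.mean()
            heat[i, j] = base - score_fn(occluded)  # score drop = importance
    return heat

# Toy "model": score is the mean intensity over a fixed lesion-like region.
def toy_score(img):
    return img[24:40, 24:40].mean()

img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0  # synthetic bright "lesion"
heat = occlusion_saliency(img, toy_score)
# Patches overlapping the lesion get positive importance; distant patches get 0.
```

In a real deployment, `score_fn` would be the screening model's referable-DR probability for a fundus photograph, and the resulting heatmap would be overlaid on the image to show which regions drove the result.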

N. Parekh: Employee; Self; Eyenuk Inc. M. Bhaskaranand: Employee; Self; Eyenuk Inc. C. Ramachandra: Employee; Self; Eyenuk Inc. S. Bhat: Employee; Self; Eyenuk Inc. K. Solanki: Employee; Self; Eyenuk Inc. Stock/Shareholder; Self; Eyenuk Inc.
National Institutes of Health

Readers may use this article as long as the work is properly cited, the use is educational and not for profit, and the work is not altered. More information is available at http://www.diabetesjournals.org/content/license.