Diabetic peripheral neuropathy (DPN) is characterized by pain and sensory loss and affects approximately 50% of patients with diabetes (1). Early identification and risk factor management are key to limiting progression of DPN. However, whereas retinopathy and nephropathy can be detected early with retinal fundus imaging and microalbuminuria, respectively, the 10-g monofilament identifies only advanced DPN. Corneal confocal microscopy (CCM) is an ophthalmic imaging technique that identifies subclinical corneal nerve loss, which predicts incident DPN (2), and it has good diagnostic utility for DPN (3). It also identifies corneal nerve regeneration prior to improvement in symptoms and nerve conduction studies after simultaneous pancreas and kidney transplantation (4). CCM studies have primarily used manual corneal nerve analysis (CCMetrics), which, although highly reliable, is time-consuming and has limited scalability.

Here, we combine a deep learning (DL) algorithm for fully automated quantification of corneal nerves in CCM images with an adaptive neuro-fuzzy inference system (ANFIS) to rapidly differentiate patients with (DPN+) and without (DPN−) neuropathy and healthy control subjects. Participants with type 1 diabetes (n = 87) and control subjects (n = 21) underwent detailed assessment of neuropathy (Table 1). Based on the Toronto criteria, which combine symptoms, signs, and abnormal nerve conduction, patients were subdivided into DPN+ (29%) and DPN− (71%) groups (Table 1). Participants underwent CCM, and 6–8 central corneal nerve images per subject were quantified using our established methodology (5). DL performance in quantifying corneal nerve fiber length (CNFL) was assessed against 1) the gold-standard manual method (CCMetrics), from which ground truth was derived, and 2) an established automated method (ACCMetrics).

Table 1

Demographic and clinical characteristics based on Toronto classification of DPN

| Variable | Controls | DPN− | DPN+ | P value |
| --- | --- | --- | --- | --- |
| n | 21 | 62 | 25 | |
| Age (years) | 44.9 ± 11.1 | 41.4 ± 14.8 | 62 ± 10.8a,b | <0.0001 |
| Diabetes duration (years) | — | 25.9 ± 18.9 | 45 ± 11.2b | <0.0001 |
| HbA1c (mmol/mol) | 36.1 ± 4.8 | 67.2 ± 16.1a | 65.0 ± 15.1a | <0.0001 |
| HbA1c (%) | 5.4 ± 0.4 | 8.3 ± 1.7a | 8.1 ± 1.8a | <0.0001 |
| BMI (kg/m²) | 27.4 ± 4.2 | 26.1 ± 4.7 | 27.3 ± 4 | NS |
| BP systolic (mmHg) | 127.1 ± 21.3 | 131.6 ± 18 | 147 ± 26.2a,b | 0.007–0.004 |
| BP diastolic (mmHg) | 72.7 ± 9.0 | 70.6 ± 8.9 | 73.6 ± 10.2 | NS |
| ACR (mg/mmol) | 0.7 ± 0.5 | 2.4 ± 6.4 | 11.9 ± 17.7a,b | 0.0008–0.0007 |
| eGFR (mL/min/1.73 m²) | 84.3 ± 8.4 | 86.6 ± 10 | 62.3 ± 28.1a,b | <0.0001 |
| TC (mmol/L) | 5.2 ± 0.7 | 4.4 ± 0.8a | 4.4 ± 0.9a | 0.005–0.0008 |
| Triglycerides (mmol/L) | 1.4 ± 0.6 | 1.1 ± 0.7 | 1.3 ± 0.7 | NS |
| NSP | 0.4 ± 1.1 | 2 ± 3.3 | 7.9 ± 6.7a,b | <0.0001 |
| NDS | 0.3 ± 0.5 | 1.4 ± 1.7a | 7.1 ± 2a,b | 0.01–0.0001 |
| VPT (V) | 5 ± 3.3 | 8.1 ± 6 | 29.5 ± 11.8a,b | <0.0001 |
| SSNA (μV) | 22.6 ± 10.1 | 12.3 ± 6.9a | 4.2 ± 2.9a | 0.0006–0.0001 |
| SSNCV (m/s) | 51.2 ± 5.3 | 45.6 ± 4.4a | 40 ± 5.2a,b | 0.0002–0.0001 |
| PMNA (μV) | 5.7 ± 2.1 | 5.6 ± 7.7 | 1.5 ± 1.2b | 0.02 |
| PMNCV (m/s) | 48.5 ± 4.3 | 43.6 ± 3.3a | 36.1 ± 6.1a,b | <0.0001 |
| CNFL, CCMetrics (pixels) | 1,849 ± 213.5 | 1,311 ± 326.9a | 934.2 ± 479.8a,b | <0.0001 |
| CNFL, DL (pixels) | 1,756 ± 188.2 | 1,264 ± 319.7a | 872 ± 468a,b | <0.0001 |
| CNFL, ACCMetrics (pixels) | 1,492 ± 168.4 | 1,021 ± 279.9a | 686.7 ± 356.3a,b | <0.0001 |

Data are expressed as mean ± SD. ACR, albumin-to-creatinine ratio; BP, blood pressure; eGFR, estimated glomerular filtration rate; NDS, neuropathy disability score; NSP, neuropathy symptom profile; PMNA, peroneal motor nerve amplitude; PMNCV, peroneal motor nerve conduction velocity; SSNA, sural sensory nerve amplitude; SSNCV, sural sensory nerve conduction velocity; TC, total cholesterol; VPT, vibration perception threshold.

a: Significantly different from control subjects.

b: Significantly different from DPN− patients.

The DL algorithm uses a U-Net–based convolutional neural network, an architecture that achieves precise segmentation from relatively small training sets. The algorithm was initially trained on 25% (n = 174 images) of the data set (learning rate 0.0001) for 30 h (Intel Core i3-6100) and then validated on the remaining images (n = 534). Input training images were cropped to 256 × 256 pixels, binarized, and skeletonized, resulting in a segmented output image in which nerve pixels have a value of 1 and non-nerve pixels a value of 0.
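The crop–binarize–skeletonize pipeline can be sketched as follows. This is an illustrative reconstruction, not the paper's trained network: the threshold stands in for the learned segmentation, the function names are our own, and thinning uses the classic Zhang-Suen algorithm rather than the U-Net output.

```python
import numpy as np

def zhang_suen_thin(binary: np.ndarray) -> np.ndarray:
    """Reduce a binary mask to one-pixel-wide centrelines (Zhang-Suen)."""
    img = binary.astype(np.uint8).copy()
    changed = True
    while changed:
        changed = False
        for step in range(2):
            to_delete = []
            for y in range(1, img.shape[0] - 1):
                for x in range(1, img.shape[1] - 1):
                    if img[y, x] != 1:
                        continue
                    # Neighbours P2..P9, clockwise from north.
                    p = [img[y-1, x], img[y-1, x+1], img[y, x+1],
                         img[y+1, x+1], img[y+1, x], img[y+1, x-1],
                         img[y, x-1], img[y-1, x-1]]
                    b = sum(p)                        # nonzero neighbours
                    a = sum(p[i] == 0 and p[(i + 1) % 8] == 1
                            for i in range(8))        # 0 -> 1 transitions
                    if not (2 <= b <= 6 and a == 1):
                        continue
                    if step == 0:
                        ok = p[0]*p[2]*p[4] == 0 and p[2]*p[4]*p[6] == 0
                    else:
                        ok = p[0]*p[2]*p[6] == 0 and p[0]*p[4]*p[6] == 0
                    if ok:
                        to_delete.append((y, x))
            for y, x in to_delete:
                img[y, x] = 0
            changed = changed or bool(to_delete)
    return img

def quantify_cnfl(image: np.ndarray, threshold: float = 0.5) -> int:
    """Central crop to 256 x 256, binarize, skeletonize; CNFL = nerve pixels."""
    h, w = image.shape
    top, left = (h - 256) // 2, (w - 256) // 2
    crop = image[top:top + 256, left:left + 256]
    binary = crop > threshold          # nerve pixels -> 1, non-nerve -> 0
    return int(zhang_suen_thin(binary).sum())
```

Skeletonization matters here because CNFL is a length measure: counting pixels of a thick nerve mask would conflate fiber width with fiber length, whereas a one-pixel-wide centreline makes the pixel count proportional to length.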

DL-estimated CNFL was comparable to CCMetrics (1,326 ± 459 vs. 1,269 ± 444 pixels, P > 0.05) and significantly outperformed ACCMetrics (1,036 ± 385 pixels, P < 0.0001), with higher pixel detection sensitivity (85% vs. 70%) and a lower false-negative rate (15% vs. 30%) relative to ground truth. The intraclass correlation coefficient indicated excellent reproducibility for DL segmentation (0.98, P = 0.0001) and good, though lower, reproducibility for ACCMetrics (0.85, P = 0.0001).
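The reported per-pixel sensitivity and false-negative rate follow directly from comparing a predicted nerve mask with the ground-truth mask. A minimal sketch (the function name is ours):

```python
import numpy as np

def pixel_detection_metrics(pred: np.ndarray, truth: np.ndarray):
    """Per-pixel sensitivity and false-negative rate of a predicted
    nerve mask against a ground-truth nerve mask (both binary arrays)."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()    # nerve pixels detected
    fn = np.logical_and(~pred, truth).sum()   # nerve pixels missed
    sensitivity = tp / (tp + fn)
    return float(sensitivity), float(fn / (tp + fn))  # FNR = 1 - sensitivity
```

Because sensitivity and false-negative rate sum to 1 over the ground-truth nerve pixels, the paired figures above (85%/15% for DL, 70%/30% for ACCMetrics) are two views of the same comparison.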

ANFIS harnesses the learning ability of artificial neural networks and the logic-based reasoning of fuzzy systems and was deployed to classify participants as DPN+, DPN−, or healthy control subjects (Fig. 1A and B). A multiclass ANFIS model was constructed using "one versus all" and "one versus one" classifiers. The first classifier was trained to differentiate control (labeled 0) from DPN− (labeled 1) and DPN+ (labeled 2) participants using a one-versus-all approach. The second classifier was trained to differentiate DPN− from DPN+ participants using a one-versus-one approach. Both classifiers were trained for 20 epochs (step size, 0.1; step increase size, 1.01). Based on CCM images, ANFIS classified 43% of participants as DPN+ with excellent reliability (Cohen κ = 0.86, P < 0.0001). Receiver operating characteristic curve analysis showed the following capacity for discriminating 1) DPN− from control subjects: area under the curve (AUC) = 0.86 (95% CI 0.77–0.94, P < 0.0001) with 84% sensitivity/71% specificity; 2) DPN− from DPN+: AUC = 0.95 (95% CI 0.91–0.99, P < 0.0001) with 92% sensitivity/80% specificity; and 3) control subjects from DPN+: AUC = 1.0 (95% CI 0.99–1.0, P < 0.0001) with 100% sensitivity/95% specificity (Fig. 1C and D). Model size analysis showed that the combined segmentation (DL)–classification (ANFIS) system occupied 38 MB.
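The hierarchical decision flow of the two classifiers can be sketched as a simple cascade on CNFL. This sketch shows only the one-versus-all then one-versus-one structure; the cutoffs are illustrative placeholders, not the trained ANFIS membership functions.

```python
def classify_dpn(cnfl: float,
                 control_cutoff: float = 1500.0,
                 dpn_cutoff: float = 1000.0) -> int:
    """Two-stage cascade: returns 0 = control, 1 = DPN-, 2 = DPN+.

    Stage 1 mirrors the one-vs-all split (control vs. either diabetes
    group); stage 2 mirrors the one-vs-one split (DPN- vs. DPN+).
    Cutoff values are hypothetical, chosen only to match the ordering
    of group means in Table 1 (controls > DPN- > DPN+).
    """
    if cnfl >= control_cutoff:   # stage 1: little nerve loss -> control
        return 0
    if cnfl >= dpn_cutoff:       # stage 2: moderate loss -> DPN-
        return 1
    return 2                     # marked loss -> DPN+
```

Cascading the easy split (control vs. any neuropathy) before the hard one (DPN− vs. DPN+) means the second classifier only ever sees the two groups it was trained to separate, which is the rationale for the hierarchical design in Fig. 1A.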

Figure 1

A: The hierarchical ANFIS classification model. CCM images from the training subset are fed into an initial neuro-fuzzy inference system (INFIS), and the results are evaluated on the first validation data set. The best-performing cutoff points are determined by subsequent fuzzy inference systems (FIS), and performance of the final system is evaluated on the second validation set. B: The hierarchical ANFIS prediction workflow is then applied to classify DPN based on the extent of CNFL loss. C: Representative CCM images from a healthy control and DPN− and DPN+ patients. D: Receiver operating characteristic curve analysis for identification of DPN by ANFIS based on DL segmented CCM images.


We show that automated DL image segmentation has excellent agreement with manual expert analysis for the quantification of corneal nerves. This is important, as rapid and accurate automated quantification is key for successful deployment of CCM in large-scale, multicenter studies and clinical trials of disease-modifying therapies in DPN (4). Furthermore, DPN classification with ANFIS was highly accurate in discriminating control subjects from DPN− and particularly DPN+ patients, surpassing previous diagnostic outcomes (3). Indeed, the ANFIS model identified corneal nerve loss in patients deemed to have no DPN by the Toronto criteria, which rely on nerve conduction and miss early small nerve fiber involvement (5). Patients with DPN were older, and given that increasing age is associated with corneal nerve loss, this may have contributed to the outcomes of this study. In conclusion, this artificial intelligence model could discriminate patients with DPN from control subjects with almost perfect diagnostic outcomes, indicating the considerable potential of CCM in screening for DPN.

T.S. and I.N.P. are joint first authors. R.A.M. and U.A.Q. are joint senior authors.

Acknowledgments. M. Tavakoli (University of Exeter) undertook corneal confocal microscopy and H. Fadavi (Imperial College) undertook neuropathy assessments and quantitative sensory testing in a portion of study participants.

Funding. This research was funded by awards from the National Institutes of Health (R105991) and the JDRF (27-2008-362).

Duality of Interest. No potential conflicts of interest relevant to this article were reported.

Author Contributions. T.S. and I.N.P. prepared the first draft. T.S., I.N.P., M.F., G.P., O.A., U.A., and Z.R.M. acquired and analyzed data. All authors assisted in interpretation of results. S.K., N.E., R.A.M., and U.A.Q. critically revised the manuscript and approved the final version of the article. R.A.M. and U.A.Q. conceptualized and designed the study and supervised the analysis. R.A.M. and U.A.Q. are the guarantors of this work and, as such, had full access to all the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.

1. Pop-Busui R, Boulton AJ, Feldman EL, et al. Diabetic neuropathy: a position statement by the American Diabetes Association. Diabetes Care 2017;40:136–154

2. Pritchard N, Edwards K, Russell AW, Perkins BA, Malik RA, Efron N. Corneal confocal microscopy predicts 4-year incident peripheral neuropathy in type 1 diabetes. Diabetes Care 2015;38:671–675

3. Perkins BA, Lovblom LE, Bril V, et al. Corneal confocal microscopy for identification of diabetic sensorimotor polyneuropathy: a pooled multinational consortium study. Diabetologia 2018;61:1856–1861

4. Azmi S, Jeziorska M, Ferdousi M, et al. Early nerve fibre regeneration in individuals with type 1 diabetes after simultaneous pancreas and kidney transplantation. Diabetologia 2019;62:1478–1487

5. Petropoulos IN, Alam U, Fadavi H, et al. Corneal nerve loss detected with corneal confocal microscopy is symmetrical and related to the severity of diabetic polyneuropathy. Diabetes Care 2013;36:3646–3651
Readers may use this article as long as the work is properly cited, the use is educational and not for profit, and the work is not altered. More information is available at https://www.diabetesjournals.org/content/license.