
Abstract #82691 Published in IGR 20-4

Clinical Interpretable Deep Learning Model for Glaucoma Diagnosis

Liao W; Zou B; Zhao R; Chen Y; He Z; Zhou M
IEEE Journal of Biomedical and Health Informatics 2019; 0:


Despite the potential to revolutionize disease diagnosis through data-driven classification, the clinical interpretability of convolutional neural networks (ConvNets) remains challenging. In this paper, a novel, clinically interpretable ConvNet architecture is proposed, not only for accurate glaucoma diagnosis but also for more transparent interpretation by highlighting the distinctive regions recognized by the network. To the best of our knowledge, this is the first work to provide an interpretable diagnosis of glaucoma with a popular deep learning model. We propose a novel scheme, which we refer to as MLAP, for aggregating features from different scales to improve the performance of glaucoma diagnosis. Moreover, by modeling the correspondence from the binary diagnosis to spatial pixels, the proposed scheme generates glaucoma activations that bridge the gap between the global semantic diagnosis and precise localization. In contrast to previous works, it can discover the distinctive local regions in fundus images that serve as evidence for a clinically interpretable glaucoma diagnosis. Experimental results on the challenging ORIGA dataset show that our method outperforms state-of-the-art methods for glaucoma diagnosis, with the highest AUC (0.88). Remarkably, further results, optic disc segmentation (Dice of 0.9) and localization of local disease foci based on the evidence map, demonstrate the effectiveness of our method for clinical interpretability.
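
The abstract does not spell out the architecture, so the following is only a minimal PyTorch-style sketch of the general idea it describes: aggregating globally pooled features from several scales of a CNN backbone for classification, then projecting the classifier weights back onto the fused feature map to obtain a CAM-style "evidence map" that links the binary diagnosis to image pixels. The class name MultiScaleCAMNet, the layer counts, and the channel sizes are illustrative assumptions, not the authors' MLAP implementation.

    # Sketch: multi-scale feature aggregation + CAM-style evidence map (assumed design)
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MultiScaleCAMNet(nn.Module):
        def __init__(self, num_classes: int = 2):
            super().__init__()
            # Three convolutional stages at decreasing spatial resolution (assumed depths).
            self.stage1 = nn.Sequential(nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU())
            self.stage2 = nn.Sequential(nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU())
            self.stage3 = nn.Sequential(nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU())
            # 1x1 projections so every scale contributes the same number of channels.
            self.proj = nn.ModuleList([nn.Conv2d(c, 128, 1) for c in (32, 64, 128)])
            # Linear classifier over the aggregated, globally averaged features.
            self.fc = nn.Linear(128, num_classes, bias=False)

        def forward(self, x):
            f1 = self.stage1(x)
            f2 = self.stage2(f1)
            f3 = self.stage3(f2)
            # Resize each projected scale to the coarsest resolution and average them
            # (a simple stand-in for multi-scale feature aggregation).
            target = f3.shape[-2:]
            maps = [F.interpolate(p(f), size=target, mode="bilinear", align_corners=False)
                    for p, f in zip(self.proj, (f1, f2, f3))]
            fused = torch.stack(maps, dim=0).mean(dim=0)          # (N, 128, H, W)
            pooled = F.adaptive_avg_pool2d(fused, 1).flatten(1)   # (N, 128)
            logits = self.fc(pooled)
            # CAM-style evidence map: weight the fused feature map by the classifier
            # weights of the predicted class, tying the diagnosis to spatial locations.
            w = self.fc.weight[logits.argmax(dim=1)]              # (N, 128)
            evidence = torch.einsum("nc,nchw->nhw", w, fused)
            return logits, evidence

In practice the evidence map would be upsampled to the input resolution and overlaid on the fundus image to highlight regions such as the optic disc; how the authors derive their segmentation and localization results from the activations is not described in this abstract.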

Classification:

6.9.5 Other (Part of: 6 Clinical examination methods > 6.9 Computerized image analysis)
6.30 Other (Part of: 6 Clinical examination methods)
2.13 Retina and retinal nerve fibre layer (Part of: 2 Anatomical structures in glaucoma)


