Electronic Theses and Dissertations (Masters)
Browsing Electronic Theses and Dissertations (Masters) by Author "Mandindi, Nkcubeko Umzubongile Siphamandla"
Detecting and Understanding COVID-19 Misclassifications: A Deep Learning and Explainable AI Approach
University of the Witwatersrand, Johannesburg, 2023-08
Mandindi, Nkcubeko Umzubongile Siphamandla; Vadapalli, Hima Bindu

Interstitial Lung Disease (ILD) is a catch-all term for over 200 chronic lung diseases, distinguished by inflammation of the lung tissue (pulmonary fibrosis). These diseases are histologically heterogeneous, with inconsistent microscopic appearances, yet their clinical manifestations resemble those of other lung disorders. The overlap in symptoms makes differential diagnosis difficult and may lead to COVID-19 being misdiagnosed as one of the various types of ILD. Because its turnaround time is shorter and it is more sensitive for diagnosis, imaging technology has been cited as a critical detection method in combating the prevalence of COVID-19. The aim of this research is to investigate existing deep learning architectures for this task and to incorporate evaluation modules that determine where and why misclassification occurred. In this study, three widely used deep learning architectures, ResNet-50, VGG-19, and CoroNet, were evaluated for distinguishing COVID-19 from other ILDs (bacterial pneumonia, normal (healthy), viral pneumonia, and tuberculosis). The baseline results demonstrate the effectiveness of CoroNet, with a classification accuracy of 84.02%, specificity of 89.87%, sensitivity of 70.97%, recall of 84.12%, and an F1 score of 0.84. The results further emphasize the effectiveness of transfer learning using pre-trained, domain-specific architectures, which results in fewer learnable parameters. To understand misclassifications, the proposed work used Integrated Gradients (IG), an Explainable AI technique that uses saliency maps to observe pixel-level feature importance, i.e. the visually prominent features in the input images that the model relied on to make its predictions. As a result, the proposed work envisions future research directions for improved classification through an understanding of misclassification.
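The abstract names two techniques that benefit from a concrete picture: transfer learning from a pre-trained backbone and Integrated Gradients attributions. The sketches below are illustrative only, not the thesis code, and assume PyTorch/torchvision with an ImageNet-pretrained ResNet-50 standing in for the fine-tuned classifier; the 5-class head and the placeholder image tensor are hypothetical.

    # Transfer-learning sketch: freeze the pre-trained backbone so only the new
    # 5-class head (COVID-19, bacterial pneumonia, normal, viral pneumonia,
    # tuberculosis) contributes learnable parameters.
    import torch
    import torch.nn as nn
    import torchvision.models as models

    model = models.resnet50(weights="IMAGENET1K_V1")
    for param in model.parameters():
        param.requires_grad = False                       # frozen backbone
    model.fc = nn.Linear(model.fc.in_features, 5)         # trainable head only

Integrated Gradients attributes a prediction to input pixels by integrating the gradient of the target-class score along a straight path from a baseline image (here, an all-black image) to the input, with a Riemann sum approximating the integral. A minimal sketch under the same assumptions:

    def integrated_gradients(model, x, target_class, baseline=None, steps=50):
        """Approximate IG attributions for one image tensor x of shape (1, C, H, W)."""
        if baseline is None:
            baseline = torch.zeros_like(x)                # black-image baseline
        # Interpolate between baseline and input at `steps` points along the path.
        alphas = torch.linspace(0.0, 1.0, steps).view(-1, 1, 1, 1)
        interpolated = (baseline + alphas * (x - baseline)).requires_grad_(True)

        # Gradient of the target-class score with respect to the interpolated inputs.
        score = model(interpolated)[:, target_class].sum()
        grads = torch.autograd.grad(score, interpolated)[0]

        # Riemann-sum approximation of the path integral, scaled by (x - baseline).
        return (x - baseline) * grads.mean(dim=0, keepdim=True)

    model.eval()
    image = torch.rand(1, 3, 224, 224)                    # placeholder for a chest X-ray
    attributions = integrated_gradients(model, image, target_class=0)
    saliency_map = attributions.abs().sum(dim=1)          # (1, H, W) pixel-importance map

The resulting saliency map can be overlaid on the X-ray to show which regions drove a (mis)classification, which is the role IG plays in the work described above.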