Personalized Concept-based Image Classification Explanation Framework

Name
Karl-Gustav Kallasmaa
Abstract
In recent years, the adoption of machine learning models has proliferated. Explaining these models is essential for end-users to instill trust and mitigate potential algorithmic biases. Most current interpretability techniques rely predominantly on pixel or feature importance, making their results difficult to explain intuitively to humans. This thesis introduces a novel local concept-based explanation framework designed to explain image classification models. The framework empowers users to create personalized explanations through intelligent concept suggestions. The chosen concepts are used to train a shallow decision tree that serves as a surrogate explanation of the image classifier. Additionally, the framework allows users to request a re-explanation by modifying the concepts and to receive a counterfactual explanation. The framework's effectiveness was tested by explaining the decisions of a ResNet-50 image classifier on the ADE20K dataset. The framework demonstrated higher fidelity than LIME for this dataset and model. Intuitiveness and meaningfulness were measured through human-centric evaluations, which showed that the framework's explanations are more intuitive than those of LIME.
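The core surrogate idea described in the abstract, training a shallow decision tree on user-chosen concepts to mimic a black-box classifier, can be sketched as follows. This is a minimal illustration, not the thesis implementation: the concept indicators, the stand-in black-box predictions, and the tree depth are all assumptions introduced here.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Hypothetical setup: each image is described by binary concept indicators
# (e.g. "sky", "grass", "wheel") chosen by the user. The black-box
# classifier's predicted labels are what the surrogate tree must imitate.
n_images, n_concepts = 200, 5
concepts = rng.integers(0, 2, size=(n_images, n_concepts))

# Stand-in for the black-box model's predictions: here the label depends
# on concepts 0 and 2 (purely illustrative).
black_box_preds = (concepts[:, 0] & concepts[:, 2]).astype(int)

# Shallow decision tree trained to mimic the black box on the concepts.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(concepts, black_box_preds)

# Fidelity: the fraction of images on which the surrogate agrees with
# the black box -- the same kind of metric used to compare against LIME.
fidelity = (surrogate.predict(concepts) == black_box_preds).mean()
print(f"fidelity: {fidelity:.2f}")
```

Because the tree operates on human-nameable concepts rather than raw pixels, its decision path for a single image can be read out as an explanation, and editing the concept set and refitting gives a simple route to re-explanation.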
Graduation Thesis language
English
Graduation Thesis type
Master - Computer Science
Supervisor(s)
Radwa ElShawi
Defence year
2023