Brain Tumor Classification Using Interpretable Deep Learning: A Step Toward Explainable AI in Medical Imaging
Author: Altamash Mannikeri
Abstract: Brain tumors rank among the most aggressive and potentially fatal medical conditions, making early and accurate detection crucial for improving survival outcomes. Accurate identification and classification of brain tumors enable timely treatment planning, offering a better chance at recovery and quality of life for affected patients. Magnetic Resonance Imaging (MRI) is the preferred method for diagnosing brain tumors, but because tumor characteristics are complex, manual analysis of these scans is frequently laborious, error-prone, and inconsistent. This research applies Convolutional Neural Networks (CNNs), specifically ResNet-50, to address these challenges in tumor classification. The ResNet-50 model achieves high accuracy in detecting and categorizing brain tumors into types such as glioma, meningioma, and pituitary adenoma, as well as distinguishing healthy brain cases without tumors. Grad-CAM is employed to enhance interpretability by visually highlighting the regions of the MRI scan that influence the model's predictions. This transparency is essential for gaining the trust of medical professionals, as it addresses the clinical need for accountability and supports more informed diagnostic decisions.
Keywords: Explainable AI (XAI), Convolutional Neural Networks (CNN), Medical Imaging, Grad-CAM, ResNet-50, MobileNet-V2, Brain Tumor Analysis.
Conference Name: International Conference on Health and Medicine (ICHM-25)
Conference Place: Hyderabad, India
Conference Date: 11th May 2025
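As a concrete illustration of the workflow summarized in the abstract, the sketch below applies Grad-CAM to a ResNet-50 classifier in PyTorch. It is a minimal example under stated assumptions (an ImageNet-pretrained backbone with a replaced four-class head, 224×224 inputs, and a hypothetical image path); it is not a reproduction of the paper's training pipeline or reported results.

```python
# Minimal Grad-CAM sketch for a ResNet-50 brain-tumor classifier (illustrative only;
# the paper's exact data, weights, and preprocessing are not reproduced here).
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

NUM_CLASSES = 4  # glioma, meningioma, pituitary adenoma, no tumor (as listed in the abstract)

# Assumption: ImageNet-pretrained ResNet-50 with a replaced classification head.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = torch.nn.Linear(model.fc.in_features, NUM_CLASSES)
model.eval()

# Hooks capture the activations and gradients of the last convolutional block,
# which Grad-CAM uses to weight spatial locations by their importance.
activations, gradients = {}, {}

def fwd_hook(module, inputs, output):
    activations["value"] = output

def bwd_hook(module, grad_input, grad_output):
    gradients["value"] = grad_output[0]

target_layer = model.layer4[-1]
target_layer.register_forward_hook(fwd_hook)
target_layer.register_full_backward_hook(bwd_hook)

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def grad_cam(image_path: str) -> torch.Tensor:
    """Return a 224x224 heatmap in [0, 1] highlighting regions driving the top prediction."""
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    logits = model(x)
    top_class = logits.argmax(dim=1).item()
    model.zero_grad()
    logits[0, top_class].backward()

    # Global-average-pool the gradients to obtain one weight per feature-map channel,
    # then take the ReLU of the weighted sum of activations (the Grad-CAM map).
    weights = gradients["value"].mean(dim=(2, 3), keepdim=True)
    cam = F.relu((weights * activations["value"]).sum(dim=1, keepdim=True))
    cam = F.interpolate(cam, size=(224, 224), mode="bilinear", align_corners=False)
    cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)
    return cam.squeeze()

# Example usage (the file path is hypothetical):
# heatmap = grad_cam("mri_slice.png")
```

In practice, the returned heatmap would be overlaid on the input MRI slice so clinicians can see which regions of the scan influenced the predicted tumor class, which is the interpretability step the abstract describes.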