This repository contains the core code for our research on explainable AI applied to single-cell malaria diagnosis. At its heart is a full reproduction of the Soft-Attention Parallel CNN (SPCNN) from Ahamed et al. (2025) — including network architecture and training on the NIH malaria image dataset. Since the original authors did not release their code, we implemented and validated SPCNN end-to-end (noting small performance gaps likely due to unavailable hyperparameter details).
Building on that foundation, we integrate five complementary XAI techniques (Grad-CAM, Grad-CAM++, SHAP-Gradient, SHAP-Deep, and LIME) and provide interactive demo notebooks to generate, visualize, and compare their explanations. Our goal is to go beyond heatmaps: by engaging domain experts, we’ll assess each method’s clarity, usefulness, and trustworthiness in real-world diagnostic workflows.
- `configs/`: hyperparameters & YAML files
- `data/`: loader scripts and data transformations
- `docs/`: additional documentation for the website
- `models/`: SPCNN model and model factory
- `notebooks/`: demo notebook with demo dataset
- `explainability/`: Grad-CAM, Grad-CAM++, SHAP, LIME wrappers
- `training/`: training pipelines for SPCNN
- `scripts/`: scripts for running SPCNN and XAI
- `utils/`: helpers
Dataset
We use the NLM-Falciparum-Thin-Cell-Images dataset (27,558 cropped RGB images of red blood cells from Giemsa-stained thin blood smears), provided by the Lister Hill National Center for Biomedical Communications (LHNCBC), U.S. National Library of Medicine (2019), with expert annotations from the Mahidol Oxford Tropical Medicine Research Unit. The data is available on the NLM malaria datasheet.
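In the standard NIH distribution, the dataset ships as two class folders, `Parasitized/` and `Uninfected/`. A minimal indexing sketch, assuming that folder layout (the helper `index_cell_images` is ours for illustration, not part of this repo):

```python
from pathlib import Path

# Map each class subfolder to an integer label.
# NOTE: folder names assume the standard NIH "cell_images" layout;
# adjust if your local copy is organized differently.
LABELS = {"Parasitized": 1, "Uninfected": 0}

def index_cell_images(root):
    """Return a sorted list of (image_path, label) pairs found under root."""
    root = Path(root)
    pairs = []
    for folder, label in LABELS.items():
        # The NIH images are PNG files, one cell per image.
        for img in sorted((root / folder).glob("*.png")):
            pairs.append((img, label))
    return pairs
```

The resulting list can be split into train/validation/test sets and fed to the loaders in `data/`.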
Baseline Model
A faithful reimplementation of the Soft Attention Parallel Convolutional Neural Network (SPCNN) from Ahamed et al., Scientific Reports (2025), including all architectural details and those hyperparameters that are publicly available.
Explainability Methods
Integrated wrappers for Grad-CAM, Grad-CAM++, SHAP (Deep and Gradient variants), and LIME to generate both visual heatmaps and quantitative feature attributions.
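At its core, Grad-CAM weights each convolutional feature map by the global-average-pooled gradient of the class score and keeps only the positive evidence. A minimal, framework-free sketch of that weighting step (pure Python for clarity; in practice the activations and gradients come from a hook on the target conv layer):

```python
def grad_cam_map(activations, gradients):
    """Combine conv feature maps into a Grad-CAM heatmap.

    activations, gradients: lists of K feature maps, each an HxW
    list-of-lists, taken from the target conv layer. Returns an HxW
    heatmap = ReLU(sum_k alpha_k * A_k), where alpha_k is the mean
    gradient of channel k (global average pooling).
    """
    h, w = len(activations[0]), len(activations[0][0])
    heatmap = [[0.0] * w for _ in range(h)]
    for act, grad in zip(activations, gradients):
        # alpha_k: average the gradient over all spatial positions
        alpha = sum(sum(row) for row in grad) / (h * w)
        for i in range(h):
            for j in range(w):
                heatmap[i][j] += alpha * act[i][j]
    # ReLU: keep only features that positively support the class
    return [[max(0.0, v) for v in row] for row in heatmap]
```

Grad-CAM++ replaces the uniform pooling of gradients with pixel-wise weights; the wrappers in `explainability/` handle both variants plus upsampling the heatmap to the input resolution.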
Try out our interactive demo to see all five XAI methods in action:
- Open `notebooks/demo.ipynb` in your browser (e.g., upload it to Google Colab for zero-install execution).
- Run each cell to load the pretrained SPCNN, generate Grad-CAM, Grad-CAM++, SHAP-Gradient, SHAP-Deep, and LIME explanations, and compare them.
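When comparing the five methods side by side, a simple quantitative check is how much their most-salient pixels agree. A small sketch of one such metric, top-k overlap between two flattened attribution maps (our own illustrative helper, not part of the demo notebook):

```python
def top_k_overlap(map_a, map_b, k):
    """Fraction of the k most-salient positions shared by two heatmaps.

    map_a, map_b: flat lists of attribution scores for the same pixels.
    Returns |top_k(a) & top_k(b)| / k, a value in [0, 1].
    """
    top_a = set(sorted(range(len(map_a)), key=lambda i: map_a[i], reverse=True)[:k])
    top_b = set(sorted(range(len(map_b)), key=lambda i: map_b[i], reverse=True)[:k])
    return len(top_a & top_b) / k
```

A score near 1.0 means two methods highlight the same regions of the cell; low scores flag disagreements worth showing to domain experts.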
To run locally:
- Clone this repo
- Install dependencies with `conda env create -f environment.yaml`, then `conda activate xai4malaria-demo`
- Launch Jupyter and open `notebooks/demo.ipynb`
This project is the result of a joint effort between:
- Universidad Carlos III de Madrid, Neuroscience & Biomedical Sciences Department
  - Prof. Arrate Muñoz-Barrutia
  - Dr. Caterina Fuster-Barceló
- Universitat de les Illes Balears, Department of Mathematics & Computer Science
  - Dr. Cristina Suemay Manresa Yee
  - Dr. Silvia Ramis Guarinos