Reimplementation of the SPCNN model for malaria detection using blood smear images, with an extended comparative evaluation of explainability techniques (Grad-CAM, SHAP, LIME, etc.) in human-in-the-loop settings. Focused on user-centered validation with biologists and clinicians.


XAI4Malaria

Introduction

This repository contains the core code for our research on explainable AI applied to single-cell malaria diagnosis. At its heart is a full reproduction of the Soft-Attention Parallel CNN (SPCNN) from Ahamed et al. (2025) — including network architecture and training on the NIH malaria image dataset. Since the original authors did not release their code, we implemented and validated SPCNN end-to-end (noting small performance gaps likely due to unavailable hyperparameter details).

Building on that foundation, we integrate five complementary XAI techniques (Grad-CAM, Grad-CAM++, SHAP-Gradient, SHAP-Deep, and LIME) and provide interactive demo notebooks to generate, visualize, and compare their explanations. Our goal is to go beyond heatmaps: by engaging domain experts, we’ll assess each method’s clarity, usefulness, and trustworthiness in real-world diagnostic workflows.

📁 Project Structure

  • configs/: hyperparameters and YAML files
  • data/: loader scripts and data transformations
  • docs/: additional docs for the website
  • models/: SPCNN model and model factory
  • notebooks/: demo notebook with demo dataset
  • explainability/: Grad-CAM, Grad-CAM++, SHAP, and LIME wrappers
  • training/: training pipelines for SPCNN
  • scripts/: scripts for running SPCNN and the XAI methods
  • utils/: helper functions

🎯 What’s Inside

  • Dataset
    We use the NLM-Falciparum-Thin-Cell-Images dataset: 27,558 cropped RGB images of red blood cells from Giemsa-stained thin blood smears, provided by the Lister Hill National Center for Biomedical Communications (LHNCBC), U.S. National Library of Medicine (2019), with expert annotations from the Mahidol Oxford Tropical Medicine Research Unit. The data are available from the NLM malaria dataset page.

  • Baseline Model
    A faithful reimplementation of the Soft-Attention Parallel Convolutional Neural Network (SPCNN) from Ahamed et al., Scientific Reports (2025), including all architectural details and hyperparameters that the paper makes available.

  • Explainability Methods
    Integrated wrappers for Grad-CAM, Grad-CAM++, SHAP (Deep and Gradient variants), and LIME to generate both visual heatmaps and quantitative feature attributions.
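As a concrete starting point, a minimal loader for the dataset's folder-per-class layout might look like the sketch below. The `Parasitized`/`Uninfected` folder names follow the common distribution of the NIH malaria dataset; the root path and target size are placeholders to adapt to your setup.

```python
from pathlib import Path

import numpy as np
from PIL import Image

# Class folders as shipped in the common distribution of the dataset.
CLASSES = ("Parasitized", "Uninfected")

def load_cell_images(root, size=(128, 128)):
    """Return (images, labels) as float32 arrays scaled to [0, 1]."""
    images, labels = [], []
    for label, cls in enumerate(CLASSES):
        for path in sorted(Path(root, cls).glob("*.png")):
            img = Image.open(path).convert("RGB").resize(size)
            images.append(np.asarray(img, dtype=np.float32) / 255.0)
            labels.append(label)
    return np.stack(images), np.asarray(labels, dtype=np.int64)
```

The actual pipeline in data/ applies the project's own transformations; this is only the skeleton of reading and normalizing the images.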
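For orientation on the "soft attention" idea at the core of SPCNN: soft spatial attention learns a per-pixel weight map that rescales a feature tensor. The block below is a generic PyTorch illustration of that mechanism only, not the exact module from Ahamed et al. (2025), whose internal design lives in models/.

```python
import torch
import torch.nn as nn

class SoftAttention2d(nn.Module):
    """Generic soft spatial attention (illustrative, not the paper's exact block):
    a 1x1 convolution scores each spatial position, a softmax over positions
    turns the scores into weights, and the input features are rescaled by them."""

    def __init__(self, channels):
        super().__init__()
        self.score = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, x):
        b, c, h, w = x.shape
        # Softmax over all spatial positions: weights sum to 1 per image.
        attn = torch.softmax(self.score(x).view(b, 1, -1), dim=-1).view(b, 1, h, w)
        # Multiply by h*w so the output keeps a magnitude comparable to the input.
        return x * attn * (h * w)
```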
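To illustrate what the Grad-CAM wrapper computes, the method can be written directly with PyTorch hooks: weight the target layer's activations by the spatial mean of the class-score gradients, then ReLU and normalize. This is a didactic sketch, independent of the wrappers in explainability/.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def grad_cam(model, layer, x, class_idx):
    """Minimal Grad-CAM for a batch of images x, targeting `layer` of `model`."""
    acts, grads = {}, {}
    h1 = layer.register_forward_hook(lambda m, i, o: acts.update(a=o))
    h2 = layer.register_full_backward_hook(lambda m, gi, go: grads.update(g=go[0]))
    try:
        score = model(x)[0, class_idx]   # scalar class score for the first image
        model.zero_grad()
        score.backward()
    finally:
        h1.remove(); h2.remove()
    weights = grads["g"].mean(dim=(2, 3), keepdim=True)  # GAP of the gradients
    cam = F.relu((weights * acts["a"]).sum(dim=1))       # weighted activation sum
    cam = F.interpolate(cam.unsqueeze(1), size=x.shape[2:],
                        mode="bilinear", align_corners=False).squeeze(1)
    return cam / (cam.amax(dim=(1, 2), keepdim=True) + 1e-8)  # scale to [0, 1]
```

Grad-CAM++ and the SHAP/LIME wrappers follow the same pattern of attributing the prediction back to input regions, each with its own weighting scheme.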

Demo Notebook

Try out our interactive demo to see all five XAI methods in action:

  1. Open notebooks/demo.ipynb in your browser (e.g. upload to Google Colab for zero-install execution).
  2. Run each cell to load the pretrained SPCNN, generate Grad-CAM, Grad-CAM++, SHAP-Gradient, SHAP-Deep, and LIME explanations, and compare them.

To run locally:

  • Clone this repo
  • Install the dependencies with
    conda env create -f environment.yaml
    conda activate xai4malaria-demo
  • Launch Jupyter and open notebooks/demo.ipynb

Ownership & Collaborators

This project is the result of a joint effort between:
