Code for the paper "Correcting misinterpretations of additive models", accepted to NeurIPS 2025: https://openreview.net/forum?id=2ClM0g9OFT.
The `./nam/` subfolder is adapted from lemeln/nam.
Set up the environment using the provided `environment.yml`:

```shell
# Create the environment
conda env create -f environment.yml
# Activate the environment
conda activate <env_name>
```

To generate XAI-TRIS data and save it to `./artifacts/tetris/data/neurips`, run:

```shell
python -m data.generate_data data_config 8x8 neurips
```

- This uses the scenarios from `./data/data_config.json`.
- `generate_linesearch.py` creates the datasets for the SNR-tuning line search, which is then run in `linesearch.py`; `visualise_linesearch.ipynb` generates the training result plots.
- `xai_tris_train_models.py` trains the final model parameterizations. Ensure the data folder path at the top of each script matches your setup (e.g., `./artifacts/tetris/data/...`).
- `xai_tris_shape_fns.ipynb` shows the shape functions and interaction plots for XAI-TRIS.
- `xai_tris_explanations.py` generates the global explanations used by the notebooks:
  - `xai_tris_qualitative.ipynb`
  - `xai_tris_quantitative_metrics.ipynb`
- These notebooks generate the plots shown in the paper.
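The XAI-TRIS steps above can be sketched as one shell sequence. Only the `python -m data.generate_data` call is confirmed by this README; invoking the other scripts as plain `python <script>.py` is an assumption and may differ from the repository's actual entry points:

```shell
# Assumed invocation sketch for the XAI-TRIS pipeline (run from the repo root).
python -m data.generate_data data_config 8x8 neurips  # uses ./data/data_config.json
python generate_linesearch.py    # datasets for the SNR-tuning line search
python linesearch.py             # run the line search
python xai_tris_train_models.py  # train the final model parameterizations
python xai_tris_explanations.py  # global explanations for the notebooks
```

Remember to update the data folder path at the top of each script before running the sequence.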
`fni_emd.ipynb` demonstrates the explanation styles shown in the appendix and their metric scores.
- `recid.data` is used for the experiments.
- `compas_nam.ipynb` trains the NAM and generates the shape functions and PatternGAM results.
- `compas_correlation.py` creates the correlation plots for the appendix.
Original data and related articles: propublica/compas-analysis.
- `mimic_preprocessing.py` processes the raw MIMIC-IV data for the 24-hour mortality task (tested with v2.0).
- `mimic_nam.py` runs the experiments shown in the main text. Reduce the number of epochs or learners for quicker results.
Data source: MIMIC-IV v2.0 (requires training, data use agreement, and access request).
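A minimal run of the MIMIC-IV experiments might look like the following; the plain `python <script>.py` invocations are an assumption, and the location of your raw MIMIC-IV download is site-specific:

```shell
# Assumed invocation sketch: preprocess raw MIMIC-IV v2.0, then run the NAM experiments.
python mimic_preprocessing.py  # builds the 24-hour mortality task data
python mimic_nam.py            # runs the experiments from the main text
```

For a quick smoke test, lower the epoch count or number of learners inside `mimic_nam.py` as noted above.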