Code for the paper "Franke et al.: Revisiting Neural Activation Coverage for Uncertainty Estimation", accepted for a poster session @ ESANN 2026.
Contains a minimal PyTorch reimplementation of https://github.com/BierOne/ood_coverage, extended with a novel formulation for regression problems. Only the uncertainty estimation function is re-implemented.
We recommend using UV for dependency management:

- Run `uv sync` to create the `.venv`.
- Then run scripts with `uv run <file>`, or activate the `.venv` via `source .venv/bin/activate` and run `python <file>` as usual.
- If not using UV, install Python 3.13 and create the `.venv` from the `pyproject.toml` in the repository root.

To use only the NAC wrapper in your own project, we recommend running `uv add git+https://github.com/DLR-KI/nac-uncertainty-regression`. Alternatively, you can simply copy `nac_uncertainty_regression/nac.py` into your project.
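For reference, the setup steps above can be sketched as a shell session (a non-authoritative sketch: it assumes UV is already installed and that the repository URL from the install command also serves as the clone URL):

```shell
# clone the repository and set up the environment (assumes uv is installed)
git clone https://github.com/DLR-KI/nac-uncertainty-regression
cd nac-uncertainty-regression
uv sync                     # creates the .venv from pyproject.toml
uv run python <file>        # run a script inside the .venv
# ...or activate the .venv manually and use python as usual:
source .venv/bin/activate
python <file>
```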
```python
import torch
from nac_uncertainty_regression import NACWrapper
from torchvision.models import resnet18, ResNet18_Weights
from torchvision.datasets import Imagenette, MNIST
from torchvision.transforms.v2 import Compose, RGB

# init pretrained model + wrap it
model = resnet18(weights=ResNet18_Weights.IMAGENET1K_V1)
model = NACWrapper(
    model,
    layer_name_list=[
        "layer1.1.bn2",  # use dot notation to access nested layers
        "layer2.1.bn2",
        "layer3.1.bn2",
        "layer4.1.bn2",
        "fc",
    ],
)

transform = ResNet18_Weights.IMAGENET1K_V1.transforms()

# define your data somehow - insert your own data here, just make sure
# the ID data is actually part of your model's training distribution
id_data_loader_fit = torch.utils.data.DataLoader(
    torch.utils.data.Subset(
        Imagenette(root="data/imagenette", download=True, transform=transform),
        indices=range(1024),
    ),
    batch_size=32,
)
id_data_loader_eval = torch.utils.data.DataLoader(
    Imagenette(root="data/imagenette", download=True, transform=transform, split="val"),
    batch_size=32,
)
ood_data_loader = torch.utils.data.DataLoader(
    torch.utils.data.Subset(
        MNIST(root="data/mnist", download=True, transform=Compose([RGB(), transform])),
        indices=range(32),
    ),
    batch_size=32,
)

# IMPORTANT: Do NOT use 'with torch.no_grad()', NACWrapper needs gradients internally!
# It will check whether gradients are enabled and raise an error if not.

# do some forward passes with ID data to init the wrapper distribution
model.train()
for x, y in id_data_loader_fit:
    _ = model(x)  # only the call to forward is important

# set the model to eval to get uncertainty scores
model.eval()

# first get ID uncertainty scores; the model outputs are stored under the
# key "out", the uncertainty scores under the key "uncertainty"
for x, y in id_data_loader_eval:
    uncertainties = model(x)["uncertainty"]
    mean_uncertainty_id = uncertainties.mean()
    std_uncertainty_id = uncertainties.std()
    break

# now get OoD uncertainty scores
for x, y in ood_data_loader:
    uncertainties = model(x)["uncertainty"]
    mean_uncertainty_ood = uncertainties.mean()
    std_uncertainty_ood = uncertainties.std()
    break

print(f"Mean ID Uncertainty Score: {mean_uncertainty_id}+-{std_uncertainty_id}")
print(f"Mean OoD Uncertainty Score: {mean_uncertainty_ood}+-{std_uncertainty_ood}")
```

Run experiment_ood.bash to reproduce Figure 1 and experiment_mse.bash to reproduce Figure 2. Generate the figures with viz.ipynb.
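The gap between ID and OoD uncertainty scores in the example above can be turned into a simple detector by thresholding at a quantile of the ID scores. Below is a minimal, framework-agnostic sketch; the function names and the 0.95 quantile are illustrative choices, not part of the package (convert torch tensors with `.tolist()` before passing them in):

```python
def fit_threshold(id_scores, quantile=0.95):
    """Pick a threshold such that roughly `quantile` of the ID scores fall below it."""
    ordered = sorted(id_scores)
    idx = min(int(quantile * len(ordered)), len(ordered) - 1)
    return ordered[idx]

def is_ood(scores, threshold):
    """Flag samples whose uncertainty score exceeds the ID threshold."""
    return [s > threshold for s in scores]

# toy scores standing in for the wrapper's "uncertainty" outputs
id_scores = [0.10, 0.15, 0.20, 0.25, 0.30]
ood_scores = [0.80, 0.90, 0.95]
threshold = fit_threshold(id_scores, quantile=0.95)
print(is_ood(ood_scores, threshold))  # [True, True, True]
```

In practice the quantile trades off false positives on ID data against missed OoD samples, so it should be tuned on a held-out ID split.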
- `nac.py` -> the entire implementation alongside documentation
- `nac_test.py` -> unit tests
If you find the code or results useful, please cite the paper:
```bibtex
@inproceedings{franke2026revisiting,
  title={Revisiting Neural Activation Coverage for Uncertainty Estimation},
  author={Franke, Benedikt and Förster, Nils and Köster, Frank and Fischer, Asja and Lange, Markus and Raulf, Arne Peter},
  booktitle={34th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, ESANN 2026 (Scopus; ISSN:)},
  year={2026}
}
```