Hongruixuan Chen1,†, Jian Song1,†, Weihao Xuan2,1,†, Junjue Wang2,†, Heli Qi1, Zeqi Zhou3, Pengyu Dai1,2
Olivier Dietrich4, Erika Gutierrez5, Lars Bromly5, Edoardo Nemni6, Yafei Ou1, Jie Zhao7, Zhuo Zheng8, Yonghao Xu9
Ronny Hänsch10, Wenzhe Jiao11, Marco Chini12, Claudio Persello13, Junshi Xia1, Shijian Lu14, Lixin Wang15, Zhe Zhu16
Evan Shelhamer17, Jocelyn Chanussot18, Konrad Schindler4, Naoto Yokoya2,1
†Equal contribution
1 RIKEN AIP, 2 The University of Tokyo, 3 Brown University, 4 ETH Zurich, 5 United Nations Satellite Centre
6 Barcelona School of Economics, 7 Technical University of Munich, 8 Stanford University, 9 Linköping University
10 German Aerospace Center (DLR), 11 Texas A&M University, 12 Luxembourg Institute of Science and Technology
13 University of Twente, 14 Nanyang Technological University, 15 Indiana University Indianapolis
16 The University of British Columbia, 17 University of Connecticut, 18 Université Grenoble Alpes
Paper | Installation | Dataset Preparation | Pretrained Weights | Quick Start | Repo Layout
Any Disaster Mapping is the official repository for our review paper on EO-based disaster mapping.
One of our key motivations is that current disaster mapping research is highly fragmented: benchmarks, tasks, and model implementations are often inconsistent across papers, making fair evaluation, reproduction, and further development unnecessarily difficult.
This repo unifies widely used disaster mapping benchmarks and representative deep learning models across major research directions, and provides a consistent training and evaluation pipeline for:
- Infrastructure damage
- Flood mapping
- Landslide segmentation
- Wildfire analysis
It is designed to help researchers:
- Reproduce the results reported in our paper
- Evaluate models under a unified protocol
- Use strong baselines out of the box
- Build and test their own improvements with minimal engineering overhead
## Installation

Base environment and optional model-specific extras.
```bash
# NOTE: --index-url should match the version of your local CUDA toolkit,
# which is needed for compiling the ChangeMamba kernels (cu126 is just an example)
pip install torch torchvision xformers --index-url https://download.pytorch.org/whl/cu126
pip install -e .
```

Some models require optional extras:
- ChangeMamba selective scan kernel:
```bash
# run `conda install -c conda-forge gcc=13 gxx=13 -y` if you hit GCC issues
cd src/models/ChangeMamba/kernels/selective_scan
pip install . --no-build-isolation
```
- Local pretrained checkpoints under `pretrained_weight/` for model families such as SegFormer, HRNet, SAM/SAM2, DINOv3, HyperSIGMA, SkySense, SpectralGPT, and ChangeMamba. See Pretrained Weights below.
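The `--index-url` wheel tag above has to match your local CUDA toolkit. The snippet below is a throwaway sketch (not part of the repo) that derives the tag from `nvcc`, defaulting to `cu126` when `nvcc` is not on the PATH:

```shell
# Hypothetical helper: derive the PyTorch wheel index from the local CUDA
# toolkit version (e.g. "release 12.6" -> cu126); falls back to cu126.
cuda="$(nvcc --version 2>/dev/null | sed -n 's/.*release \([0-9]*\)\.\([0-9]*\).*/\1\2/p')"
echo "https://download.pytorch.org/whl/cu${cuda:-126}"
```

If the printed index differs from the one you installed against, reinstall torch/torchvision/xformers with the printed URL before building the ChangeMamba kernels.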
## Dataset Preparation

Dataset preparation guides are organized by disaster domain:
- Infrastructure damage: `scripts/data_prep/infra_damage/README.md`
- Flood: `scripts/data_prep/flood/README.md`
- Landslide: `scripts/data_prep/landslide/README.md`
- Wildfire: `scripts/data_prep/wildfire/README.md`
## Pretrained Weights

Recommended local checkpoint layout for the supported model zoo.
Create the local checkpoint directory first:
```bash
mkdir -p pretrained_weight

# pretrain-vit-base-e199.pth
wget -O pretrained_weight/pretrain-vit-base-e199.pth \
    https://zenodo.org/records/7338613/files/pretrain-vit-base-e199.pth

# SpectralGPT+.pth
wget -O "pretrained_weight/SpectralGPT+.pth" \
    "https://zenodo.org/records/8412455/files/SpectralGPT+.pth?download=1"

# spec-vit-base-ultra-checkpoint-1599.pth
wget -O pretrained_weight/spec-vit-base-ultra-checkpoint-1599.pth \
    https://huggingface.co/WHU-Sigma/HyperSIGMA/resolve/main/spec-vit-base-ultra-checkpoint-1599.pth

# vssm_tiny_0230_ckpt_epoch_262.pth
huggingface-cli download UTokyo-Yokoya-Lab/AnyDisaster-Pretrained_Weight \
    vssm_tiny_0230_ckpt_epoch_262.pth --local-dir pretrained_weight --local-dir-use-symlinks False
```
- DINOv3
  Source: https://ai.meta.com/resources/models-and-libraries/dinov3-downloads/
  Download and save as:
  - `dinov3_vitb16_pretrain_lvd1689m-73cec8be.pth` (ViT-B/16, LVD-1689M)
  - `dinov3_vitl16_pretrain_lvd1689m-8aa4cbdd.pth` (ViT-L/16, LVD-1689M)
  - `dinov3_vitl16_pretrain_sat493m-eadcf0ff.pth` (ViT-L/16, SAT-493M)
- SAM v1
  Source: https://github.com/facebookresearch/segment-anything
  Download and save as:
  - `sam_vit_b_01ec64.pth` (SAM ViT-B)
  - `sam_vit_l_0b3195.pth` (SAM ViT-L)
- SAM 2.1
  Source: https://github.com/facebookresearch/sam2
  Download and save as:
  - `sam2.1_hiera_small.pt` (SAM 2.1 Hiera-Small)
  - `sam2.1_hiera_base_plus.pt` (SAM 2.1 Hiera-Base+)
- SegFormer MiT encoders
  Source: https://github.com/NVlabs/SegFormer
  Download and save as:
  - `mit_b0.pth`, `mit_b1.pth`, `mit_b2.pth`, `mit_b3.pth`, `mit_b4.pth`, `mit_b5.pth`
- HyperSIGMA spatial backbone
  Source: https://huggingface.co/WHU-Sigma/HyperSIGMA
  Download the upstream file, rename it, and save as:
  - `HSI_spatial_checkpoint-1600.pth`
- SkySense backbone
  Sources:
  - https://github.com/Jack-bo1220/SkySense
  - https://www.notion.so/SkySense-Checkpoints-a7fcff6ce29a4647a08c7fe416910509

  Select the `hr` (high-resolution RGB / RGBNIR) variant, not the `s2` Sentinel-2 variant. Save as:
  - `skysense_model_backbone_hr.pth`

  For commercial use, contact the authors (yansheng.li@whu.edu.cn).
After completing all downloads, `pretrained_weight/` should contain:

```
pretrained_weight/
├── dinov3_vitb16_pretrain_lvd1689m-73cec8be.pth
├── dinov3_vitl16_pretrain_lvd1689m-8aa4cbdd.pth
├── dinov3_vitl16_pretrain_sat493m-eadcf0ff.pth
├── HSI_spatial_checkpoint-1600.pth
├── mit_b0.pth
├── mit_b1.pth
├── mit_b2.pth
├── mit_b3.pth
├── mit_b4.pth
├── mit_b5.pth
├── pretrain-vit-base-e199.pth
├── sam2.1_hiera_base_plus.pt
├── sam2.1_hiera_small.pt
├── sam_vit_b_01ec64.pth
├── sam_vit_l_0b3195.pth
├── skysense_model_backbone_hr.pth
├── spec-vit-base-ultra-checkpoint-1599.pth
├── SpectralGPT+.pth
└── vssm_tiny_0230_ckpt_epoch_262.pth
```
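With this many files, a quick completeness check helps catch typos in renamed checkpoints. The loop below is a sketch (not part of the repo); the file list mirrors the tree above:

```shell
# Sketch: report which expected checkpoints are still missing from
# pretrained_weight/. The names mirror the directory tree above.
expected="dinov3_vitb16_pretrain_lvd1689m-73cec8be.pth
dinov3_vitl16_pretrain_lvd1689m-8aa4cbdd.pth
dinov3_vitl16_pretrain_sat493m-eadcf0ff.pth
HSI_spatial_checkpoint-1600.pth
mit_b0.pth
mit_b1.pth
mit_b2.pth
mit_b3.pth
mit_b4.pth
mit_b5.pth
pretrain-vit-base-e199.pth
sam2.1_hiera_base_plus.pt
sam2.1_hiera_small.pt
sam_vit_b_01ec64.pth
sam_vit_l_0b3195.pth
skysense_model_backbone_hr.pth
spec-vit-base-ultra-checkpoint-1599.pth
SpectralGPT+.pth
vssm_tiny_0230_ckpt_epoch_262.pth"
missing=0
for f in $expected; do
  [ -f "pretrained_weight/$f" ] || { echo "missing: $f"; missing=$((missing + 1)); }
done
echo "$missing file(s) missing"
```

Run it from the repository root; only the model families you actually plan to train need to be present.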
## Quick Start

Minimal training and evaluation commands.
Train with a YAML config:

```bash
python train.py --config configs/infra/xbd/unet.yaml
```

Evaluate an experiment directory:

```bash
python test.py --exp_path results/xbd/unet
```

## Repo Layout

Main directories and what they are responsible for:
- `src/core/`: trainer, config loader, registry, augmentation, metrics
- `src/tasks/`: task handlers for segmentation, change detection, and semantic change detection
- `src/datasets/`: dataset adapters and runtime data contracts
- `src/models/`: model wrappers and vendored third-party implementations
- `configs/`: experiment configs grouped by domain and dataset
- `scripts/data_prep/`: dataset preparation guides and helper scripts
- `docs/`: architecture notes and extension guidance
Internal runtime design and extension entry points.
- Architecture overview: `docs/ARCHITECTURE.md`
- Extension guide: `docs/EXTENSION_GUIDE.md`
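Since all experiments share the same `train.py`/`test.py` entry points, several runs can be queued in one loop. A sketch: only `configs/infra/xbd/unet.yaml` is a config this README names, the flood config path is an illustrative assumption:

```shell
# Sketch: queue several experiments back to back. Drop the `echo` to
# actually launch training; the second config path is hypothetical.
for cfg in configs/infra/xbd/unet.yaml configs/flood/sen1floods11/unet.yaml; do
  echo "python train.py --config $cfg"
done
```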
If this repo contributes to your research, please consider citing our paper and giving it a ⭐️ :)
For questions, feel free to open an issue or email qschrx@gmail.com.