GeoNeXt: Efficient landslide mapping using a pre-trained ConvNeXt V2 encoder with a PSA-ASPP decoder
Rodrigo Uribe-Ventura, Willem Viveen, Ferdinand Pineda-Ancco, César Beltrán-Castañon
GeoNeXt is a novel deep learning architecture for landslide detection and mapping that combines a pre-trained ConvNeXt V2 encoder with a decoder integrating Pyramid Squeeze Attention (PSA) and Atrous Spatial Pyramid Pooling (ASPP). The model achieves state-of-the-art performance with 10× fewer parameters than transformer-based approaches while maintaining superior accuracy across diverse geographical regions.
- High Performance: F1 scores of 94.25%, 86.43%, and 92.27% on Bijie, Landslide4Sense, and GVLM datasets
- Computational Efficiency: 32.2M parameters vs 312.5M for SAM-based methods
- Zero-shot Transferability: Strong cross-domain performance without retraining
- Multi-scale Detection: Handles landslides from small flows to large-scale events
GeoNeXt follows a U-Net-based encoder-decoder structure:
- Encoder: Pre-trained ConvNeXt V2-Tiny with domain adaptation
- Decoder: Novel PSA-ASPP design for multi-scale feature aggregation
- Loss Function: Composite loss combining BCE, uncertainty-weighted Dice, and focal-Tversky
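The composite loss above can be illustrated with a minimal NumPy sketch. This is an illustration of the three terms only, not the repository's `utils/losses.py` implementation: the uncertainty weighting of the Dice term is simplified to fixed weights, and the Tversky parameters (`alpha`, `beta`, `gamma`) are common defaults from the literature, not values confirmed by the paper.

```python
import numpy as np

def bce_loss(p, t, eps=1e-7):
    # Binary cross-entropy over predicted probabilities p and targets t
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(t * np.log(p) + (1 - t) * np.log(1 - p))

def dice_loss(p, t, eps=1e-7):
    # Soft Dice: 1 - 2|P∩T| / (|P| + |T|)
    inter = np.sum(p * t)
    return 1 - (2 * inter + eps) / (np.sum(p) + np.sum(t) + eps)

def focal_tversky_loss(p, t, alpha=0.7, beta=0.3, gamma=0.75, eps=1e-7):
    # Tversky index generalizes Dice with asymmetric FN/FP weights,
    # then a focal exponent emphasizes hard examples
    tp = np.sum(p * t)
    fn = np.sum((1 - p) * t)
    fp = np.sum(p * (1 - t))
    tversky = (tp + eps) / (tp + alpha * fn + beta * fp + eps)
    return (1 - tversky) ** gamma

def composite_loss(p, t, weights=(1.0, 1.0, 1.0)):
    # Fixed weights here stand in for the model's uncertainty weighting
    return (weights[0] * bce_loss(p, t)
            + weights[1] * dice_loss(p, t)
            + weights[2] * focal_tversky_loss(p, t))
```

A near-perfect prediction drives all three terms toward zero, while the Tversky term's `alpha > beta` penalizes missed landslide pixels (false negatives) more than false alarms.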
```bash
# Clone the repository
git clone https://github.com/ruribev/GeoNeXt.git
cd GeoNeXt

# Install dependencies
pip install -r requirements.txt
```

Download the following required folders from Google Drive:
- Pre-trained Weights: Download pretrained folder
- Trained Models: Download experiments folder
- Datasets: Download dataset folder
Extract these folders in the root directory of the project to maintain the proper structure.
```python
from utils.experiments import TrainingExperiment, run_experiment

# Configure training experiment
config = {
    'dataset': 'BJL',  # or 'L4S', 'GVLM', 'CAS'
    'model_variant': 'tiny',
    'gpu_id': 0,
    'batch_size': 8,
    'epochs': 60,
    'lr': 3.5e-3,
    'convnextv2_pretrained_weights': 'pretrained/convnextv2_tiny_22k_224_ema.pt',
    'name_suffix': 'your_experiment_name'
}

# Run training
experiment = TrainingExperiment(config)
run_experiment(experiment)
```

```python
from utils.benchmarks import Benchmark

# Initialize benchmark
benchmark = Benchmark(
    dataset='BJL',
    gpu_id=0,
    model_variant='tiny',
    model_weight_path="experiments/GeoNeXt_BJL/best_model.pth.tar",
    batch_size=2
)

# Run evaluation
results = benchmark.run_benchmark()
benchmark.show_metrics()
benchmark.show_examples(num_examples=5, min_mask_percentage=0.1)
```

We provide ready-to-use Jupyter notebooks:
- train.ipynb: Complete training pipeline example
- benchmark.ipynb: Evaluation of the main datasets
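The F1 scores reported for each dataset are standard pixel-wise F1. As a minimal illustration (this is not the repository's `utils/evaluator.py` implementation, just the textbook definition):

```python
import numpy as np

def f1_score(pred, target, eps=1e-7):
    # pred and target are binary (0/1) segmentation masks
    tp = np.sum((pred == 1) & (target == 1))  # true positives
    fp = np.sum((pred == 1) & (target == 0))  # false positives
    fn = np.sum((pred == 0) & (target == 1))  # false negatives
    precision = tp / (tp + fp + eps)
    recall = tp / (tp + fn + eps)
    return 2 * precision * recall / (precision + recall + eps)
```

F1 is the harmonic mean of precision and recall, so it penalizes both missed landslide pixels and false detections, which makes it a common headline metric for imbalanced segmentation tasks like landslide mapping.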
- Download the complete dataset folder from this link
- Extract and place in the project root directory
- The structure should be:

```
dataset/
├── BJL/
├── L4S/
├── GVLM/
└── CAS/
```
```
GeoNeXt/
├── models/
│   ├── __init__.py
│   ├── GeoNeXt.py         # Main model architecture
│   └── ConvNeXtV2.py      # ConvNeXt V2 backbone
├── utils/
│   ├── __init__.py
│   ├── train.py           # Training utilities
│   ├── evaluator.py       # Evaluation metrics
│   ├── benchmarks.py      # Benchmarking tools
│   ├── experiments.py     # Experiment management
│   ├── augmentations.py   # Data augmentation
│   ├── losses.py          # Loss functions
│   ├── ema.py             # Exponential moving average
│   └── dataset.py         # Dataset utilities
├── dataset/               # Dataset directory (download from link above)
├── pretrained/            # Pre-trained weights (download from link above)
├── experiments/           # Training outputs (download from link above)
├── train.ipynb            # Training notebook
├── benchmark.ipynb        # Evaluation notebook
├── requirements.txt       # Dependencies
├── LICENSE                # MIT License
└── README.md              # This file
```
Pre-trained models and weights are available in the downloads section:

- ConvNeXt V2 Tiny Backbone: ImageNet-22K pre-trained weights (32.2M parameters)
  - File: `convnextv2_tiny_22k_224_ema.pt`
  - Location: pretrained folder
- GeoNeXt Complete Models: Fine-tuned models for each dataset
  - BJL Dataset: `GeoNeXt_BJL/best_model.pth.tar` (F1: 94.25%)
  - L4S Dataset: `GeoNeXt_L4S/best_model.pth.tar` (F1: 86.43%)
  - GVLM Dataset: `GeoNeXt_GVLM/best_model.pth.tar` (F1: 92.27%)
  - CAS Domain-adapted: `GeoNeXt_CAS/encoder_weights.pth`
  - Location: experiments folder

Download the pretrained folder and experiments folder to access all pre-trained models and weights.
This project is licensed under the MIT License - see the LICENSE file for details.
If you use GeoNeXt in your research, please cite our work:
```bibtex
@article{uribe2025geonext,
  title={GeoNeXt: Efficient landslide mapping using a pre-trained ConvNeXt V2 encoder with a PSA-ASPP decoder},
  author={Uribe-Ventura, Rodrigo and Viveen, Willem and Pineda-Ancco, Ferdinand and Beltr{\'a}n-Casta{\~n}on, C{\'e}sar},
  journal={Artificial Intelligence in Geosciences},
  year={2025},
  publisher={Elsevier},
  doi={10.1016/j.aiig.2025.100172}
}
```
Paper currently under review. BibTeX entry will be updated upon publication.
- Corresponding Author: Rodrigo Uribe-Ventura (a20234215@pucp.edu.pe)
- Institution: Pontificia Universidad Católica del Perú
- GitHub: @ruribev
For detailed methodology and experimental results, please refer to our paper: "GeoNeXt: Efficient landslide mapping using a pre-trained ConvNeXt V2 encoder with a PSA-ASPP decoder"