Jongmin Park1*
·
Minh-Quan Viet Bui1*
·
Juan Luis Gonzalez Bello1
·
Jaeho Moon1
·
Jihyong Oh2†
·
Munchurl Kim1†
1KAIST, South Korea, 2Chung-Ang University, South Korea
*Co-first authors (equal contribution), †Co-corresponding authors
Paper | Project Page | Code
- March 09, 2026: Released inference code and pre-trained models.
- Jan 03, 2026: Initial repository created.
The remaining components (training and dataset generation scripts) will be released soon.
- ✅ Inference code
- ✅ Pretrained models
- ⬛ Training scripts
- ⬛ Dataset generation scripts
Our code is developed using PyTorch 2.5.1, CUDA 12.4, and Python 3.11.
git clone https://github.com/KAIST-VICLab/EcoSplat.git
cd EcoSplat
conda create -y -n ecosplat python=3.11
conda activate ecosplat
bash setup.sh
Our pre-trained models are hosted on Hugging Face 🤗.
We assume the downloaded weights are located in the pretrained_weights directory.
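Assuming the weights are published as a Hugging Face model repository, they can be fetched with the Hugging Face CLI. Note that the repository id below is a placeholder, not confirmed by this README; use the one linked from the project page.

```shell
# Hypothetical example: download the pre-trained weights into the
# pretrained_weights directory expected by the commands below.
# Replace <HF_REPO_ID> with the actual model repo id.
huggingface-cli download <HF_REPO_ID> --local-dir pretrained_weights
```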
Please refer to DATASETS.md for dataset preparation.
To evaluate EcoSplat on RealEstate10K, run the following command. You can adjust model.encoder.primitive_ratio to control the trade-off between efficiency and rendering quality.
# RealEstate10K (enable test.align_pose=true if using evaluation-time pose alignment)
python -m src.main +experiment=ecosplat/re10k mode=test wandb.name=re10k \
dataset/view_sampler@dataset.re10k.view_sampler=evaluation \
dataset.re10k.view_sampler.index_path=assets/evaluation_index_re10k_small_16views.json \
checkpointing.load=./pretrained_weights/ecosplat-stage2-re10k.ckpt \
model.encoder.primitive_ratio=<PRIMITIVE_RATIO> \
test.save_image=true test.align_pose=true \
test.output_path=<YOUR_OUTPUT_PATH>
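To build intuition for what a primitive ratio does, here is a minimal, self-contained sketch, not the actual EcoSplat implementation: given per-primitive importance scores, keep only the top fraction of Gaussian primitives. The function and variable names (select_primitives, scores) are hypothetical and for illustration only.

```python
def select_primitives(scores, primitive_ratio):
    """Keep the indices of the top `primitive_ratio` fraction of
    primitives, ranked by their importance score (highest first)."""
    if not 0.0 < primitive_ratio <= 1.0:
        raise ValueError("primitive_ratio must be in (0, 1]")
    # Number of primitives to retain (at least one).
    k = max(1, round(len(scores) * primitive_ratio))
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return ranked[:k]

# Example: keeping 50% of 6 primitives retains the 3 highest-scoring ones.
kept = select_primitives([0.9, 0.1, 0.7, 0.3, 0.8, 0.2], 0.5)
print(kept)  # -> [0, 4, 2]
```

A lower ratio means fewer Gaussians to store and render, which is the efficiency knob the evaluation command exposes.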
This project is built upon these excellent repositories: SPFSplat, NoPoSplat, pixelSplat, DUSt3R, and CroCo. We thank the original authors for their contributions.
@article{park2025ecosplat,
title={EcoSplat: Efficiency-controllable Feed-forward 3D Gaussian Splatting from Multi-view Images},
author={Park, Jongmin and Bui, Minh-Quan Viet and Bello, Juan Luis Gonzalez and Moon, Jaeho and Oh, Jihyong and Kim, Munchurl},
journal={arXiv preprint arXiv:2512.18692},
year={2025}
}