The official PyTorch implementation of
Depth Anything in 360°: Towards Scale Invariance in the Wild
Authors: Hualie Jiang, Ziyang Song, Zhiqiang Lou, Rui Xu, Minglang Tan
Create the environment
conda env create -f environment.yaml
conda activate da360
pip install -r requirements.txt
bash scripts/download_models.sh
Alternatively, the pretrained models are available on Google Drive and can be downloaded manually.
Put the panoramic images in ./data/images first, then run the following command. Six example images are already included.
python test.py --model_path ./checkpoints/DA360_large.pth --model_name DA360_large
The results will be saved in ./checkpoints/DA360_large/results/.
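The exact output format of the test script is not documented here. Assuming the predicted depth maps can be loaded as floating-point arrays, a quick way to turn one into an 8-bit grayscale image for inspection might look like the sketch below (the normalization scheme is an assumption, not the repository's own visualization code):

```python
import numpy as np

def depth_to_uint8(depth: np.ndarray) -> np.ndarray:
    """Normalize a depth map to [0, 255] for quick visualization.

    Invalid (non-finite or non-positive) pixels are set to 0.
    This is a generic helper, not the repo's own rendering code.
    """
    valid = np.isfinite(depth) & (depth > 0)
    out = np.zeros(depth.shape, dtype=np.uint8)
    if valid.any():
        d = depth[valid]
        lo, hi = d.min(), d.max()
        scale = (hi - lo) or 1.0  # guard against a constant depth map
        out[valid] = ((depth[valid] - lo) / scale * 255).astype(np.uint8)
    return out

# Example with a synthetic equirectangular-sized depth map:
depth = np.random.uniform(0.5, 10.0, size=(512, 1024))
vis = depth_to_uint8(depth)
```

The resulting array can be written out with any image library (e.g. PIL's Image.fromarray) for a side-by-side look at the predictions.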
bash scripts/evaluate.sh
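The internals of the evaluation script are not shown here. Since the paper targets scale invariance, monocular-depth benchmarks commonly align each prediction to the ground truth with a per-image median ratio before computing error metrics; the sketch below illustrates that common protocol (it is not necessarily what scripts/evaluate.sh does):

```python
import numpy as np

def scale_aligned_metrics(pred: np.ndarray, gt: np.ndarray) -> dict:
    """Median-scale a predicted depth map to the ground truth, then
    compute standard monocular-depth error metrics on valid pixels.

    A generic illustration of median-scaling evaluation, not the
    repository's actual evaluation code.
    """
    valid = np.isfinite(gt) & (gt > 0) & np.isfinite(pred) & (pred > 0)
    pred, gt = pred[valid], gt[valid]
    pred = pred * (np.median(gt) / np.median(pred))  # per-image scale alignment
    abs_rel = np.mean(np.abs(pred - gt) / gt)
    rmse = np.sqrt(np.mean((pred - gt) ** 2))
    delta1 = np.mean(np.maximum(pred / gt, gt / pred) < 1.25)
    return {"abs_rel": abs_rel, "rmse": rmse, "delta1": delta1}

# A prediction that differs from ground truth only by a global scale
# should score (near-)perfectly after alignment:
gt = np.random.uniform(1.0, 10.0, size=(256, 512))
metrics = scale_aligned_metrics(2.0 * gt, gt)
```

Median alignment removes any global scale factor, so a scale-invariant model is judged purely on the shape of its predicted depth.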
The project is partially based on Depth Anything V2, PanDA and UniFuse.
Please cite our paper if you find our work useful in your research.
@article{jiang2025depth,
title={Depth Anything in 360$^\circ$: Towards Scale Invariance in the Wild},
author={Jiang, Hualie and Song, Ziyang and Lou, Zhiqiang and Xu, Rui and Tan, Minglang},
journal={arXiv preprint arXiv:2512.22819},
year={2025}
}
