Depth Anything in 360°: Towards Scale Invariance in the Wild

The official PyTorch implementation of

Depth Anything in 360°: Towards Scale Invariance in the Wild

Authors: Hualie Jiang, Ziyang Song, Zhiqiang Lou, Rui Xu, Minglang Tan

Preparation

Installation

Create the environment

conda env create -f environment.yaml
conda activate da360
pip install -r requirements.txt
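To confirm the environment was created correctly, a quick check such as the following can help (a minimal sketch assuming a standard PyTorch install, which the project's dependencies imply; it is not part of the repository):

import torch

# Print the installed PyTorch version and whether a CUDA device is visible.
print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())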

Evaluation

Download the pre-trained models

bash scripts/download_models.sh

The pretrained models are also available on Google Drive and can be downloaded manually.
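If you download the checkpoints manually, place them so the paths match what the test command below expects. The layout sketched here is inferred from the --model_path argument and should be verified against scripts/download_models.sh:

checkpoints/
└── DA360_large.pth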

Test On Panoramic Images

Put the panoramic images in ./data/images first, then run the following command. Six example images are already provided.

python test.py --model_path ./checkpoints/DA360_large.pth --model_name DA360_large

The results will be saved in ./checkpoints/DA360_large/results/.
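To inspect a prediction programmatically, something like the following can be used (a minimal sketch; it assumes the results are written as image files readable by Pillow, e.g. PNG depth visualizations — check the actual output format produced by test.py):

import glob
import numpy as np
from PIL import Image

# List whatever test.py wrote out (file names and extensions are assumptions).
result_files = sorted(glob.glob("./checkpoints/DA360_large/results/*"))
print(result_files)

# Open the first result and report its shape and value range.
depth = np.array(Image.open(result_files[0]))
print(depth.shape, depth.dtype, depth.min(), depth.max())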

Perform Evaluation

bash scripts/evaluate.sh

Acknowledgements

The project is partially based on Depth Anything V2, PanDA and UniFuse.

Citation

Please cite our paper if you find our work useful in your research.

@article{jiang2025depth,
  title={Depth Anything in 360$^\circ$: Towards Scale Invariance in the Wild},
  author={Jiang, Hualie and Song, Ziyang and Lou, Zhiqiang and Xu, Rui and Tan, Minglang},
  journal={arXiv preprint arXiv:2512.22819},
  year={2025}
}
