# MUTE-SLAM

This repository is the official implementation of **MUTE-SLAM: Real-Time Neural SLAM with Multiple Tri-plane Hash Representations**.
## Installation

First, create the environment and install the necessary dependencies. The easiest way is with Anaconda:

```bash
# Create the environment
conda env create -f environment.yaml
conda activate mute_slam
```

We use the encoding from torch-ngp by default; tiny-cuda-nn can be used as an alternative. To install tiny-cuda-nn, run:
```bash
pip install git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch
```

Alternatively, build the bindings from a local checkout of tiny-cuda-nn:

```bash
cd tiny-cuda-nn/bindings/torch
python setup.py install
```

## Datasets

### Replica

Download the data from the Replica dataset as below:
```bash
bash scripts/download_replica.sh
```

### ScanNet

Follow the instructions on ScanNet to download the data, then extract color/depth frames from the `.sens` files using this code.
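ScanNet depth frames are extracted as 16-bit images storing depth in millimeters, with a value of 0 marking invalid pixels. A minimal sketch of converting a raw frame to metric depth, assuming the frame has already been loaded into a `uint16` array (the image loader itself is omitted and the function name is illustrative):

```python
import numpy as np

def scannet_depth_to_meters(depth_raw: np.ndarray) -> np.ndarray:
    """Convert a raw ScanNet depth frame (uint16, millimeters) to float32 meters.

    Invalid pixels (value 0) become NaN so they are easy to mask out later.
    """
    depth = depth_raw.astype(np.float32) / 1000.0  # mm -> m (depth shift of 1000)
    depth[depth_raw == 0] = np.nan                 # 0 marks missing depth
    return depth
```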
### TUM RGB-D

Download the data from TUM RGB-D as below:

```bash
bash scripts/download_tum.sh
```

## Run

You can run MUTE-SLAM with:
```bash
python run.py configs/{Dataset}/{scene}.yaml
```

## Evaluation

To evaluate the tracking result, run:
```bash
python src/tools/eval_ate.py configs/{Dataset}/{scene}.yaml
```

To evaluate the reconstruction results, download the ground-truth Replica meshes first:
```bash
bash scripts/download_replica_mesh.sh
```

Then cull the unseen and occluded regions from the ground-truth meshes:
```bash
GT_MESH=cull_replica_mesh/{scene}.ply
python src/tools/cull_mesh.py configs/Replica/{scene}.yaml --input_mesh $GT_MESH
```

Finally, run the command below to evaluate the reconstructed mesh:
```bash
OUTPUT_FOLDER=output/Replica/{scene}
GT_MESH=cull_replica_mesh/{scene}_culled.ply
python src/tools/eval_recon.py --rec_mesh $OUTPUT_FOLDER/mesh/final_mesh_eval_rec_culled.ply --gt_mesh $GT_MESH -2d -3d
```
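The exact metrics reported by `eval_recon.py` are not reproduced here, but 3D reconstruction quality is commonly summarized by accuracy (mean distance from reconstructed points to their nearest ground-truth points), completion (the reverse direction), and completion ratio (fraction of ground-truth points within a threshold). A brute-force sketch on point clouds sampled from the two meshes — function names and the 5 cm threshold below are illustrative:

```python
import numpy as np

def nn_dist(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """For each point in src (N, 3), distance to its nearest neighbour in dst (M, 3)."""
    d = np.linalg.norm(src[:, None, :] - dst[None, :, :], axis=-1)  # (N, M) pairwise
    return d.min(axis=1)

def mesh_metrics(rec_pts: np.ndarray, gt_pts: np.ndarray, thresh: float = 0.05) -> dict:
    """Accuracy, completion, and completion ratio between sampled point clouds."""
    acc = nn_dist(rec_pts, gt_pts)    # reconstruction -> ground truth
    comp = nn_dist(gt_pts, rec_pts)   # ground truth -> reconstruction
    return {
        "accuracy": acc.mean(),
        "completion": comp.mean(),
        "completion_ratio": (comp < thresh).mean(),
    }
```

Real evaluation code uses a KD-tree instead of the quadratic pairwise distance matrix, since meshes are sampled with hundreds of thousands of points.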