This repository contains the following point cloud partition algorithms:
- Voxel Cloud Connectivity Segmentation (VCCS)
- P-Linkage
- Cut-Pursuit
- Voxel and Graph-based Segmentation (VGS)
- Supervoxel and Graph-based Segmentation (SVGS)
- Random Sample Consensus (RANSAC) Segmentation
- Region Growing Algorithm
Some of these algorithms are only available in C++. Therefore, we developed a Boost.Python interface so that all of them can be called from Python. Moreover, we extended VCCS, P-Linkage, Cut-Pursuit and VGS/SVGS so that they use the edges of a mesh. These extensions produce more accurate eigenfeatures and, thus, sharper edges. All algorithms can be executed on Windows, Linux and macOS. The algorithms can be seen in action on YouTube, where they are integrated into OpenXtract (see the citation below).
We recommend creating a Miniconda environment for compiling the partition algorithms. Please install the required Python packages:
pip install -r requirements.txt
You should also install Boost with
conda install -c anaconda boost
as it is used for the C++/Python interface. Please compile libgeo first (see its README) so that the extended algorithms can be tested.
The exact compilation steps for each algorithm can be found in the corresponding subfolder. For instance, the README file in the vccs folder contains the compilation steps for the voxel cloud connectivity segmentation (VCCS) algorithm. The algorithms can be compiled with CMake; the RANSAC and Region Growing algorithms do not require any compilation. The extended algorithms use the libgeo library, which approximates the geodesic distances between vertices by computing shortest paths along the mesh edges. Optionally, libgeo can be compiled and tested in isolation.
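To illustrate the idea behind libgeo (this is a sketch of the principle, not libgeo's actual implementation or API), the geodesic distance between two vertices can be approximated by the length of the shortest path along the mesh edges, e.g. with SciPy's Dijkstra routine:

```python
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import dijkstra

def approx_geodesic_distances(vertices, edges, source):
    """Approximate geodesic distances from a source vertex via shortest paths.

    vertices: (N, 3) float array of vertex positions
    edges:    (M, 2) int array of mesh edges (vertex index pairs)
    source:   index of the source vertex
    """
    # Weight every edge with its Euclidean length.
    weights = np.linalg.norm(vertices[edges[:, 0]] - vertices[edges[:, 1]], axis=1)
    n = len(vertices)
    graph = coo_matrix((weights, (edges[:, 0], edges[:, 1])), shape=(n, n))
    # Shortest-path lengths along the edge graph approximate geodesic distances.
    return dijkstra(graph, directed=False, indices=source)
```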
Navigate to the folder of one of the partition algorithms. Edit the test.py script so that it loads an arbitrary .ply file: search for the line mesh = o3d.io.read_triangle_mesh and provide a path to a .ply file on your disk. After that, execute the test.py script.
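The relevant lines might look like the following (the path is a placeholder; only the read_triangle_mesh call is taken from the repository's scripts):

```python
import numpy as np
import open3d as o3d

# Replace the placeholder with a .ply file on your disk.
mesh = o3d.io.read_triangle_mesh("/path/to/your/mesh.ply")
vertices = np.asarray(mesh.vertices)    # (N, 3) vertex positions
triangles = np.asarray(mesh.triangles)  # (M, 3) vertex indices per face
```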
Make sure that both the original and the extended partition algorithms can be executed, so that they can be compared against each other. To do so, download the complete ScanNet version 2 dataset and save it to an arbitrary place on disk. After that, create an environment variable called SCANNET_DIR that points to the dataset; SCANNET_DIR/scans should be a valid path. Then you can call the script baseline.py:
python baseline.py
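If the script cannot find the dataset, a quick check like the following (a sanity-check sketch, not part of the repository) verifies that the environment variable is set as described above:

```python
import os

# SCANNET_DIR must point to the ScanNet root so that SCANNET_DIR/scans exists.
scans = os.path.join(os.environ["SCANNET_DIR"], "scans")
assert os.path.isdir(scans), f"{scans} is not a valid path"
```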
Running baseline.py creates a folder ./csvs_baseline where the accuracies and the partition sizes are stored in csv-files. These csv-files have to be concatenated, which can be done with the script concat_csvs.py:
python concat_csvs.py --csv_dir ./csvs_baseline --file baseline.csv
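Conceptually, the concatenation step merges the per-scene result files into one table; a minimal sketch of the same operation with pandas (concat_csvs.py itself may differ) looks like this:

```python
import glob
import pandas as pd

# Merge all per-scene csv-files from ./csvs_baseline into a single baseline.csv.
files = sorted(glob.glob("./csvs_baseline/*.csv"))
frames = [pd.read_csv(f) for f in files]
pd.concat(frames, ignore_index=True).to_csv("baseline.csv", index=False)
```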
After that, you can compute the partition results for the extended partition algorithms:
python baseline.py --mesh True
This will create a folder ./csvs_mesh, whose csv-files likewise have to be concatenated with the script concat_csvs.py:
python concat_csvs.py --csv_dir ./csvs_mesh --file mesh.csv
After that, you can start the Jupyter notebook Analyze.ipynb and execute all cells to compare the two result files.
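For a quick look outside the notebook, a rough comparison of the two result files could be done like this (a sketch; the actual column names depend on the generated csv-files, and Analyze.ipynb may compute different statistics):

```python
import pandas as pd

# Compare the baseline results against the mesh-based (extended) results.
baseline = pd.read_csv("baseline.csv")
mesh = pd.read_csv("mesh.csv")
print(baseline.mean(numeric_only=True))
print(mesh.mean(numeric_only=True))
```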
@inproceedings{Tiator2022,
  author    = {Marcel Tiator and Calvin Huhn and Christian Geiger and Paul Grimm},
  title     = {OpenXtract: A Blender Add-On for the Accelerated Extraction of the Objects of Interest},
  booktitle = {Proceedings of the 5th International Conference on Artificial Intelligence and Virtual Reality - AIVR '22},
  address   = {Virtual Event},
  publisher = {IEEE},
  year      = {2022},
}
This project is sponsored by the German Federal Ministry of Education and Research (BMBF) under project number 13FH022IX6. Project name: Interactive body-near production technology 4.0 (German: Interaktive körpernahe Produktionstechnik 4.0, iKPT4.0).
