
Commit cf41a43

Docs/release 25 (#997)
* Initial docs update for v2.25
* Added pcl example docs
* Adding pointcloud control example
* Updating pointcloud example
1 parent a6a24d6 commit cf41a43

File tree

13 files changed (+536 / -3 lines)


docs/source/components/device.rst

Lines changed: 6 additions & 2 deletions
@@ -67,9 +67,13 @@ Device clocks are synced at below 500µs accuracy for PoE cameras, and below 200
.. image:: /_static/images/components/clock-syncing.png

Above is a graph representing the accuracy of the device clock with respect to the host clock. We had 3 devices connected (OAK PoE cameras), all hardware-synchronized using the `FSYNC Y-adapter <https://docs.luxonis.com/projects/hardware/en/latest/pages/FSYNC_Yadapter/>`__.
Raspberry Pi (the host) had an interrupt pin connected to the FSYNC line, so at the start of each frame the interrupt fired and the host clock was recorded. We then compared the (synced) frame timestamps with
host timestamps and computed the standard deviation. For the histogram above, we ran this test for approximately 3 hours.

Below is a graph of the difference between the device and host clocks over time, produced from the same test.

.. image:: /_static/images/components/timestamp-difference.png

.. code-block:: python
Lines changed: 98 additions & 0 deletions
@@ -0,0 +1,98 @@
PointCloudConfig
================

`PointCloudConfig` is a configuration class used to adjust settings for point cloud generation within the DepthAI ecosystem. It allows users to configure properties such as sparsity and transformation matrices, which are crucial for tailoring the point cloud data to specific application requirements.

Configuration Options
#####################

- **Sparsity**: Determines whether the generated point cloud should be sparse. Sparse point clouds may omit points based on certain criteria, such as depth value thresholds, to reduce data volume and processing requirements.
- **Transformation Matrix**: Applies a 4x4 homogeneous transformation matrix to the point cloud data, enabling rotations, translations, and scaling to align the point cloud with a world or application-specific coordinate system.
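To make the sparsity option concrete, here is a stand-in NumPy illustration (not the on-device implementation; the actual filtering criteria may differ): a dense cloud keeps one XYZ row per depth pixel, with invalid measurements appearing as zeros, while a sparse cloud drops them.

.. code-block:: python

    import numpy as np

    # Dense cloud: one XYZ row per pixel; invalid depth shows up as Z == 0
    dense = np.array([
        [10.0,  5.0, 1000.0],
        [ 0.0,  0.0,    0.0],  # invalid measurement
        [-3.0,  8.0, 1500.0],
        [ 0.0,  0.0,    0.0],  # invalid measurement
    ])

    # Sparse cloud: invalid points removed, so only valid geometry remains
    sparse = dense[dense[:, 2] > 0]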

Usage
#####

Configuring `PointCloudConfig` allows for precise control over the generation of point cloud data. Here's an example of how to configure and apply `PointCloudConfig` in a DepthAI application:

.. tabs::

   .. code-tab:: py

      import depthai as dai

      # Create pipeline
      pipeline = dai.Pipeline()

      # Create PointCloud node
      pointCloud = pipeline.create(dai.node.PointCloud)

      # Enable sparse point cloud generation
      pointCloud.initialConfig.setSparse(True)

      # Define a transformation matrix
      transformationMatrix = [
          [1.0, 0.0, 0.0, 0.0],
          [0.0, 1.0, 0.0, 0.0],
          [0.0, 0.0, 1.0, 0.0],
          [0.0, 0.0, 0.0, 1.0]
      ]
      # Apply transformation matrix
      pointCloud.initialConfig.setTransformationMatrix(transformationMatrix)

      # Further pipeline setup and execution...

   .. code-tab:: c++

      #include "depthai/depthai.hpp"

      int main() {
          // Create pipeline
          dai::Pipeline pipeline;

          // Create PointCloud node
          auto pointCloud = pipeline.create<dai::node::PointCloud>();

          // Enable sparse point cloud generation
          pointCloud->initialConfig.setSparse(true);

          // Define a transformation matrix
          std::vector<std::vector<float>> transformationMatrix = {
              {1.0, 0.0, 0.0, 0.0},
              {0.0, 1.0, 0.0, 0.0},
              {0.0, 0.0, 1.0, 0.0},
              {0.0, 0.0, 0.0, 1.0}
          };
          // Apply transformation matrix
          pointCloud->initialConfig.setTransformationMatrix(transformationMatrix);

          // Further pipeline setup and execution...

          return 0;
      }
This example demonstrates initializing `PointCloudConfig`, setting it to generate sparse point clouds, and applying a transformation matrix. This configuration is then applied to a `PointCloud` node within the DepthAI pipeline.

Examples of Functionality
#########################

- **3D Object Localization**: Adjusting the transformation matrix to align point clouds with a known coordinate system for precise object placement.
- **Scene Optimization**: Utilizing sparse point clouds for efficient processing in large-scale or complex scenes.
- **Data Alignment**: Applying transformation matrices for seamless integration of point cloud data with other sensor data or pre-existing 3D models.

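For intuition, the transformation matrix follows the standard homogeneous-coordinates convention: rotation in the upper-left 3x3 block, translation in the last column. A minimal NumPy sketch, independent of the DepthAI API:

.. code-block:: python

    import numpy as np

    def make_transform(angle_rad, tx, ty, tz):
        """Build a 4x4 homogeneous transform: rotate about Z, then translate."""
        c, s = np.cos(angle_rad), np.sin(angle_rad)
        return np.array([
            [c,  -s,  0.0, tx],
            [s,   c,  0.0, ty],
            [0.0, 0.0, 1.0, tz],
            [0.0, 0.0, 0.0, 1.0],
        ])

    # Apply to a point expressed in homogeneous coordinates
    point = np.array([100.0, 0.0, 0.0, 1.0])
    transform = make_transform(np.pi / 2, 0.0, 0.0, 500.0)
    moved = transform @ point  # 90° rotation about Z, plus +500 along Z

A matrix like the one returned here is exactly the shape `setTransformationMatrix` expects in the example above.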
Reference
#########

.. tabs::

   .. tab:: Python

      .. autoclass:: depthai.PointCloudConfig
         :members:
         :inherited-members:
         :noindex:

   .. tab:: C++

      .. doxygenclass:: dai::PointCloudConfig
         :project: depthai-core
         :members:
         :private-members:
         :undoc-members:

.. include:: ../../includes/footer-short.rst
Lines changed: 31 additions & 0 deletions
@@ -0,0 +1,31 @@
PointCloudData
==============

:ref:`PointCloudData` encapsulates 3D spatial information, representing a collection of points in a 3D space. Each point within the point cloud has its own position (X, Y, Z coordinates).
PointCloudData is used to represent the output of :ref:`PointCloud` nodes, and can be used to perform a variety of spatial analysis and reconstruction tasks.

Setter methods are only used to provide metadata to the :ref:`PointCloudData` message (they will be used for recording and replaying point clouds).

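In Python, `getPoints()` returns the cloud as an N×3 array (one X, Y, Z row per point), so downstream analysis is plain NumPy array math. A short sketch on a stand-in array (values are made up; coordinates are typically in millimeters):

.. code-block:: python

    import numpy as np

    # Stand-in for pointCloudData.getPoints(): an (N, 3) array of X, Y, Z
    points = np.array([
        [  0.0,  0.0,  500.0],
        [100.0, 50.0, 1000.0],
        [-50.0, 20.0, 3000.0],
    ])

    centroid = points.mean(axis=0)               # geometric center of the cloud
    near = points[points[:, 2] < 2000.0]         # keep points closer than 2 m (Z in mm)
    distances = np.linalg.norm(points, axis=1)   # Euclidean distance of each point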
Reference
#########

The detailed API for `PointCloudData` offers control over the generation, manipulation, and retrieval of 3D point cloud data.

.. tabs::

   .. tab:: Python

      .. autoclass:: depthai.PointCloudData
         :members:
         :inherited-members:
         :noindex:

   .. tab:: C++

      .. doxygenclass:: dai::PointCloudData
         :project: depthai-core
         :members:
         :private-members:
         :undoc-members:

.. include:: ../../includes/footer-short.rst
Lines changed: 154 additions & 0 deletions
@@ -0,0 +1,154 @@
PointCloud
==========

The PointCloud node enables on-device point cloud generation from a depth map.

How to place it
###############

.. tabs::

   .. code-tab:: py

      pipeline = dai.Pipeline()
      pointCloud = pipeline.create(dai.node.PointCloud)

   .. code-tab:: c++

      dai::Pipeline pipeline;
      auto pointCloud = pipeline.create<dai::node::PointCloud>();

Inputs and Outputs
##################

.. code-block::

                    ┌─────────────────────┐
                    │                     │
    inputConfig     │                     │
    ───────────────►│     PointCloud      │ outputPointCloud
                    │                     ├────────────────►
    inputDepth      │                     │ passthroughDepth
    ───────────────►│---------------------├────────────────►
                    └─────────────────────┘

**Message types**

- :code:`inputDepth` - :ref:`ImgFrame`
- :code:`inputConfig` - :ref:`PointCloudConfig`
- :code:`outputPointCloud` - :ref:`PointCloudData`
- :code:`passthroughDepth` - :ref:`ImgFrame` (passthrough input depth map)

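Conceptually, the node back-projects each depth pixel into 3D through the pinhole camera model using the device's calibrated intrinsics. The node does this on-device; the NumPy sketch below, with made-up intrinsic values, is only for intuition:

.. code-block:: python

    import numpy as np

    # Hypothetical pinhole intrinsics (focal lengths fx, fy; principal point cx, cy)
    fx, fy, cx, cy = 450.0, 450.0, 320.0, 200.0

    depth = np.full((400, 640), 1000.0)  # toy depth map: 1000 mm everywhere

    # Pixel coordinate grid
    u, v = np.meshgrid(np.arange(depth.shape[1]), np.arange(depth.shape[0]))

    # Back-projection: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy, Z = depth
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)  # (N, 3) point cloud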
Example visualization with Open3D
#################################

.. tabs::

   .. code-tab:: py

      import open3d as o3d
      import numpy as np
      import depthai as dai

      # `pipeline` is assumed to be set up beforehand, with an output queue `q`
      # delivering synced "rgb" (ImgFrame) and "pcl" (PointCloudData) messages

      pcd = o3d.geometry.PointCloud()
      vis = o3d.visualization.VisualizerWithKeyCallback()
      vis.create_window()

      with dai.Device(pipeline) as device:
          coordinateFrame = o3d.geometry.TriangleMesh.create_coordinate_frame(size=1000, origin=[0, 0, 0])
          vis.add_geometry(coordinateFrame)

          while device.isPipelineRunning():
              inMessage = q.get()
              inColor = inMessage["rgb"]
              inPointCloud = inMessage["pcl"]
              cvColorFrame = inColor.getCvFrame()

              if inPointCloud:
                  points = inPointCloud.getPoints().astype(np.float64)
                  pcd.points = o3d.utility.Vector3dVector(points)
                  colors = (cvColorFrame.reshape(-1, 3) / 255.0).astype(np.float64)
                  pcd.colors = o3d.utility.Vector3dVector(colors)
                  vis.update_geometry(pcd)

              vis.poll_events()
              vis.update_renderer()

      vis.destroy_window()

   .. code-tab:: c++

      #include <iostream>
      #include <pcl/visualization/pcl_visualizer.h>
      #include <depthai/depthai.hpp>

      int main() {
          // `pipeline` and the StereoDepth node `depth` are assumed to be set up beforehand
          pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
          auto viewer = std::make_unique<pcl::visualization::PCLVisualizer>("Cloud Viewer");
          viewer->addPointCloud<pcl::PointXYZ>(cloud, "cloud");

          dai::Device device(pipeline);

          auto q = device.getOutputQueue("out", 8, false);
          auto qDepth = device.getOutputQueue("depth", 8, false);

          while(true) {
              std::cout << "Waiting for data" << std::endl;
              auto depthImg = qDepth->get<dai::ImgFrame>();
              auto pclMsg = q->get<dai::PointCloudData>();

              if(!pclMsg) {
                  std::cout << "No data" << std::endl;
                  continue;
              }

              auto frame = depthImg->getCvFrame();
              frame.convertTo(frame, CV_8UC1, 255 / depth->initialConfig.getMaxDisparity());

              if(pclMsg->getPoints().empty()) {
                  std::cout << "Empty point cloud" << std::endl;
                  continue;
              }

              pcl::PointCloud<pcl::PointXYZ>::Ptr pclCloud = pclMsg->getPclData();
              viewer->updatePointCloud(pclCloud, "cloud");

              viewer->spinOnce(10);

              if(viewer->wasStopped()) {
                  break;
              }
          }

          return 0;
      }

Examples using PointCloud
#########################

- :ref:`PointCloud Visualization`
- :ref:`PointCloud Control`

Reference
#########

.. tabs::

   .. tab:: Python

      .. autoclass:: depthai.node.PointCloud
         :members:
         :inherited-members:
         :noindex:

   .. tab:: C++

      .. doxygenclass:: dai::node::PointCloud
         :project: depthai-core
         :members:
         :private-members:
         :undoc-members:

.. include:: ../../includes/footer-short.rst
Lines changed: 60 additions & 0 deletions
@@ -0,0 +1,60 @@
PointCloud Control
==================

This example demonstrates how to use the :ref:`PointCloudConfig` message to dynamically update the transformation matrix of a point cloud and visualize the transformed point cloud using Open3D.

The transformation matrix is updated to make it look like the point cloud is rotating about the Z-axis. This is achieved by first moving the point cloud along the Y axis:
.. code-block:: python

   # Move the point cloud along the Y axis
   translate_y_matrix = np.array([[1, 0, 0, 0],
                                  [0, 1, 0, 500],
                                  [0, 0, 1, 0],
                                  [0, 0, 0, 1]])

Then, the point cloud is rotated about the Z axis:

.. code-block:: python

   # Rotate the point cloud about the Z axis
   rotate_z_matrix = np.array([[np.cos(angle), -np.sin(angle), 0, 0],
                               [np.sin(angle),  np.cos(angle), 0, 0],
                               [0, 0, 1, 0],
                               [0, 0, 0, 1]])

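The two matrices are composed by matrix multiplication; applying the translation first and then the rotation makes the cloud orbit the Z-axis rather than spin in place. A quick NumPy check with a sample angle (the composition order here is an illustration, not taken verbatim from the example source):

.. code-block:: python

    import numpy as np

    angle = np.pi / 2  # 90 degrees

    translate_y_matrix = np.array([[1, 0, 0, 0],
                                   [0, 1, 0, 500],
                                   [0, 0, 1, 0],
                                   [0, 0, 0, 1]])

    rotate_z_matrix = np.array([[np.cos(angle), -np.sin(angle), 0, 0],
                                [np.sin(angle),  np.cos(angle), 0, 0],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]])

    # Combined transform: translate along Y first, then rotate about Z
    transform = rotate_z_matrix @ translate_y_matrix

    # A point at the origin ends up on the rotated Y offset
    origin = np.array([0, 0, 0, 1])
    moved = transform @ origin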
Demo
####

.. image:: /_static/images/examples/pointcloud_control.gif
   :alt: PointCloud control

Setup
#####

Ensure DepthAI and Open3D are installed in your Python environment:

.. code-block:: bash

   python3 -m pip install depthai open3d

Source code
###########

The example initializes the DepthAI pipeline with color and mono cameras and a stereo depth node to generate depth information. It then creates a point cloud node, dynamically updates its transformation matrix based on a rotation value, and visualizes the transformed point cloud using Open3D. Each point's color corresponds to the color image captured by the RGB camera.

.. tabs::

   .. tab:: Python

      Also `available on GitHub <https://github.com/luxonis/depthai-python/blob/main/examples/PointCloud/pointcloud_control.py>`__

      .. literalinclude:: ../../../../examples/PointCloud/pointcloud_control.py
         :language: python
         :linenos:

.. include:: /includes/footer-short.rst
