Commit 0259106

Merge pull request #688 from luxonis/sw_frame_syncing
added software syncing msgs docs
2 parents 3aea75a + cc9dd29
File tree

5 files changed: +106 -38 lines
docs/source/index.rst

Lines changed: 1 addition & 0 deletions
@@ -72,6 +72,7 @@ node functionalities are presented with code.
 
    tutorials/hello_world.rst
    tutorials/standalone_mode.rst
+   tutorials/message_syncing.rst
    tutorials/multiple.rst
    tutorials/maximize_fov.rst
    tutorials/debugging.rst

docs/source/samples/StereoDepth/rgb_depth_aligned.rst

Lines changed: 4 additions & 0 deletions
@@ -4,6 +4,10 @@ RGB Depth alignment
 This example shows usage of RGB depth alignment. Since OAK-D has a color and a pair of stereo cameras,
 you can align the depth map to the color frame to get RGB depth.
 
+In this example, RGB and depth aren't perfectly in sync. To achieve that, you would need to add :ref:`Software syncing`,
+which has been done in the `demo here <https://github.com/luxonis/depthai-experiments/tree/master/gen2-syncing#host-rgb-depth-sync>`__,
+where RGB and depth frames have sub-ms delay.
+
 Demo
 ####

docs/source/samples/mixed/frame_sync.rst

Lines changed: 2 additions & 1 deletion
@@ -1,7 +1,8 @@
 Frame syncing on OAK
 ====================
 
-This example showcases how you can use the :ref:`Script` node to sync frames from multiple streams. It uses :ref:`ImgFrame`'s timestamps to achieve syncing precision.
+This example showcases how you can use the :ref:`Script` node to perform :ref:`Message syncing` of multiple streams.
+It uses :ref:`ImgFrame`'s timestamps to achieve syncing precision.
 
 Similar syncing demo scripts (Python) can be found in our depthai-experiments repository, in the `gen2-syncing <https://github.com/luxonis/depthai-experiments/tree/master/gen2-syncing>`__
 folder.
Lines changed: 81 additions & 0 deletions
@@ -0,0 +1,81 @@
Message syncing
###############

There are two ways to synchronize messages from different sensors (frames, IMU packets, ToF, etc.):

- :ref:`Software syncing <Software syncing>` (based on timestamps/sequence numbers)
- `Hardware syncing <https://docs.luxonis.com/projects/hardware/en/latest/pages/guides/sync_frames.html>`__ (multi-sensor sub-ms accuracy, hardware trigger)
Software syncing
****************

This documentation page focuses on software syncing. There are two approaches to it:

- :ref:`Sequence number syncing` - for streams set to the same FPS; sub-ms accuracy can be achieved
- :ref:`Timestamp syncing` - for streams with different FPS, or for syncing with other sensors either onboard (e.g. IMU) or connected to the host computer (e.g. a USB ToF sensor)
Sequence number syncing
=======================

If we want to synchronize multiple messages from the same OAK, such as:

- Camera frames from `ColorCamera <https://docs.luxonis.com/projects/api/en/latest/components/nodes/color_camera/#colorcamera>`__ or `MonoCamera <https://docs.luxonis.com/projects/api/en/latest/components/nodes/mono_camera/#monocamera>`__ (color, left and right frames)
- Messages generated from camera frames (NN results, disparity/depth, edge detections, tracklets, encoded frames, tracked features, etc.)

we can use sequence number syncing (`demos here <https://github.com/luxonis/depthai-experiments/tree/master/gen2-syncing#message-syncing>`__).
Each frame from ColorCamera/MonoCamera gets assigned a sequence number, which also gets copied to every message generated from that frame.

For sequence number syncing, the **FPS of all cameras needs to be the same**. On the host or inside a :ref:`Script` node you can get a message's sequence number like this:
.. code-block:: python

    # Get the message from the queue
    message = queue.get()
    # message can be ImgFrame, NNData, Tracklets, ImgDetections, TrackedFeatures...
    seqNum = message.getSequenceNum()
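To illustrate how sequence numbers can be used for pairing on the host, here is a minimal sketch. The ``SeqSync`` helper is hypothetical (it is not part of the depthai API), and plain strings stand in for ImgFrame messages:

```python
# Hypothetical host-side helper (not part of the depthai API): buffer messages
# from two streams in per-stream dicts keyed by sequence number, and emit a
# matched pair once both streams have delivered a message with the same number.
class SeqSync:
    def __init__(self):
        self.buffers = {"rgb": {}, "depth": {}}

    def add(self, stream, seq_num, msg):
        """Store a message; return the matched (rgb, depth) pair, or None."""
        self.buffers[stream][seq_num] = msg
        if all(seq_num in buf for buf in self.buffers.values()):
            pair = (self.buffers["rgb"].pop(seq_num), self.buffers["depth"].pop(seq_num))
            # Drop stale messages that can no longer be matched
            for buf in self.buffers.values():
                for old in [s for s in buf if s < seq_num]:
                    del buf[old]
            return pair
        return None

# Usage sketch (strings stand in for ImgFrame messages):
sync = SeqSync()
print(sync.add("rgb", 10, "rgb frame 10"))      # -> None (no depth frame yet)
print(sync.add("depth", 10, "depth frame 10"))  # -> ('rgb frame 10', 'depth frame 10')
```

In a real pipeline you would call ``add()`` with each message's ``getSequenceNum()`` result as it arrives from its output queue.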
Through firmware syncing, we monitor for drift and align the capture timestamps of all cameras (left, right, color), which are taken
at the MIPI Start-of-Frame (SoF) event. The left/right global shutter cameras are driven by the same clock, started by a broadcast write
on I2C, so no drift will happen over time, even when running freely without a hardware sync.

The RGB rolling shutter sensor has a slight difference in clocking/frame-time, so when we detect a small drift, we modify the
frame-time (number of lines) of the next frame by a small amount to compensate.

If sensors are set to the same FPS (default is 30), the above two approaches are **already integrated into depthai and enabled
by default**, which allows us to **achieve sub-ms delay between all frames** and the messages generated from these frames!
.. code-block:: bash

    [Seq 325] RGB timestamp: 0:02:33.549449
    [Seq 325] Disparity timestamp: 0:02:33.549402
    -----------
    [Seq 326] RGB timestamp: 0:02:33.582756
    [Seq 326] Disparity timestamp: 0:02:33.582715
    -----------
    [Seq 327] RGB timestamp: 0:02:33.616075
    [Seq 327] Disparity timestamp: 0:02:33.616031

Disparity and color frame timestamps show that the delay between frames is well below 1 ms.
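To make the sub-ms claim concrete, the delay in the first log entry can be computed directly from the printed timestamps. A quick check, using only the standard library:

```python
from datetime import timedelta

# Timestamps from the first [Seq 325] log entry above
rgb_ts = timedelta(minutes=2, seconds=33.549449)
disparity_ts = timedelta(minutes=2, seconds=33.549402)

delay = abs(rgb_ts - disparity_ts)
print(delay.microseconds)  # -> 47 (well below 1 ms)
```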
Timestamp syncing
=================

As opposed to sequence number syncing, **timestamp syncing** can sync:

- **streams** with **different FPS**
- **IMU** results with other messages
- messages with **other devices connected to the computer**, as timestamps are synced to the host computer's clock

Feel free to check the `demo here <https://github.com/luxonis/depthai-experiments/tree/master/gen2-syncing#imu--rgb--depth-timestamp-syncing>`__,
which uses timestamps to sync IMU, color and disparity frames together, with all of these streams producing messages at different FPS.
In case **multiple streams have different FPS**, there are two options for how to sync them:

#. **Removing some messages** from the faster streams to get the synced FPS of the slowest stream
#. **Duplicating some messages** from the slower streams to get the synced FPS of the fastest stream

**Timestamps are assigned** to a frame at the **MIPI Start-of-Frame** (SoF) event;
`more details here <https://docs.luxonis.com/projects/hardware/en/latest/pages/guides/sync_frames.html#frame-capture-graphs>`__.
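The timestamp-matching idea can be sketched with a small host-side helper. This is a hypothetical illustration, not part of the depthai API; plain ``timedelta`` values stand in for device timestamps (depthai's ``ImgFrame.getTimestamp()`` also returns a ``datetime.timedelta``):

```python
from datetime import timedelta

# Hypothetical host-side helper (not part of the depthai API): given a
# reference timestamp from the slowest stream, pick the buffered message from
# a faster stream whose timestamp is closest to it (option 1 above - the
# remaining, unmatched messages of the faster stream are effectively dropped).
def closest_in_time(reference_ts, candidates):
    """candidates is a list of (timestamp, message) tuples."""
    return min(candidates, key=lambda c: abs(c[0] - reference_ts))

# Usage sketch: IMU packets arrive faster than frames; match a frame captured
# at t=1.030 s against a buffer of IMU packets.
imu_buffer = [
    (timedelta(seconds=1.000), "imu packet A"),
    (timedelta(seconds=1.033), "imu packet B"),
    (timedelta(seconds=1.066), "imu packet C"),
]
ts, packet = closest_in_time(timedelta(seconds=1.030), imu_buffer)
print(packet)  # -> imu packet B
```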
.. include:: /includes/footer-short.rst

docs/source/tutorials/multiple.rst

Lines changed: 18 additions & 37 deletions
@@ -1,35 +1,22 @@
 Multiple DepthAI per Host
 =========================
 
-Learn how to discover DepthAI devices connected to your system, and use them individually.
+You can find `Demo scripts here <https://github.com/luxonis/depthai-experiments/tree/master/gen2-multiple-devices>`__.
+Learn how to discover multiple OAK cameras connected to your system, and use them individually.
 
 .. image:: /_static/images/tutorials/multiple/setup.jpg
    :alt: face
 
-Shown on the left is Luxonis `uAI (BW1093) <https://shop.luxonis.com/products/bw1093>`__ which is actually plugged into
-a `Raspberry Pi Compute Module Edition (BW1097) <https://shop.luxonis.com/products/depthai-rpi-compute-module-edition>`__.
+Shown on the left is Luxonis `OAK-1 <https://shop.luxonis.com/products/bw1093>`__ which is actually plugged into
+an `OAK-D-CM3 <https://shop.luxonis.com/products/depthai-rpi-compute-module-edition>`__.
 
-So in this case, everything is running on the (single) Raspberry Pi 3B+ which is in the back of the BW1097.
+So in this case, everything is running on the (single) Raspberry Pi 3B+ host which is in the back of the OAK-D-CM3.
 
-Demo code
-#########
+Discovering OAK cameras
+#######################
 
-You can find demo code `here <https://github.com/luxonis/depthai-experiments/tree/master/gen2-multiple-devices>`__. The demo will find all devices connected to the host and display an RGB preview from each of them.
-
-Dependencies
-############
-
-You have already set up the Python API on your system (if you have a Raspberry Pi Compute Module it came pre-setup).
-See :ref:`here <Python API Reference>` if you have not yet installed the DepthAI Python API on your system.
-
-Discover DepthAI-USB Port Mapping
-#################################
-
-The DepthAI multi-device support is currently done by selecting the device mx_id (serial number) of a connected DepthAI
-device.
-
-If you'd like to associate a given DepthAI device with specific code (e.g. neural model) to be run on it, it is recommended
-to plug in one device at a time, and then use the following code to determine which device is on which port:
+You can use DepthAI to discover all connected OAK cameras, either via USB or through the LAN (OAK POE cameras).
+The code snippet below finds all OAK cameras and prints their MxIDs (unique identifier) and their XLink state.
 
 .. code-block:: python
 
5239

5340
.. code-block:: python
5441
55-
found, device_info = depthai.Device.getDeviceByMxId("14442C10D13EABCE00")
56-
57-
if not found:
58-
raise RuntimeError("Device not found!")
59-
60-
You can then use the `device_info` to specify on which device you want to run your pipeline:
61-
62-
.. code-block:: python
63-
42+
# Specify MXID, IP Address or USB path
43+
device_info = depthai.DeviceInfo("14442C108144F1D000") # MXID
44+
#device_info = depthai.DeviceInfo("192.168.1.44") # IP Address
45+
#device_info = depthai.DeviceInfo("3.3.3") # USB port name
6446
with depthai.Device(pipeline, device_info) as device:
47+
# ...
6548
6649
And you can use this code as a basis for your own use cases, such that you can run differing neural models
67-
on different DepthAI/uAI models.
50+
on different OAK models.
6851

6952
Specifying POE device to be used
7053
********************************
7154

72-
You can specify the POE device to be used by the IP address as well. Here's the `code snippet <https://docs.luxonis.com/projects/hardware/en/latest/pages/guides/getting-started-with-poe.html#manually-specify-device-ip>`__.
73-
74-
Now use as many DepthAI devices as you need!
55+
You can specify the POE device to be used by the IP address as well, as shown in the code snippet above.
7556

76-
And since DepthAI does all the heavy lifting, you can usually use quite a
77-
few of them with very little burden to the host.
57+
Now use as many OAK cameras as you need!
58+
And since DepthAI does all the heavy lifting, you can usually use quite a few of them with very little burden to the host.
7859

7960
.. include:: /includes/footer-short.rst
