Message syncing
###############

There are two ways to synchronize messages from different sensors (frames, IMU packets, ToF, etc.):

- :ref:`Software syncing <Software syncing>` (based on timestamps/sequence numbers)
- `Hardware syncing <https://docs.luxonis.com/projects/hardware/en/latest/pages/guides/sync_frames.html>`__ (multi-sensor sub-ms accuracy, hardware trigger)

Software syncing
****************

This documentation page focuses on software syncing. There are two approaches:

- :ref:`Sequence number syncing` - for streams set to the same FPS; sub-ms accuracy can be achieved
- :ref:`Timestamp syncing` - for streams with different FPS, or for syncing with other sensors either onboard (eg. IMU) or also connected to the host computer (eg. a USB ToF sensor)

Sequence number syncing
=======================

If we want to synchronize multiple messages from the same OAK, such as:

- Camera frames from `ColorCamera <https://docs.luxonis.com/projects/api/en/latest/components/nodes/color_camera/#colorcamera>`__ or `MonoCamera <https://docs.luxonis.com/projects/api/en/latest/components/nodes/mono_camera/#monocamera>`__ (color, left and right frames)
- Messages generated from camera frames (NN results, disparity/depth, edge detections, tracklets, encoded frames, tracked features, etc.)

we can use sequence number syncing (`demos here <https://github.com/luxonis/depthai-experiments/tree/master/gen2-syncing#message-syncing>`__).
Each frame from ColorCamera/MonoCamera gets assigned a sequence number, which is also copied to every message generated from that frame.

For sequence number syncing, the **FPS of all cameras needs to be the same**. On the host or inside a Script node you can get a message's sequence number like this:

.. code-block:: python

    # Get the message from the queue
    message = queue.get()
    # message can be ImgFrame, NNData, Tracklets, ImgDetections, TrackedFeatures...
    seqNum = message.getSequenceNum()
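On the host, messages from several queues can then be matched by these sequence numbers. Below is a minimal sketch of such a matcher; the ``SeqNumSync`` class and the stream names are illustrative (not part of the DepthAI API), and each message is only assumed to expose ``getSequenceNum()``:

```python
# Host-side matcher sketch: buffer messages per stream and emit a synced
# group once every stream has a message with the same sequence number.
class SeqNumSync:
    def __init__(self, stream_names):
        # For each stream, a dict of sequence number -> message
        self.msgs = {name: {} for name in stream_names}

    def add_msg(self, name, msg):
        seq = msg.getSequenceNum()
        self.msgs[name][seq] = msg
        if all(seq in buffered for buffered in self.msgs.values()):
            # Every stream has this sequence number: pop and return the group
            synced = {n: self.msgs[n].pop(seq) for n in self.msgs}
            # Discard older unmatched messages so the buffers stay bounded
            for buffered in self.msgs.values():
                for old in [s for s in buffered if s < seq]:
                    del buffered[old]
            return synced
        return None
```

In a real pipeline you would feed ``add_msg`` with the output of each queue's ``tryGet()`` and process a group whenever a non-``None`` result is returned.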
|
Through firmware sync, we monitor for drift and align the capture timestamps of all cameras (left, right, color), which are taken
at the MIPI Start-of-Frame (SoF) event. The left/right global shutter cameras are driven by the same clock and started by a broadcast write
on I2C, so no drift occurs over time, even when running freely without a hardware sync.

The RGB rolling shutter sensor has a slight difference in clocking/frame-time, so when we detect a small drift, we modify the
frame-time (number of lines) of the next frame by a small amount to compensate.

If the sensors are set to the same FPS (default is 30), the above two mechanisms are **already integrated into DepthAI and enabled**
by default, which allows us to **achieve sub-ms delay between all frames**, as well as between messages generated from these frames!
.. code-block:: bash

    [Seq 325] RGB timestamp: 0:02:33.549449
    [Seq 325] Disparity timestamp: 0:02:33.549402
    -----------
    [Seq 326] RGB timestamp: 0:02:33.582756
    [Seq 326] Disparity timestamp: 0:02:33.582715
    -----------
    [Seq 327] RGB timestamp: 0:02:33.616075
    [Seq 327] Disparity timestamp: 0:02:33.616031
The disparity and color frame timestamps show that we achieve sub-ms accuracy; in the output above, each pair of timestamps differs by less than 50 µs.
Timestamp syncing
=================

As opposed to sequence number syncing, **timestamp syncing** can sync:

- **streams** with **different FPS**
- **IMU** results with other messages
- messages with **other devices connected to the computer**, as timestamps are synced to the host computer clock

Feel free to check the `demo here <https://github.com/luxonis/depthai-experiments/tree/master/gen2-syncing#imu--rgb--depth-timestamp-syncing>`__,
which uses timestamps to sync IMU, color and disparity frames together, with all of these streams producing messages at different FPS.
When **multiple streams have different FPS**, there are two options for syncing them:

#. **Removing some messages** from the faster streams to get the synced FPS of the slowest stream
#. **Duplicating some messages** from the slower streams to get the synced FPS of the fastest stream
**Timestamps are assigned** to the frame at the **MIPI Start-of-Frame** (SoF) event,
`more details here <https://docs.luxonis.com/projects/hardware/en/latest/pages/guides/sync_frames.html#frame-capture-graphs>`__.

.. include:: /includes/footer-short.rst