Each Azure Kinect DK device includes 3.5-mm synchronization ports (**Sync in** and **Sync out**) that you can use to link multiple units together. When linked, your software can coordinate the trigger timing of multiple depth cameras and RGB cameras.
This article introduces some of the benefits of synchronizing multiple devices, and provides instructions for how to connect the devices.
## Why use multiple Azure Kinect DK devices?
There are many reasons to use multiple Azure Kinect DK devices.
Before you start, make sure to review [Azure Kinect DK Hardware specification](hardware-specification.md) and [Azure Kinect DK depth camera](depth-camera.md).
### Select a device configuration
You can use two different approaches for your device configuration:
- **Daisy-chain configuration**. Synchronize one master device and up to eight subordinate devices.
The trigger source must deliver the signal to the master device **Sync in** port.
For more information about working with external equipment, see [Use Azure Kinect recorder with external synchronized devices](record-external-synchronized-units.md).
### Plan your camera settings and software configuration
For information about how to set up your software to control the cameras and use the image data, see the [Azure Kinect Sensor SDK](about-sensor-sdk.md).
This section addresses several factors that affect synchronized devices (but not single devices). Your software should take these factors into account.
#### Exposure considerations
If you want to control the precise timing of each device, we recommend that you use a manual exposure setting. Under the automatic exposure setting, each color camera can dynamically change the actual exposure. Because the exposure affects the timing, such changes quickly push the cameras out of sync.
In the image capture loop, avoid repeatedly setting the same exposure setting. When needed, just call the API once.
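As an illustration, the sketch below uses the Sensor SDK's C API to fix the exposure once before the capture loop starts. The exposure value (8330 μs) and the camera settings are placeholder assumptions for the example, not recommendations.

```c
#include <k4a/k4a.h>
#include <stdio.h>

int main(void)
{
    k4a_device_t device = NULL;
    if (K4A_FAILED(k4a_device_open(K4A_DEVICE_DEFAULT, &device)))
    {
        printf("Failed to open device\n");
        return 1;
    }

    // Set a fixed exposure once, before starting the cameras.
    // 8330 microseconds is only an example value.
    if (K4A_FAILED(k4a_device_set_color_control(device,
                                                K4A_COLOR_CONTROL_EXPOSURE_TIME_ABSOLUTE,
                                                K4A_COLOR_CONTROL_MODE_MANUAL,
                                                8330)))
    {
        printf("Failed to set manual exposure\n");
    }

    k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
    config.color_format = K4A_IMAGE_FORMAT_COLOR_MJPG;
    config.color_resolution = K4A_COLOR_RESOLUTION_720P;
    config.depth_mode = K4A_DEPTH_MODE_NFOV_UNBINNED;
    config.camera_fps = K4A_FRAMES_PER_SECOND_30;

    if (K4A_FAILED(k4a_device_start_cameras(device, &config)))
    {
        printf("Failed to start cameras\n");
        k4a_device_close(device);
        return 1;
    }

    // Capture loop: the exposure is NOT set again here.
    for (int i = 0; i < 100; i++)
    {
        k4a_capture_t capture = NULL;
        if (k4a_device_get_capture(device, &capture, 1000) == K4A_WAIT_RESULT_SUCCEEDED)
        {
            // ... process the capture ...
            k4a_capture_release(capture);
        }
    }

    k4a_device_stop_cameras(device);
    k4a_device_close(device);
    return 0;
}
```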
#### Timestamp considerations
Devices that are acting in master or subordinate roles report image timestamps in terms of *Start of Frame* instead of *Center of Frame*.
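For example, here is a minimal sketch (assuming `capture` is a valid capture obtained from `k4a_device_get_capture`) of reading the device timestamp that this note refers to:

```c
#include <k4a/k4a.h>
#include <stdint.h>
#include <stdio.h>

// Print the device timestamp of the color image in a capture.
// On a device running as master or subordinate, this value marks
// the start of the frame rather than the center of the frame.
void print_color_timestamp(k4a_capture_t capture)
{
    k4a_image_t color = k4a_capture_get_color_image(capture);
    if (color != NULL)
    {
        uint64_t ts_usec = k4a_image_get_device_timestamp_usec(color);
        printf("Color image device timestamp: %llu us\n",
               (unsigned long long)ts_usec);
        k4a_image_release(color);
    }
}
```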
#### Avoiding interference between multiple depth cameras
When multiple depth cameras image overlapping fields of view, each camera must image its own associated laser. To prevent the lasers from interfering with each other, the camera captures should be offset from one another by 160μs or more.
For each depth camera capture, the laser turns on nine times and is active for only 125μs each time. The laser is then idle for either 1450μs or 2390μs, depending on the mode of operation. This behavior means that the starting point for the offset calculation is 125μs.
Additionally, differences between the camera clock and the device firmware clock increase the minimum offset to 160μs. To calculate a more precise offset for your configuration, note the depth mode that you are using and refer to the [depth sensor raw timing table](hardware-specification.md#depth-sensor-raw-timing). Using the data from this table, you can calculate the minimum offset (the exposure time of each camera) by using the following equation:

*Exposure time* = (*IR pulses* × *pulse width*) + (*idle periods* × *idle time*)
In your software, use ```depth_delay_off_color_usec``` or ```subordinate_delay_off_master_usec``` to make sure that each IR laser fires in its own 160μs window or has a different field of view.
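As a hedged sketch (the color and depth settings, and the helper name `start_subordinate`, are assumptions made for this example), one way to apply such an offset to a subordinate device looks like this:

```c
#include <k4a/k4a.h>
#include <stdbool.h>
#include <stdint.h>

// Start one subordinate device with its capture staggered from the master.
// `subordinate_index` is 1 for the first subordinate, 2 for the second, and so on.
k4a_result_t start_subordinate(k4a_device_t device, uint32_t subordinate_index)
{
    k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
    config.color_format = K4A_IMAGE_FORMAT_COLOR_MJPG;
    config.color_resolution = K4A_COLOR_RESOLUTION_720P;
    config.depth_mode = K4A_DEPTH_MODE_NFOV_UNBINNED;
    config.camera_fps = K4A_FRAMES_PER_SECOND_30;
    config.synchronized_images_only = true;

    // This device listens for the trigger on its Sync in port.
    config.wired_sync_mode = K4A_WIRED_SYNC_MODE_SUBORDINATE;

    // Offset this device's capture from the master in 160 us steps so that
    // the IR lasers of overlapping cameras do not fire at the same time.
    config.subordinate_delay_off_master_usec = 160 * subordinate_index;

    // Keep this device's own depth and color captures aligned with each other.
    config.depth_delay_off_color_usec = 0;

    return k4a_device_start_cameras(device, &config);
}
```

The master device would use `K4A_WIRED_SYNC_MODE_MASTER` instead and, as noted later in this article, should be started after all of the subordinate devices.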
## Prepare your devices and other hardware
In addition to multiple Azure Kinect DK devices, you may have to obtain additional host computers and other hardware in order to support the configuration you want to build. Use the information in this section to make sure that all of your devices and hardware are ready before you begin setting up.
### Azure Kinect DK devices
For each of the Azure Kinect DK devices that you want to synchronize, do the following:
- Ensure that the latest firmware is installed on the device. For more info about updating your devices, go to [Update Azure Kinect DK firmware](update-device-firmware.md).
- Remove the device cover to reveal the sync ports.
- Note the serial number for each device. You will use this number later in the setup process. (The sketch after this list shows one way to read serial numbers programmatically.)
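As a convenience, the following sketch (illustrative only; it assumes all devices are temporarily attached to one host) enumerates the connected devices and prints their serial numbers with the Sensor SDK:

```c
#include <k4a/k4a.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    uint32_t count = k4a_device_get_installed_count();
    printf("Found %u device(s)\n", count);

    for (uint32_t i = 0; i < count; i++)
    {
        k4a_device_t device = NULL;
        if (K4A_FAILED(k4a_device_open(i, &device)))
        {
            printf("Device %u: failed to open\n", i);
            continue;
        }

        // First call reports the required buffer size; second call fills it.
        size_t serial_size = 0;
        k4a_device_get_serialnum(device, NULL, &serial_size);

        char *serial = (char *)malloc(serial_size);
        if (serial != NULL &&
            k4a_device_get_serialnum(device, serial, &serial_size) == K4A_BUFFER_RESULT_SUCCEEDED)
        {
            printf("Device %u: serial number %s\n", i, serial);
        }

        free(serial);
        k4a_device_close(device);
    }

    return 0;
}
```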
### Host computers
Typically, each Azure Kinect DK uses its own host computer. You can use a dedicated host controller, depending on how you're using the device and the amount of data being transferred over USB.
Make sure that the Azure Kinect Sensor SDK is installed on each host computer. For more info on installing the Sensor SDK, go to [Quickstart: Set up your Azure Kinect DK](set-up-azure-kinect-dk.md).
#### Linux computers: USB memory on Ubuntu
By default, Linux-based host computers allocate the USB controller only 16 MB of kernel memory for handling USB transfers.
### Cables
To connect the devices to each other and to the host computers, you need 3.5-mm male-to-male cables (also known as 3.5-mm audio cable). The cables should be less than 10 meters long, and may be stereo or mono.
The number of cables that you need depends on the number of devices that you are using as well as the specific device configuration. The Azure Kinect DK box does not include cables—you must purchase them separately.
If you're connecting the devices in the star configuration, you also need one headphone splitter.
## Connect your devices
**To connect Azure Kinect DK devices in a daisy-chain configuration**
**To connect Azure Kinect DK devices in a star configuration**
1. Connect 3.5-mm audio cables to the "split" ends of the headphone splitter.
1. Plug the other end of each cable into the **Sync in** port of one of the subordinate devices.
## Calibrate the devices as a synchronized set
In a single device, the depth and RGB cameras are factory calibrated to work together. However, when multiple devices have to work together, they need to be calibrated in order to determine how to transform an image from the domain of the camera that captured it to the domain of the camera you want to use to process images.
There are multiple options for cross-calibrating devices. Microsoft provides the [GitHub green screen code sample](https://github.com/microsoft/Azure-Kinect-Sensor-SDK/tree/develop/examples/green_screen), which uses the OpenCV method. The Readme file for this code sample provides more details and instructions for calibrating the devices.
For more general information about calibration, see [Use Azure Kinect calibration functions](use-calibration-functions.md).
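For instance, each device's own factory calibration can be retrieved through the Sensor SDK as sketched below (the depth mode and color resolution are arbitrary example values); cross-device calibration, such as the green screen sample performs, builds on top of this per-device data:

```c
#include <k4a/k4a.h>
#include <stdio.h>

// Retrieve the factory calibration of a single device. It describes the
// intrinsics and extrinsics of that device's own cameras; the transform
// between different devices still has to be estimated separately.
int get_device_calibration(k4a_device_t device, k4a_calibration_t *calibration)
{
    if (K4A_FAILED(k4a_device_get_calibration(device,
                                              K4A_DEPTH_MODE_NFOV_UNBINNED,
                                              K4A_COLOR_RESOLUTION_720P,
                                              calibration)))
    {
        printf("Failed to get device calibration\n");
        return -1;
    }
    return 0;
}
```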
## Verify that the devices are connected and communicating
To verify that the devices are connected correctly, use [Azure Kinect Viewer](azure-kinect-viewer.md).
1. For each subordinate device in the chain, follow these steps. Start with the device furthest from the master device, and work back toward the master device.
> [!IMPORTANT]
> To get precise image capture alignment between all devices, you have to start the master device last.
1. Open an instance of Azure Kinect Viewer.
1. Under **Open Device**, select the serial number of the device that you want to open.
1. Under **External Sync**, select **Sub**.
When the master Azure Kinect DK device starts, the synchronized image from each of the subordinate devices appears in Azure Kinect Viewer.
1. For each of the subordinate devices, follow these steps.
> [!IMPORTANT]
> To get precise image capture alignment between all devices, you have to start the master device last.
1. Open an instance of Azure Kinect Viewer.
1. Under **Open Device**, select the serial number of the device that you want to open.
1. Under **External Sync**, select **Sub**.
When the master Azure Kinect DK device starts, the synchronized image from all of the devices appears in Azure Kinect Viewer.