
Commit 63d20f2

Merge pull request #917 from jakaskerl/main
Adding code examples in docs
2 parents: 600e863 + e5f2c09

12 files changed, +764 −4 lines changed

docs/source/components/device.rst

Lines changed: 115 additions & 3 deletions
@@ -55,6 +55,117 @@ subnet, you can specify the device (either with MxID, IP, or USB port name) you
        with depthai.Device(pipeline, device_info) as device:
            # ...

Watchdog
########

Understanding the Watchdog Mechanism in POE Devices
----------------------------------------------------

The watchdog is a crucial component in the operation of POE (Power over Ethernet) devices with DepthAI. When DepthAI disconnects from a POE device, the watchdog mechanism is the first to respond, initiating a reset of the camera. This reset is followed by a complete system reboot, which includes loading the DepthAI bootloader and initializing the entire networking stack.

.. note::

    This process is necessary to make the camera available for reconnection and typically takes about 10 seconds, so the fastest possible reconnection time is about 10 seconds.

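Because of this reboot cycle, reconnection attempts made immediately after a disconnect will fail for roughly 10 seconds. A minimal retry sketch (illustrative only; it simply waits and retries until the device is reachable again) could look like this:

.. code-block:: python

    import time
    import depthai

    pipeline = depthai.Pipeline()
    # ... define your nodes here ...

    while True:
        try:
            with depthai.Device(pipeline) as device:
                # Normal processing loop; leaving this block (e.g. on disconnect)
                # lands back in the retry loop below
                while not device.isClosed():
                    time.sleep(1)
        except RuntimeError as err:
            # Device is still rebooting (or unreachable) - wait and retry
            print(f"Device not available yet ({err}), retrying in 5 seconds")
            time.sleep(5)
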
Customizing the Watchdog Timeout
--------------------------------

.. tabs::

    .. tab:: **Linux/MacOS**

        Set the environment variables `DEPTHAI_WATCHDOG_INITIAL_DELAY` and `DEPTHAI_BOOTUP_TIMEOUT` to your desired timeout values (in milliseconds) as follows:

        .. code-block:: bash

            DEPTHAI_WATCHDOG_INITIAL_DELAY=<my_value> DEPTHAI_BOOTUP_TIMEOUT=<my_value> python3 script.py

    .. tab:: **Windows PowerShell**

        For Windows PowerShell, set the environment variables like this:

        .. code-block:: powershell

            $env:DEPTHAI_WATCHDOG_INITIAL_DELAY=<my_value>
            $env:DEPTHAI_BOOTUP_TIMEOUT=<my_value>
            python3 script.py

    .. tab:: **Windows CMD**

        In Windows CMD, you can set the environment variables as follows:

        .. code-block:: bat

            set DEPTHAI_WATCHDOG_INITIAL_DELAY=<my_value>
            set DEPTHAI_BOOTUP_TIMEOUT=<my_value>
            python3 script.py

Code-Based Configuration
------------------------

Alternatively, you can set the timeout directly in your code:

.. code-block:: python

    pipeline = depthai.Pipeline()

    # Create a BoardConfig object
    config = depthai.BoardConfig()

    # Set the parameters
    config.watchdogInitialDelayMs = <my_value>
    config.watchdogTimeoutMs = <my_value>

    pipeline.setBoardConfig(config)

By adjusting these settings, you can tailor the watchdog functionality to better suit your specific requirements.

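For reference, a complete minimal sketch with illustrative values (60 s initial delay and a 10 s timeout, chosen only for demonstration) could look like this:

.. code-block:: python

    import depthai

    pipeline = depthai.Pipeline()
    # ... define your nodes here ...

    config = depthai.BoardConfig()
    config.watchdogInitialDelayMs = 60000  # illustrative: watchdog starts after 60 s
    config.watchdogTimeoutMs = 10000       # illustrative: reset if unresponsive for 10 s
    pipeline.setBoardConfig(config)

    with depthai.Device(pipeline) as device:
        pass  # the device now runs with the custom watchdog settings
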
Environment Variables
#####################

The following table lists various environment variables used in the system, along with their descriptions:

.. list-table::
    :widths: 50 50
    :header-rows: 1

    * - Environment Variable
      - Description
    * - `DEPTHAI_LEVEL`
      - Sets logging verbosity, options: 'trace', 'debug', 'warn', 'error', 'off'
    * - `XLINK_LEVEL`
      - Sets logging verbosity of the XLink library, options: 'debug', 'info', 'warn', 'error', 'fatal', 'off'
    * - `DEPTHAI_INSTALL_SIGNAL_HANDLER`
      - Set to 0 to disable installing the Backward signal handler for stack trace printing
    * - `DEPTHAI_WATCHDOG`
      - Sets the device watchdog timeout. Useful for debugging (`DEPTHAI_WATCHDOG=0`) to prevent the device from resetting while the process is paused.
    * - `DEPTHAI_WATCHDOG_INITIAL_DELAY`
      - Specifies the delay after which the device watchdog starts.
    * - `DEPTHAI_SEARCH_TIMEOUT`
      - Specifies the timeout in milliseconds for device searching in blocking functions.
    * - `DEPTHAI_CONNECT_TIMEOUT`
      - Specifies the timeout in milliseconds for establishing a connection to a given device.
    * - `DEPTHAI_BOOTUP_TIMEOUT`
      - Specifies the timeout in milliseconds for waiting for the device to boot after sending the binary.
    * - `DEPTHAI_PROTOCOL`
      - Restricts the default search to the specified protocol. Options: any, usb, tcpip.
    * - `DEPTHAI_DEVICE_MXID_LIST`
      - Restricts the default search to the specified MXIDs. Accepts a comma-separated list of MXIDs. Lists filter results in an "AND" manner, not "OR".
    * - `DEPTHAI_DEVICE_ID_LIST`
      - Alias to the MXID list. Lists filter results in an "AND" manner, not "OR".
    * - `DEPTHAI_DEVICE_NAME_LIST`
      - Restricts the default search to the specified NAMEs. Accepts a comma-separated list of NAMEs. Lists filter results in an "AND" manner, not "OR".
    * - `DEPTHAI_DEVICE_BINARY`
      - Overrides the device Firmware binary. Mostly for internal debugging purposes.
    * - `DEPTHAI_BOOTLOADER_BINARY_USB`
      - Overrides the device USB Bootloader binary. Mostly for internal debugging purposes.
    * - `DEPTHAI_BOOTLOADER_BINARY_ETH`
      - Overrides the device Network Bootloader binary. Mostly for internal debugging purposes.

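These variables are read from the process environment, so besides exporting them in the shell you can also set them from Python before the library is used. The sketch below assumes this is equivalent to exporting them in the shell:

.. code-block:: python

    import os

    # Set before importing/using depthai so the library picks the values up
    os.environ["DEPTHAI_LEVEL"] = "debug"     # verbose library logging
    os.environ["DEPTHAI_PROTOCOL"] = "tcpip"  # only search for POE devices

    import depthai

    print(depthai.Device.getAllAvailableDevices())
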
Multiple devices
################

@@ -120,7 +231,7 @@ Two examples would be:
- encoding (most prominently H264/H265 as frame drops can lead to artifacts).

Blocking behaviour
-******************
+------------------

By default, queues are **blocking** and their size is **30**, so when the device fills up a queue and when the limit is
reached, any additional messages from the device will be blocked and the library will wait until it can add new messages to the queue.
@@ -133,7 +244,8 @@ It will wait for the host to consume (eg. :code:`queue.get()`) a message before
to be empty again.

Non-Blocking behaviour
-**********************
+----------------------

Making the queue non-blocking will change the behavior in the situation described above - instead of waiting, the library will discard
the oldest message and add the new one to the queue, and then continue its processing loop (so it won't get blocked).
:code:`maxSize` determines the size of the queue and it also helps to control memory usage.
@@ -142,7 +254,7 @@ For example, if a message has 5MB of data, and the queue size is 30, this queue
up to 150MB of data in the memory on the host (the messages can also get really big, for instance, a single 4K NV12 encoded frame takes about 12MB).

Some additional information
-***************************
+---------------------------

- Decreasing the queue size to 1 and setting non-blocking behavior will effectively mean "I only want the latest packet from the queue" (see the sketch below).
- Queues are thread-safe - they can be accessed from any thread.
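
A minimal sketch of the "latest packet only" pattern mentioned above (the :code:`rgb` stream name is just an example):

.. code-block:: python

    import depthai

    pipeline = depthai.Pipeline()
    cam = pipeline.create(depthai.node.ColorCamera)
    xout = pipeline.create(depthai.node.XLinkOut)
    xout.setStreamName("rgb")
    cam.video.link(xout.input)

    with depthai.Device(pipeline) as device:
        # maxSize=1 + blocking=False -> the queue always holds only the newest frame
        q = device.getOutputQueue(name="rgb", maxSize=1, blocking=False)
        frame = q.get().getCvFrame()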
Lines changed: 62 additions & 0 deletions
@@ -0,0 +1,62 @@
UVC
===

The DepthAI UVC (`USB Video Class <https://en.wikipedia.org/wiki/USB_video_device_class>`__) node allows OAK devices to function as standard webcams. This feature is particularly useful for integrating OAK devices into applications that require video input, such as video conferencing tools or custom video processing applications.

What is UVC?
############

UVC refers to the USB Video Class standard, a USB device class that describes devices capable of streaming video. This standard allows video devices to interface with computers and other devices without needing specific drivers, making them immediately compatible with a wide range of systems and software.

How Does the UVC Node Work?
###########################

The UVC node in DepthAI leverages this standard to stream video from OAK devices. When the UVC node is enabled, the OAK device is recognized as a standard webcam by the host system. This allows the device to be used in any application that supports webcam input, such as Zoom, Skype, or custom video processing software.

The UVC node streams video data over a USB connection. It is important to use a USB3 cable for this purpose, as USB2 may not provide the necessary bandwidth for stable video streaming.

.. note::

    The UVC node can currently handle NV12 video streams from OAK devices. For streams in other formats, conversion to NV12 is necessary, which can be achieved using the :ref:`ImageManip` node. Streams that cannot be converted to NV12, such as depth streams, are not supported by the UVC node.

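As an illustration of the wiring, a minimal sketch that exposes the color camera as a webcam could look like the following (the examples linked below are the complete, tested versions):

.. code-block:: python

    import time
    import depthai

    pipeline = depthai.Pipeline()

    # The ColorCamera 'video' output is already NV12, so it can feed the UVC node directly
    cam = pipeline.create(depthai.node.ColorCamera)
    cam.setResolution(depthai.ColorCameraProperties.SensorResolution.THE_1080_P)

    uvc = pipeline.create(depthai.node.UVC)
    cam.video.link(uvc.input)

    with depthai.Device(pipeline) as device:
        print("UVC stream running - open the device in any webcam application")
        while not device.isClosed():
            time.sleep(1)
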
Examples of UVC Node Usage
##########################

1. **DepthAI Demo Script**: The DepthAI demo script includes a UVC application that can be run to enable the UVC node on an OAK device.

   .. code-block:: bash

      python3 depthai_demo.py --app uvc

2. **Custom Python Script**: A custom Python script can be written to enable the UVC node and configure the video stream parameters. Here are some pre-written examples:

   - :ref:`UVC & Color Camera`
   - :ref:`UVC & Mono Camera`
   - :ref:`UVC & Disparity`

3. **OBS Forwarding**: For applications where direct UVC node usage is not possible, OBS Studio can be used to forward the UVC stream.

Reference
#########

.. tabs::

    .. tab:: Python

        .. autoclass:: depthai.node.UVC
            :members:
            :inherited-members:
            :noindex:

    .. tab:: C++

        .. doxygenclass:: dai::node::UVC
            :project: depthai-core
            :members:
            :private-members:
            :undoc-members:

.. include:: ../../includes/footer-short.rst
Lines changed: 37 additions & 0 deletions
@@ -0,0 +1,37 @@
Feature Tracker with Motion Estimation
======================================

This example demonstrates the capabilities of the :ref:`FeatureTracker` combined with motion estimation. It detects and tracks features between consecutive frames using optical flow.
Each feature is assigned a unique ID. The motion of the camera is estimated based on the tracked features, and the estimated motion (e.g., Up, Down, Left, Right, Rotating) is displayed on screen.

The :ref:`Feature Detector` example only detects features without estimating motion.

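To give an idea of how such host-side motion estimation can work, here is a simplified, hypothetical sketch that averages the displacement of features tracked across two consecutive frames and maps it to a direction (the example's actual logic may differ, and rotation detection is omitted for brevity):

.. code-block:: python

    import numpy as np

    def classify_motion(prev_pts, curr_pts, threshold=1.0):
        """prev_pts / curr_pts map feature ID -> (x, y) in two consecutive frames."""
        common_ids = prev_pts.keys() & curr_pts.keys()
        if not common_ids:
            return "Unknown"
        deltas = np.array([np.subtract(curr_pts[i], prev_pts[i]) for i in common_ids])
        dx, dy = deltas.mean(axis=0)
        if abs(dx) < threshold and abs(dy) < threshold:
            return "Still"
        # Features drift opposite to the camera motion (features moving left => camera panning right)
        if abs(dx) > abs(dy):
            return "Right" if dx < 0 else "Left"
        return "Down" if dy < 0 else "Up"

    print(classify_motion({1: (100, 100), 2: (200, 150)},
                          {1: (90, 100), 2: (190, 150)}))  # -> Right
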
Demo
####

.. image:: ../../../../docs/source/_static/images/examples/feature_motion_estimation.gif
    :alt: Feature Tracker with Motion Estimation Demo
    :width: 100%
    :align: center

Setup
#####

.. include:: /includes/install_from_pypi.rst

Source code
###########

.. tabs::

    .. tab:: Python

        Also `available on GitHub <https://github.com/luxonis/depthai-python/blob/main/examples/FeatureTracker/feature_motion_estimation.py>`__

        .. literalinclude:: ../../../../examples/FeatureTracker/feature_motion_estimation.py
            :language: python
            :linenos:

.. include:: /includes/footer-short.rst
Lines changed: 70 additions & 0 deletions
@@ -0,0 +1,70 @@
UVC & Disparity
===============

This example demonstrates how to use your OAK device as a UVC webcam. The UVC feature allows you to use your OAK device as a regular webcam in applications like OpenCV's :code:`cv2.VideoCapture()`, native camera apps, and more.

.. rubric:: How It Works:

The :ref:`StereoDepth` node outputs disparity frames in the UINT8 (GRAY8) format. However, the :ref:`UVC` node expects data in the NV12 format. To bridge this gap, an intermediary :ref:`ImageManip` node is used to convert the GRAY8 disparity output from the StereoDepth node to NV12, which is then passed to the UVC node for streaming.
This doesn't work with the stereo depth output, since depth is UINT16, which cannot be converted to NV12.

This example also won't work if the subpixel disparity feature is enabled, since that outputs UINT16 as well.

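A hedged sketch of the pipeline wiring described above (the complete, tested script is included under Source code below):

.. code-block:: python

    import depthai

    pipeline = depthai.Pipeline()

    mono_left = pipeline.create(depthai.node.MonoCamera)
    mono_right = pipeline.create(depthai.node.MonoCamera)
    mono_left.setBoardSocket(depthai.CameraBoardSocket.CAM_B)
    mono_right.setBoardSocket(depthai.CameraBoardSocket.CAM_C)

    stereo = pipeline.create(depthai.node.StereoDepth)
    mono_left.out.link(stereo.left)
    mono_right.out.link(stereo.right)

    # Convert the GRAY8 disparity frames to NV12 so the UVC node can stream them
    manip = pipeline.create(depthai.node.ImageManip)
    manip.initialConfig.setFrameType(depthai.ImgFrame.Type.NV12)
    stereo.disparity.link(manip.inputImage)

    uvc = pipeline.create(depthai.node.UVC)
    manip.out.link(uvc.input)
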
.. rubric:: Similar samples:

- :ref:`UVC & Color Camera`
- :ref:`UVC & Mono Camera`

Setup
#####

.. include:: /includes/install_from_pypi.rst

Code used for testing
#####################

.. code-block:: python

    import cv2

    # Open the OAK UVC stream. Index 1 is used here because index 0 is usually
    # the host's built-in webcam; adjust the index for your setup.
    cap = cv2.VideoCapture(1)

    # Check if the camera opened successfully
    if not cap.isOpened():
        print("Error: Could not open camera.")
        exit()

    # Loop to continuously get frames from the camera
    while True:
        ret, frame = cap.read()

        if not ret:
            print("Error: Could not read frame.")
            break

        cv2.imshow('Video Feed', frame)

        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cap.release()
    cv2.destroyAllWindows()

Source code
###########

.. tabs::

    .. tab:: Python

        Also `available on GitHub <https://github.com/luxonis/depthai-python/blob/main/examples/UVC/uvc_disparity.py>`__

        .. literalinclude:: ../../../../examples/UVC/uvc_disparity.py
            :language: python
            :linenos:

.. include:: /includes/footer-short.rst
Lines changed: 69 additions & 0 deletions
@@ -0,0 +1,69 @@
UVC & Mono Camera
=================

This example demonstrates how to use a mono camera on your OAK device as a webcam. The UVC feature allows you to use your OAK device as a regular webcam in applications like OpenCV's :code:`cv2.VideoCapture()`, native camera apps, and more.

.. rubric:: How It Works:

The :ref:`MonoCamera` node outputs image data in the GRAY8 format. However, the :ref:`UVC` node expects the data in NV12 format. To bridge this gap, an intermediary :ref:`ImageManip` node is used to convert the GRAY8 output from the MonoCamera node to NV12 format, which is then passed to the UVC node for streaming.

.. rubric:: Similar samples:

- :ref:`UVC & Color Camera`
- :ref:`UVC & Disparity`

Setup
#####

.. include:: /includes/install_from_pypi.rst

Code used for testing
#####################

.. code-block:: python

    import cv2

    # Open the OAK UVC stream. Index 1 is used here because index 0 is usually
    # the host's built-in webcam; adjust the index for your setup.
    cap = cv2.VideoCapture(1)

    # Check if the camera opened successfully
    if not cap.isOpened():
        print("Error: Could not open camera.")
        exit()

    # Loop to continuously get frames from the camera
    while True:
        ret, frame = cap.read()

        if not ret:
            print("Error: Could not read frame.")
            break

        cv2.imshow('Video Feed', frame)

        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cap.release()
    cv2.destroyAllWindows()

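The OAK device may enumerate at a different index on your host (index 0 is often the built-in webcam). A small probe like the following, which is not part of the example itself, can help find the right index:

.. code-block:: python

    import cv2

    # Try the first few camera indices and report which ones deliver frames
    for index in range(4):
        cap = cv2.VideoCapture(index)
        if cap.isOpened():
            ok, _ = cap.read()
            print(f"Camera index {index}: {'OK' if ok else 'opened, but no frame'}")
        cap.release()
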
Source code
###########

.. tabs::

    .. tab:: Python

        Also `available on GitHub <https://github.com/luxonis/depthai-python/blob/main/examples/UVC/uvc_mono.py>`__

        .. literalinclude:: ../../../../examples/UVC/uvc_mono.py
            :language: python
            :linenos:

.. include:: /includes/footer-short.rst
