
Commit d436280

Merge pull request #273 from luxonis/sync_python_cpp_2
Sync python-cpp examples v2
2 parents: b1504d6 + 10a5345

10 files changed: +194 −7 lines

docs/source/samples/depth_preview.rst

Lines changed: 6 additions & 0 deletions

@@ -5,11 +5,17 @@ This example shows how to set the SGBM (semi-global-matching) disparity-depth no
 over XLink to transfer the results to the host in real time, and displays the depth map in OpenCV.
 Note that disparity is used in this case, as it colorizes in a more intuitive way.
 Below is also a preview of using different median filters side-by-side on a depth image.
+There are 3 depth modes which you can select inside the code:
+
+#. `lr_check`: used for better occlusion handling. For more information `click here <https://docs.luxonis.com/en/latest/pages/faq/#left-right-check-depth-mode>`__
+#. `extended_disparity`: suitable for short-range objects. For more information `click here <https://docs.luxonis.com/en/latest/pages/faq/#extended-disparity-depth-mode>`__
+#. `subpixel`: suitable for long range. For more information `click here <https://docs.luxonis.com/en/latest/pages/faq/#subpixel-disparity-depth-mode>`__

 .. rubric:: Similar samples:

 - :ref:`RGB Preview`
 - :ref:`Mono Preview`
+- :ref:`Stereo Depth Video`

 Demo
 ####
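The three depth modes above change the disparity search range, which in turn changes the multiplier used when colorizing the disparity map. Below is a minimal sketch of that relationship, assuming the default 0..95 disparity search range described in the Luxonis FAQ and 3 fractional bits for subpixel mode; the helper is illustrative only, not part of the DepthAI API.

```python
# Illustrative helper (not part of the DepthAI API): estimate the maximum
# disparity value for each mode, assuming a default 0..95 search range.
def max_disparity(extended_disparity=False, subpixel=False, fractional_bits=3):
    base = 95  # default disparity search range (assumption)
    if extended_disparity:
        base *= 2  # extended_disparity doubles the search range
    if subpixel:
        base *= 2 ** fractional_bits  # subpixel adds fractional disparity bits
    return base

# Multiplier used to spread raw disparity over the full 0..255 display range
multiplier = 255 / max_disparity()
```

Under these assumptions, `extended_disparity` raises the maximum to 190 and `subpixel` to 760, so the normalization multiplier shrinks accordingly.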

docs/source/samples/object_tracker_video.rst

Lines changed: 8 additions & 0 deletions

@@ -33,4 +33,12 @@ Source code
          :language: python
          :linenos:

+   .. tab:: C++
+
+      Also `available on GitHub <https://github.com/luxonis/depthai-core/blob/main/examples/src/object_tracker_video.cpp>`__
+
+      .. literalinclude:: ../../../depthai-core/examples/src/object_tracker_video.cpp
+         :language: cpp
+         :linenos:
+
 .. include:: /includes/footer-short.rst

docs/source/samples/stereo_depth_from_host.rst

Lines changed: 10 additions & 2 deletions

@@ -1,10 +1,18 @@
 Stereo Depth from host
 ======================

-This example shows depth map from host using stereo images. There are 3 depth modes which you can select
-inside the code: left-right check, extended (for closer distance), subpixel (for longer distance).
+This example shows a depth map from the host using stereo images. There are 3 depth modes which you can select inside the code:
+
+#. `lr_check`: used for better occlusion handling. For more information `click here <https://docs.luxonis.com/en/latest/pages/faq/#left-right-check-depth-mode>`__
+#. `extended_disparity`: suitable for short-range objects. For more information `click here <https://docs.luxonis.com/en/latest/pages/faq/#extended-disparity-depth-mode>`__
+#. `subpixel`: suitable for long range. For more information `click here <https://docs.luxonis.com/en/latest/pages/faq/#subpixel-disparity-depth-mode>`__
+
 Otherwise a median filter with a 7x7 kernel is activated.

+.. rubric:: Similar samples:
+
+- :ref:`Stereo Depth Video`
+
 Setup
 #####
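The kernel_7x7 median mentioned above runs on-device, but its effect is easy to sketch on the host. Below is a naive NumPy reference implementation of a 7x7 median filter for illustration only; it is not the firmware's implementation.

```python
import numpy as np

def median_7x7(img):
    """Naive 7x7 median filter: for each pixel, take the median of its
    edge-padded 7x7 neighborhood. Illustration only; the real filter
    runs inside the StereoDepth node on-device."""
    k, pad = 7, 3
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = np.median(padded[y:y + k, x:x + k])
    return out
```

A single-pixel outlier in a depth map is completely suppressed by this filter, which is why a median is used to clean speckle noise in disparity/depth output.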

docs/source/samples/stereo_depth_video.rst

Lines changed: 45 additions & 0 deletions

@@ -0,0 +1,45 @@
+Stereo Depth Video
+==================
+
+This example is an upgraded :ref:`Depth Preview`. It has a higher resolution (720p), and each frame can be shown
+(mono left-right, rectified left-right, disparity and depth). There are 6 modes which you can select
+inside the code:
+
+#. `withDepth`: if you turn it off it will become :ref:`Mono Preview`, so it will show only the 2 mono cameras
+#. `outputDepth`: if you turn it on it will show the depth frames
+#. `outputRectified`: if you turn it on it will show the rectified left and right frames
+#. `lrcheck`: used for better occlusion handling. For more information `click here <https://docs.luxonis.com/en/latest/pages/faq/#left-right-check-depth-mode>`__
+#. `extended`: suitable for short-range objects. For more information `click here <https://docs.luxonis.com/en/latest/pages/faq/#extended-disparity-depth-mode>`__
+#. `subpixel`: suitable for long range. For more information `click here <https://docs.luxonis.com/en/latest/pages/faq/#subpixel-disparity-depth-mode>`__
+
+.. rubric:: Similar samples:
+
+- :ref:`Depth Preview`
+- :ref:`Stereo Depth from host`
+
+Setup
+#####
+
+.. include:: /includes/install_from_pypi.rst
+
+Source code
+###########
+
+.. tabs::
+
+   .. tab:: Python
+
+      Also `available on GitHub <https://github.com/luxonis/depthai-python/blob/main/examples/stereo_depth_video.py>`__
+
+      .. literalinclude:: ../../../examples/stereo_depth_video.py
+         :language: python
+         :linenos:
+
+   .. tab:: C++
+
+      Also `available on GitHub <https://github.com/luxonis/depthai-core/blob/main/examples/src/stereo_depth_video.cpp>`__
+
+      .. literalinclude:: ../../../depthai-core/examples/src/stereo_depth_video.cpp
+         :language: cpp
+         :linenos:
+
+.. include:: /includes/footer-short.rst

docs/source/tutorials/code_samples.rst

Lines changed: 1 addition & 0 deletions

@@ -35,6 +35,7 @@ Code samples are used for automated testing. They are also a great starting poin
 - :ref:`Mono Camera Control` - Demonstrates how to control the mono camera (crop, exposure, sensitivity) from the host
 - :ref:`Depth Crop Control` - Demonstrates how to control cropping of depth frames from the host
 - :ref:`Stereo Depth from host` - Generates stereo depth frame from a set of mono images from the host
+- :ref:`Stereo Depth Video` - An extended version of **Depth Preview**
 - :ref:`RGB Rotate Warp` - Demonstrates how to rotate, mirror, flip or perform perspective transform on a frame
 - :ref:`RGB Depth` - Displays RGB depth frames
 - :ref:`Auto Exposure on ROI` - Demonstrates how to use auto exposure based on the selected ROI

docs/source/tutorials/complex_samples.rst

Lines changed: 2 additions & 0 deletions

@@ -10,6 +10,7 @@ Complex
    ../samples/mono_camera_control.rst
    ../samples/depth_crop_control.rst
    ../samples/stereo_depth_from_host.rst
+   ../samples/stereo_depth_video.rst
    ../samples/rgb_rotate_warp.rst
    ../samples/rgb_depth_aligned.rst
    ../samples/autoexposure_roi.rst

@@ -34,6 +35,7 @@ If you want to see more interesting examples you should check out our `Experimen
 - :ref:`Mono Camera Control` - Demonstrates how to control the mono camera (crop, exposure, sensitivity) from the host
 - :ref:`Depth Crop Control` - Demonstrates how to control cropping of depth frames from the host
 - :ref:`Stereo Depth from host` - Generates stereo depth frame from a set of mono images from the host
+- :ref:`Stereo Depth Video` - An extended version of **Depth Preview**
 - :ref:`RGB Rotate Warp` - Demonstrates how to rotate, mirror, flip or perform perspective transform on a frame
 - :ref:`RGB Depth` - Displays RGB depth frames
 - :ref:`Auto Exposure on ROI` - Demonstrates how to use auto exposure based on the selected ROI

examples/CMakeLists.txt

Lines changed: 1 addition & 0 deletions

@@ -118,3 +118,4 @@ add_python_example(object_tracker object_tracker.py)
 add_python_example(spatial_object_tracker spatial_object_tracker.py)
 add_python_example(object_tracker_video object_tracker_video.py)
 add_python_example(stereo_depth_from_host stereo_depth_from_host.py)
+add_python_example(stereo_depth_video stereo_depth_video.py)

examples/object_tracker_video.py

Lines changed: 4 additions & 4 deletions

@@ -77,10 +77,10 @@
 with dai.Device(pipeline) as device:

     qIn = device.getInputQueue(name="inFrame")
-    trackerFrameQ = device.getOutputQueue("trackerFrame", 4)
-    tracklets = device.getOutputQueue("tracklets", 4)
-    qManip = device.getOutputQueue("manip", maxSize=4)
-    qDet = device.getOutputQueue("nn", maxSize=4)
+    trackerFrameQ = device.getOutputQueue(name="trackerFrame", maxSize=4)
+    tracklets = device.getOutputQueue(name="tracklets", maxSize=4)
+    qManip = device.getOutputQueue(name="manip", maxSize=4)
+    qDet = device.getOutputQueue(name="nn", maxSize=4)

     startTime = time.monotonic()
     counter = 0
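The change above is purely about call-site clarity: the positional `4` was already the queue's maximum size, and naming it makes that explicit and robust to parameter reordering. A small stand-in (a hypothetical function mirroring the calls in the diff, not the real `Device.getOutputQueue`) shows the equivalence:

```python
# Stand-in with an assumed (name, maxSize, blocking) parameter order,
# mirroring the calls in the diff; not the real DepthAI signature.
def get_output_queue(name, maxSize=30, blocking=True):
    return {"name": name, "maxSize": maxSize, "blocking": blocking}

positional = get_output_queue("trackerFrame", 4)            # before the change
keyword = get_output_queue(name="trackerFrame", maxSize=4)  # after the change
assert positional == keyword  # same behavior, clearer intent
```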

examples/stereo_depth_video.py

Lines changed: 116 additions & 0 deletions (new file)

#!/usr/bin/env python3

import cv2
import depthai as dai
import numpy as np

withDepth = True

outputDepth = False
outputRectified = True
lrcheck = True
extended = False
subpixel = False

# Create pipeline
pipeline = dai.Pipeline()

# Define sources and outputs
monoLeft = pipeline.createMonoCamera()
monoRight = pipeline.createMonoCamera()
if withDepth:
    stereo = pipeline.createStereoDepth()

xoutLeft = pipeline.createXLinkOut()
xoutRight = pipeline.createXLinkOut()
xoutDisp = pipeline.createXLinkOut()
xoutDepth = pipeline.createXLinkOut()
xoutRectifL = pipeline.createXLinkOut()
xoutRectifR = pipeline.createXLinkOut()

# XLinkOut
xoutLeft.setStreamName("left")
xoutRight.setStreamName("right")
if withDepth:
    xoutDisp.setStreamName("disparity")
    xoutDepth.setStreamName("depth")
    xoutRectifL.setStreamName("rectified_left")
    xoutRectifR.setStreamName("rectified_right")

# Properties
monoLeft.setResolution(dai.MonoCameraProperties.SensorResolution.THE_720_P)
monoLeft.setBoardSocket(dai.CameraBoardSocket.LEFT)
monoRight.setResolution(dai.MonoCameraProperties.SensorResolution.THE_720_P)
monoRight.setBoardSocket(dai.CameraBoardSocket.RIGHT)

if withDepth:
    # StereoDepth
    stereo.setConfidenceThreshold(200)
    stereo.setRectifyEdgeFillColor(0)  # black, to better see the cutout
    # stereo.setInputResolution(1280, 720)
    stereo.setMedianFilter(dai.StereoDepthProperties.MedianFilter.MEDIAN_OFF)
    stereo.setLeftRightCheck(lrcheck)
    stereo.setExtendedDisparity(extended)
    stereo.setSubpixel(subpixel)

    # Linking
    monoLeft.out.link(stereo.left)
    monoRight.out.link(stereo.right)

    stereo.syncedLeft.link(xoutLeft.input)
    stereo.syncedRight.link(xoutRight.input)
    stereo.disparity.link(xoutDisp.input)

    if outputRectified:
        stereo.rectifiedLeft.link(xoutRectifL.input)
        stereo.rectifiedRight.link(xoutRectifR.input)

    if outputDepth:
        stereo.depth.link(xoutDepth.input)

else:
    # Link plugins CAM -> XLINK
    monoLeft.out.link(xoutLeft.input)
    monoRight.out.link(xoutRight.input)

# Connect to device and start pipeline
with dai.Device(pipeline) as device:

    leftQueue = device.getOutputQueue(name="left", maxSize=8, blocking=False)
    rightQueue = device.getOutputQueue(name="right", maxSize=8, blocking=False)
    if withDepth:
        dispQueue = device.getOutputQueue(name="disparity", maxSize=8, blocking=False)
        depthQueue = device.getOutputQueue(name="depth", maxSize=8, blocking=False)
        rectifLeftQueue = device.getOutputQueue(name="rectified_left", maxSize=8, blocking=False)
        rectifRightQueue = device.getOutputQueue(name="rectified_right", maxSize=8, blocking=False)

        # Disparity range is used for normalization
        disparityMultiplier = 255 / stereo.getMaxDisparity()

    while True:
        left = leftQueue.get()
        cv2.imshow("left", left.getFrame())
        right = rightQueue.get()
        cv2.imshow("right", right.getFrame())

        if withDepth:
            disparity = dispQueue.get()
            disp = disparity.getCvFrame()
            disp = (disp * disparityMultiplier).astype(np.uint8)  # Normalize for display
            cv2.imshow("disparity", disp)
            disp = cv2.applyColorMap(disp, cv2.COLORMAP_JET)
            cv2.imshow("disparity_color", disp)

            if outputDepth:
                depth = depthQueue.get()
                cv2.imshow("depth", depth.getCvFrame().astype(np.uint16))

            if outputRectified:
                rectifL = rectifLeftQueue.get()
                cv2.imshow("rectified_left", rectifL.getFrame())

                rectifR = rectifRightQueue.get()
                cv2.imshow("rectified_right", rectifR.getFrame())

        if cv2.waitKey(1) == ord('q'):
            break
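The normalization step in the script above maps raw disparity onto the 0..255 range before colorizing. A host-only sketch of that step with a synthetic disparity map (NumPy only, assuming the default 0..95 search range; `cv2.applyColorMap` would then colorize `disp_u8` exactly as in the script):

```python
import numpy as np

# Synthetic disparity values standing in for StereoDepth output,
# assuming the default 0..95 disparity search range
disp = np.array([[0.0, 10.0], [47.0, 95.0]], dtype=np.float32)

# Same normalization as the script: scale so the maximum disparity maps to 255
disparityMultiplier = 255 / 95
disp_u8 = (disp * disparityMultiplier).astype(np.uint8)
```

If `extended` or `subpixel` were enabled, `getMaxDisparity()` would report a larger maximum and the multiplier would shrink, which is why the script queries the device rather than hard-coding 95.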
