DeepStream SDK Python bindings adapted for running YOLO inference on NVIDIA Jetson Nano devices


gpavelski/deepstream-run


DeepStream GStreamer Pipeline

This project provides a Python-based DeepStream GStreamer pipeline framework for NVIDIA Jetson devices. It supports multiple input sources, inference engines (YOLO, ResNet), and output sinks including RTSP, HLS, EGL display, file, and UDP. The pipeline is fully modular and configurable via command-line options.


Features

  • Flexible Source Options:

    • CSI camera
    • USB camera
    • RTSP streams
    • UDP streams
    • Local video files (.mp4, .mkv, .h264, .h265)
  • Inference Support:

    • GPU-accelerated DeepStream inference via nvinfer
    • No inference
  • Sink Options:

    • EGL display
    • RTSP streaming
    • HLS streaming
    • UDP streaming
    • File output (.mp4, .mkv, .h264)
    • Fake sink for testing
  • Customizable OSD:

    • Bounding boxes
    • Labels with confidence
    • Style management via StyleManager
  • Pipeline Builder:

    • Modular GStreamer element creation
    • Source, core, and sink elements separated for maintainability
    • Dynamic linking of elements
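
The source-selection behavior described above can be sketched in a few lines of Python. This is an illustrative helper only; the function name `classify_input` and the returned labels are not part of the project, which implements this logic inside its pipeline builder.

```python
def classify_input(uri: str) -> str:
    """Map an input argument onto a source type, mirroring the
    source options listed above (illustrative helper, not project code)."""
    if uri == "csi":
        return "csi-camera"
    if uri == "udp":
        return "udp-stream"
    if uri.startswith("rtsp://"):
        return "rtsp-stream"
    if uri.startswith("/dev/video"):
        return "usb-camera"
    if uri.endswith((".mp4", ".mkv", ".h264", ".h265")):
        return "file"
    raise ValueError(f"Unrecognized input source: {uri}")
```

For example, `classify_input("/dev/video0")` selects the USB camera path, while `classify_input("rtsp://192.168.0.2:8554/cam")` selects the RTSP source.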

Installation

Prerequisites

  • NVIDIA Jetson device (Nano, Xavier, Orin, etc.)
  • NVIDIA JetPack (CUDA, cuDNN, TensorRT)
  • NVIDIA DeepStream SDK
  • Python 3.6+
  • GStreamer and dependencies installed
  • DeepStream Python bindings (pyds) installed
  • DeepStream-Yolo's libnvdsinfer_custom_impl_Yolo.so built

Usage

usage: deepstream_run.py [-h] [-i INPUT_URI] [-o OUTPUT_URI] [-c {h264,h265}]
                         [-n] [-p OUTPORT] [-q INPORT] [-H UDP_HOST]
                         [-b BITRATE] [-f CONFIG_FILE] [-g] [-r {1,2,3,4,5,6}]

DeepStream-based video pipeline app

optional arguments:
  -h, --help            show this help message and exit
  -i INPUT_URI, --input INPUT_URI
                        Input source: rtsp://..., file path, 'csi', 'udp' or
                        /dev/videoX
  -o OUTPUT_URI, --output OUTPUT_URI
                        Output sink: 'rtsp', file path, 'hls', 'udp', or leave
                        empty for EGL display
  -c {h264,h265}, --codec {h264,h265}
                        Codec to be used
  -n, --noinfer         Disable Yolo inference
  -p OUTPORT, --port OUTPORT
                        Outbound port number [default: 5000]
  -q INPORT, --inport INPORT
                        Inbound port number [default: 5000]
  -H UDP_HOST, --host UDP_HOST
                        UDP stream host [default: 192.168.137.1]
  -b BITRATE, --bitrate BITRATE
                        Bitrate for UDP/RTSP/HLS outputs [default: 4000000]
  -f CONFIG_FILE, --config CONFIG_FILE
                        DeepStream config file path
  -g, --graph           Generate pipeline .dot file
  -r {1,2,3,4,5,6}, --resolution {1,2,3,4,5,6}
                        Camera resolution mode number: 1: 3264x2464 @ 21 fps
                        2: 3264x1848 @ 28 fps 3: 1920x1080 @ 30 fps 4:
                        1640x1232 @ 30 fps 5: 1280x720 @ 60 fps 6: 1280x720 @
                        120 fps
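
For reference, the camera resolution modes accepted by -r/--resolution can be tabulated as follows. This is an illustrative Python table; the actual option handling lives in deepstream_run.py.

```python
# Camera resolution modes from the -r/--resolution help text:
# mode -> (width, height, fps).
RESOLUTION_MODES = {
    1: (3264, 2464, 21),
    2: (3264, 1848, 28),
    3: (1920, 1080, 30),
    4: (1640, 1232, 30),
    5: (1280, 720, 60),
    6: (1280, 720, 120),
}

def resolution_for_mode(mode: int) -> tuple:
    """Return (width, height, fps) for a mode number; raises KeyError
    for modes outside 1-6."""
    return RESOLUTION_MODES[mode]
```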

Installing the DeepStream SDK

The NVIDIA DeepStream SDK is a framework for building AI-powered video analytics pipelines.
On Jetson devices, it can be installed using a pre-compiled Debian package (.deb).

This guide covers installing dependencies, downloading the correct package, and completing the installation.


📦 Step 1: Install DeepStream Dependencies

Before installing DeepStream itself, install the required libraries and development tools for GStreamer, security, and supporting components:

sudo apt install libssl-dev libgles2-mesa-dev libgstreamer1.0-0 gstreamer1.0-tools \
gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly \
gstreamer1.0-libav libgstreamer-plugins-base1.0-dev libgstrtspserver-1.0-0 \
libgstrtspserver-1.0-dev gstreamer1.0-rtsp v4l-utils libjansson4 libyaml-cpp-dev \
libjsoncpp-dev protobuf-compiler gcc make git python3

📥 Step 2: Download the Correct DeepStream Package

The DeepStream package is tied to your JetPack version and Jetson board.

  • Jetson Orin Nano NX (JetPack 6.1.x, t234, r36.4, Ubuntu 22.04):
wget https://repo.download.nvidia.com/jetson/common/pool/main/d/deepstream-7.1/deepstream-7.1_7.1.0-1_arm64.deb
  • Jetson Orin Nano (JetPack 5.1.x, t234, r35.2, Ubuntu 20.04):
wget https://repo.download.nvidia.com/jetson/common/pool/main/d/deepstream-6.3/deepstream-6.3_6.3.0-1_arm64.deb
  • Legacy Jetson Nano (JetPack 4.6.x, t210, Ubuntu 18.04):
wget https://repo.download.nvidia.com/jetson/common/pool/main/d/deepstream-6.0/deepstream-6.0_6.0.1-1_arm64.deb
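
The JetPack-to-package mapping above can be summarized in a small lookup table. This is an illustrative sketch keyed by JetPack major version; always confirm the exact package against NVIDIA's DeepStream/JetPack compatibility matrix for your board.

```python
BASE = "https://repo.download.nvidia.com/jetson/common/pool/main/d"

# JetPack major version -> DeepStream .deb package URL (from the list above).
DEB_FOR_JETPACK = {
    6: f"{BASE}/deepstream-7.1/deepstream-7.1_7.1.0-1_arm64.deb",
    5: f"{BASE}/deepstream-6.3/deepstream-6.3_6.3.0-1_arm64.deb",
    4: f"{BASE}/deepstream-6.0/deepstream-6.0_6.0.1-1_arm64.deb",
}
```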

βš™οΈ Step 3: Install the DeepStream Package

Once downloaded, install the .deb file with dpkg. Use the correct filename for your Jetson board:

  • Jetson Orin Nano NX:
sudo dpkg -i deepstream-7.1_7.1.0-1_arm64.deb
  • Jetson Orin Nano:
sudo dpkg -i deepstream-6.3_6.3.0-1_arm64.deb
  • Legacy Jetson Nano:
sudo dpkg -i deepstream-6.0_6.0.1-1_arm64.deb

✅ Installation Complete

If the installation completes without errors, the DeepStream SDK is now available on your Jetson device. You are ready to build and run YOLO-based perception applications or other AI-powered video analytics pipelines.

Building the DeepStream Python Bindings

The NVIDIA DeepStream SDK provides a powerful toolkit for building AI-powered video analytics applications.
While the core SDK is written in C/C++, Python bindings (pyds) are available so developers can leverage DeepStream with Python’s simplicity and ecosystem.

These bindings are not installed by default and must be compiled from source.
This guide walks you through building and installing the DeepStream Python bindings on a Jetson board.


📦 Prerequisites and Dependency Installation

Before compiling the bindings, install the required development tools, libraries, and Python packages.
The exact dependencies differ slightly depending on the device.

Jetson Orin Nano NX (Ubuntu 22.04, Python 3.10)

sudo apt install python3-gi python3-dev python3-gst-1.0 python-gi-dev git meson \
    python3 python3-pip python3-venv cmake g++ build-essential libglib2.0-dev \
    libglib2.0-dev-bin libgstreamer1.0-dev libtool m4 autoconf automake libgirepository-2.0-dev libcairo2-dev

Additional dependencies:

python3 -m pip install build cuda-python==<CUDA_VERSION>

Jetson Orin Nano NX (Ubuntu 20.04, Python 3.8)

sudo apt install python3-gi python3-dev python3-gst-1.0 python-gi-dev git \
python3 python3-pip python3.8-dev cmake g++ build-essential libglib2.0-dev \
libglib2.0-dev-bin libgstreamer1.0-dev libtool m4 autoconf automake \
libgirepository1.0-dev libcairo2-dev 

Legacy Jetson Nano (Ubuntu 18.04, Python 3.6)

sudo apt install -y git python-dev python3 python3-pip python3.6-dev python3.8-dev \
cmake g++ build-essential libglib2.0-dev libglib2.0-dev-bin python-gi-dev \
libtool m4 autoconf automake

⬆️ Upgrading Pip

Upgrade pip to avoid package installation issues:

python3 -m pip install --upgrade pip

📂 Cloning the DeepStream Python Apps Repository

Clone the official NVIDIA repository and move it into the DeepStream sources directory:

git clone https://github.com/NVIDIA-AI-IOT/deepstream_python_apps
sudo mv deepstream_python_apps/ /opt/nvidia/deepstream/deepstream/sources/

🔧 Initializing Submodules and Updating Certificates

Navigate to the repo:

cd /opt/nvidia/deepstream/deepstream/sources/deepstream_python_apps/

Checkout the correct version:

  • Jetson Orin Nano NX (DeepStream SDK 7.1)

    git checkout v1.2.0
  • Jetson Orin Nano NX (DeepStream SDK 6.3)

    git checkout v1.1.8
  • Legacy Jetson Nano (DeepStream SDK 6.0.1)

    git checkout v1.1.1

Initialize submodules:

git submodule update --init

For the newer releases, also run:

python3 bindings/3rdparty/git-partial-submodule/git-partial-submodule.py restore-sparse

Install tools and update certificates:

sudo apt-get install apt-transport-https ca-certificates -y
sudo update-ca-certificates

πŸ› οΈ Building and Installing gst-python

gst-python must be built before compiling the main bindings.

For DeepStream releases prior to 7.0:

cd 3rdparty/gst-python/
./autogen.sh
make
sudo make install

For DeepStream 7.0 and later:

cd bindings/3rdparty/gstreamer/subprojects/gst-python/
meson setup build
cd build
ninja
sudo ninja install

βš™οΈ Compiling the DeepStream Python Bindings (pyds)

Navigate to the bindings directory:

cd /opt/nvidia/deepstream/deepstream/sources/deepstream_python_apps/bindings

Then build the wheel for your platform:

  • Jetson Orin Nano NX (Python 3.10)

    export CMAKE_BUILD_PARALLEL_LEVEL=$(nproc)
    python3 -m build

This generates a Python wheel (.whl) file in the dist directory, e.g. pyds-1.2.0-cp310-cp310-linux_aarch64.whl.

For the older releases, configure the build with CMake instead:

  • Jetson Orin Nano NX (Python 3.8)

    cmake -S . -B build -DPYTHON_MAJOR_VERSION=3 -DPYTHON_MINOR_VERSION=8 -DPIP_PLATFORM=linux_aarch64
  • Legacy Jetson Nano (Python 3.6)

    cmake -S . -B build -DPYTHON_MAJOR_VERSION=3 -DPYTHON_MINOR_VERSION=6 -DPIP_PLATFORM=linux_aarch64

Then build:

cmake --build build

This generates a Python wheel (.whl) file in the build directory, e.g.:

  • pyds-1.1.8-py3-none-linux_aarch64.whl (Orin Nano NX)
  • pyds-1.1.1-py3-none-linux_aarch64.whl (Legacy Nano)
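
A wheel only installs if its Python tag matches your interpreter: cpXY wheels (e.g. cp310) require exactly that CPython version, while py3-none wheels work on any Python 3. The check can be sketched as follows; `wheel_matches_interpreter` is an illustrative helper, not part of the build tooling.

```python
import re

def wheel_matches_interpreter(wheel_name: str, py_major: int, py_minor: int) -> bool:
    """Check whether a pyds wheel's Python tag is installable by the
    given interpreter version (illustrative helper)."""
    m = re.match(r"pyds-[\d.]+-(py3|cp(\d)(\d+))-", wheel_name)
    if m is None:
        return False
    if m.group(1) == "py3":
        # Pure-Python tag: any Python 3 interpreter will do.
        return py_major == 3
    # cpXY tag: major and minor must match exactly.
    return (int(m.group(2)), int(m.group(3))) == (py_major, py_minor)
```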

📥 Installing the Compiled Bindings

Install the generated wheel file with pip3:

  • Orin Nano NX (PyDS v1.2.0)

    pip3 install /opt/nvidia/deepstream/deepstream/sources/deepstream_python_apps/bindings/dist/pyds-1.2.0-cp310-cp310-linux_aarch64.whl
  • Orin Nano NX (PyDS v1.1.8)

    pip3 install /opt/nvidia/deepstream/deepstream/sources/deepstream_python_apps/bindings/build/pyds-1.1.8-py3-none-linux_aarch64.whl
  • Legacy Nano (PyDS v1.1.1)

    pip3 install /opt/nvidia/deepstream/deepstream/sources/deepstream_python_apps/bindings/build/pyds-1.1.1-py3-none-linux_aarch64.whl

✅ Verifying the Installation

Run Python and import pyds:

python3
import pyds
print(pyds.__version__)  # optional

If no errors appear, the DeepStream Python bindings were successfully installed. You can now explore the sample apps in:

/opt/nvidia/deepstream/deepstream/sources/deepstream_python_apps/apps/
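
For a scripted check (e.g. in a setup script) you can test that the bindings are locatable without starting a REPL. This is a generic stdlib-based sketch; `binding_available` is a hypothetical helper, not part of pyds.

```python
import importlib.util

def binding_available(module_name: str) -> bool:
    """Return True if a module (e.g. 'pyds') can be located by the
    current interpreter, without actually importing it."""
    return importlib.util.find_spec(module_name) is not None
```

After a successful installation, `binding_available("pyds")` should return True on the Jetson device.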

Building the DeepStream-Yolo Library

The NVIDIA DeepStream SDK is a powerful framework for video analytics, but it does not natively include parsers for all object detection models.

The DeepStream-Yolo project is a widely used third-party library that provides a custom implementation, enabling DeepStream to perform inference with different versions of the YOLO (You Only Look Once) model architecture.

This guide walks you through cloning, configuring, and compiling the library.


📦 Prerequisites

Before proceeding, ensure that:

  • The NVIDIA DeepStream SDK is installed on your Jetson board.
  • Build tools (make, g++, etc.) are installed (already covered in DeepStream Python Bindings).
  • You have an active internet connection to clone the GitHub repository.

🌀 Step 1: Clone the DeepStream-Yolo Repository

Navigate to your desired working directory and clone the official repository:

cd ~
git clone https://github.com/marcoslucianops/DeepStream-Yolo.git
cd DeepStream-Yolo

βš™οΈ Step 2: Set the CUDA Version Environment Variable

The build process requires the CUDA version installed with your DeepStream SDK. You must export an environment variable named CUDA_VER.

  • Jetson Orin Nano NX (JetPack 6.1.x) → CUDA 12.6
  • Jetson Orin Nano NX (JetPack 5.1.x) → CUDA 11.4
  • Legacy Jetson Nano (JetPack 4.6.x) → CUDA 10.2

Run the appropriate command:

# For new Jetson Orin Nano NX
export CUDA_VER=12.6

# For Jetson Orin Nano NX
export CUDA_VER=11.4

# For Legacy Jetson Nano
export CUDA_VER=10.2

⚠️ Note: This setting only applies to the current terminal session. If you open a new terminal, you will need to re-run the export command.
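
The JetPack-to-CUDA mapping above can be captured in a small table, for example when generating the export command from a provisioning script. This is an illustrative sketch; verify the actual CUDA version on your device with nvcc --version.

```python
# JetPack release -> CUDA version shipped with the matching DeepStream SDK,
# per the list above (illustrative mapping).
CUDA_VER_FOR_JETPACK = {
    "6.1": "12.6",
    "5.1": "11.4",
    "4.6": "10.2",
}

def cuda_export_line(jetpack: str) -> str:
    """Build the CUDA_VER export command for a given JetPack release."""
    return f"export CUDA_VER={CUDA_VER_FOR_JETPACK[jetpack]}"
```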


🔨 Step 3: Compile the Library

Run the following command to clean and compile the library:

make -C nvdsinfer_custom_impl_Yolo clean && make -C nvdsinfer_custom_impl_Yolo

This does two things:

  1. make -C nvdsinfer_custom_impl_Yolo clean → Removes old build artifacts.
  2. make -C nvdsinfer_custom_impl_Yolo → Compiles and links the source into a shared library.

✅ Step 4: Verify the Build

After compilation, verify that the shared library exists:

ls -l nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so

Example output:

-rwxrwxr-x 1 jetson jetson 1202552 Mar 14 16:10 nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so

The file:

nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so

is the compiled shared library required by DeepStream to correctly parse YOLO model outputs.

Copy the libnvdsinfer_custom_impl_Yolo.so file generated by the DeepStream-Yolo build into the yolo sub-folder of deepstream-run:

cp ~/DeepStream-Yolo/nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so yolo/
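
The nvinfer configuration file (passed to deepstream_run.py via -f/--config) must then point at this library. A typical DeepStream-Yolo [property] fragment looks like the sketch below; the path and model-specific values are examples and should be adjusted to your setup:

```ini
[property]
# Path to the custom parser library built above (relative paths are
# resolved against the config file's location; example path).
custom-lib-path=yolo/libnvdsinfer_custom_impl_Yolo.so
# Custom bounding-box parser exported by DeepStream-Yolo.
parse-bbox-func-name=NvDsInferParseYolo
```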
