WARNING: TFLite model inference is NOT supported on version 1.1.0-jazzy.
qrb_ros_nn_inference is a ROS2 package for performing neural network model inference, providing 🤖 AI-based perception for robotics applications. It provides:
- ✨ a model inference API that supports three model formats: `.tflite`, `.so`, `.bin`
- 🚀 model inference acceleration on Qualcomm platforms
qrb_ros_nn_inference is a ROS2 package built on qrb_inference_manager, a C++ library that encapsulates the APIs of Qualcomm AI Engine Direct and the QNN Delegate for TensorFlow Lite.
qrb_ros_nn_inference receives data from a specific topic and, without any processing, directly feeds the received data to model inference. The inference results are then published directly on another specific topic.
| Parameter | Type | Default Value | Description |
|---|---|---|---|
| backend_option | string | "" | Hardware acceleration option for model inference; valid values are listed here |
| model_path | string | "" | Path of model file |
| Topic Name | Message Type | Description |
|---|---|---|
| qrb_inference_input_tensor | TensorList | Subscribed topic carrying the input tensors for inference |
| qrb_inference_output_tensor | TensorList | Published topic carrying the inference result tensors |
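As an illustration of the data that flows over these topics, the sketch below mimics a `TensorList` message with plain Python dataclasses. The field names (`name`, `data_type`, `shape`, `data`) and the `make_float32_tensor` helper are assumptions for illustration only; the real message definitions live in the qrb_ros_interfaces repository.

```python
from dataclasses import dataclass, field
from typing import List
import struct

# Hypothetical stand-ins for the Tensor / TensorList messages;
# field names are assumed, not taken from the real .msg files.
@dataclass
class Tensor:
    name: str
    data_type: int     # e.g. an enum value denoting float32
    shape: List[int]   # e.g. [1, 3, 640, 640] for an NCHW input
    data: bytes        # raw tensor bytes, row-major

@dataclass
class TensorList:
    tensor_list: List[Tensor] = field(default_factory=list)

def make_float32_tensor(name, shape, values):
    """Pack a flat list of floats into a Tensor with the given shape."""
    n = 1
    for d in shape:
        n *= d
    assert len(values) == n, "value count must match the shape product"
    return Tensor(name=name, data_type=0, shape=shape,
                  data=struct.pack(f"<{n}f", *values))

msg = TensorList(tensor_list=[
    make_float32_tensor("input", [1, 2, 2], [0.1, 0.2, 0.3, 0.4])
])
```

The inference node treats this payload as opaque bytes, so the publisher is responsible for matching the model's expected shape and data type.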
Please see qrb_inference_manager APIs.
| Development Hardware | Hardware Overview |
|---|---|
| Qualcomm Dragonwing™ RB3 Gen2 | |
| Qualcomm Dragonwing™ IQ-9075 EVK | |
Important
PREREQUISITES: The following steps need to be run on Qualcomm Ubuntu and ROS Jazzy.
Reference Install Ubuntu on Qualcomm IoT Platforms and Install ROS Jazzy to set up the environment.
For Qualcomm Linux, please check out the Qualcomm Intelligent Robotics Product SDK documents.
Note: GPU hardware acceleration is not supported on Ubuntu Desktop.
Add Qualcomm IOT PPA for Ubuntu:

```shell
sudo add-apt-repository ppa:ubuntu-qcom-iot/qcom-noble-ppa
sudo add-apt-repository ppa:ubuntu-qcom-iot/qirp
sudo apt update
```

Install Debian package:
```shell
sudo apt install ros-jazzy-qrb-ros-nn-inference
```

1. install qrb_ros_nn_inference by the steps above.

2. prepare the pre-process node and post-process node for model inference
```shell
# qrb_ros_nn_inference/test includes the pre-process node and post-process node
mkdir -p ~/ros-ws/src && cd ~/ros-ws/src && \
git clone https://github.com/qualcomm-qrb-ros/qrb_ros_nn_inference
```

3. test qrb_ros_nn_inference with the YOLOv8 detection model
3.1 Please follow the guides to get the yolov8_det.tflite model. For running the model on RB3 Gen2, you can run:

```shell
python3 -m qai_hub_models.models.yolov8_det.export --target-runtime tflite --device "QCS6490 (Proxy)"
```

3.2 prepare an image for object detection
```shell
mkdir -p ~/ros-ws/src/qrb_ros_nn_inference/test/qrb_ros_pre_process/image/
cp /path/to/image.jpg ~/ros-ws/src/qrb_ros_nn_inference/test/qrb_ros_pre_process/image/
python3 ~/ros-ws/src/qrb_ros_nn_inference/test/qrb_ros_post_process/scripts/yolov8_input_pre_process.py
```
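The conversion from `image.jpg` to `image.raw` is done by `yolov8_input_pre_process.py` above. As a rough illustration of what such a step involves, here is a minimal NumPy sketch under the assumption that the YOLOv8 model expects a 640×640 RGB float32 tensor normalized to [0, 1]; the resize method and layout here are illustrative, not taken from the actual script.

```python
import numpy as np

def preprocess(image_u8: np.ndarray, size: int = 640) -> np.ndarray:
    """Nearest-neighbor resize an HxWx3 uint8 image and scale it to [0,1] float32."""
    h, w, _ = image_u8.shape
    rows = np.arange(size) * h // size     # source row index for each output row
    cols = np.arange(size) * w // size     # source column index for each output column
    resized = image_u8[rows][:, cols]      # (size, size, 3)
    tensor = resized.astype(np.float32) / 255.0   # normalize to [0, 1]
    return tensor[np.newaxis, ...]         # add batch dim -> (1, size, size, 3)

# Dummy image standing in for image.jpg
dummy = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
input_tensor = preprocess(dummy)
```

The resulting array could then be written out with `input_tensor.tofile("image.raw")` to produce a headerless raw dump like the one the pre-process node reads.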
3.3 point out the raw image path and model path in `~/ros-ws/src/qrb_ros_nn_inference/test/qrb_ros_post_process/launch/nn_node_test.launch.py`

```python
pre_process_node = ComposableNode(
    package = "qrb_ros_pre_process",
    plugin = "qrb_ros::pre_process::QrbRosPreProcessNode",
    name = "pre_process_node",
    parameters=[
        {
            "image_path": os.environ['HOME'] +
                "/ros-ws/src/qrb_ros_nn_inference/test/qrb_ros_pre_process/image/image.raw"
        }
    ]
)

nn_inference_node = ComposableNode(
    package = "qrb_ros_nn_inference",
    plugin = "qrb_ros::nn_inference::QrbRosInferenceNode",
    name = "nn_inference_node",
    parameters=[
        {
            "backend_option": "",
            "model_path": "/path/to/model"
        }
    ]
)
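For context, ComposableNodes like the ones above are typically loaded into a component container by the launch file. The sketch below shows one plausible way this could look; the container name and structure are assumptions for illustration, and the actual wiring is in `nn_node_test.launch.py`.

```python
# Hypothetical minimal launch file composing the two nodes into one container.
# Names here ("nn_inference_container") are illustrative, not from the real file.
from launch import LaunchDescription
from launch_ros.actions import ComposableNodeContainer
from launch_ros.descriptions import ComposableNode

def generate_launch_description():
    pre_process_node = ComposableNode(...)   # as defined above
    nn_inference_node = ComposableNode(...)  # as defined above

    container = ComposableNodeContainer(
        name="nn_inference_container",
        namespace="",
        package="rclcpp_components",
        executable="component_container",
        composable_node_descriptions=[pre_process_node, nn_inference_node],
    )
    return LaunchDescription([container])
```

Running both nodes in one container keeps the tensor hand-off in-process, which avoids serializing large tensors between the pre-process and inference steps.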
3.4 build the pre-process node and post-process node

```shell
source /opt/ros/jazzy/setup.bash && \
cd ~/ros-ws && \
rm ./src/qrb_ros_nn_inference/test/qrb_ros_post_process/COLCON_IGNORE && \
rm ./src/qrb_ros_nn_inference/test/qrb_ros_pre_process/COLCON_IGNORE && \
colcon build --packages-select qrb_ros_pre_process qrb_ros_post_process
```
3.5 execute the inference

```shell
cd ~/ros-ws && \
source install/local_setup.bash && \
ros2 launch qrb_ros_post_process nn_node_test.launch.py
```
You can see the result tensor in `~/ros-ws/src/qrb_ros_nn_inference/test/qrb_ros_post_process/inference_result`.
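If you want to inspect a result dump programmatically, something like the snippet below works under the assumption that the file is a headerless dump of little-endian float32 values; that layout is an assumption, so check the post-process scripts for the actual format.

```python
import numpy as np

def load_result(path: str) -> np.ndarray:
    # Assumption: raw little-endian float32 values with no header.
    return np.fromfile(path, dtype="<f4")

# Round-trip demo with a synthetic dump file standing in for a real result
demo = np.array([0.5, 0.25, 0.125], dtype="<f4")
demo.tofile("/tmp/inference_result_demo.bin")
loaded = load_result("/tmp/inference_result_demo.bin")
```

From there the flat array can be reshaped to the model's output shape for post-processing.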
Install dependencies:

```shell
sudo apt install -y software-properties-common
sudo add-apt-repository ppa:ubuntu-qcom-iot/qcom-noble-ppa
sudo apt update
sudo apt install -y libtensorflow-lite-c-qcom1 libtensorflow-lite-qcom-dev libqnn-dev libqnn1
```

Download the source code and build with colcon:
```shell
source /opt/ros/jazzy/setup.bash && \
mkdir -p ~/ros-ws/src && \
cd ~/ros-ws/src && \
git clone https://github.com/qualcomm-qrb-ros/qrb_ros_nn_inference && \
git clone https://github.com/qualcomm-qrb-ros/qrb_ros_interfaces && \
cd ~/ros-ws/ && \
colcon build --packages-up-to qrb_ros_nn_inference
```

We love community contributions! Get started by reading our CONTRIBUTING.md. Feel free to create an issue for bug reports, feature requests, or any discussion.
This project is licensed under the BSD-3-Clause License. See LICENSE for the full license text.

