Description
I am trying to run the TensorRT samples included in the zip file I downloaded from here.
I was able to run sampleOnnxMNIST successfully by opening the solution with VS2022 and building the project from there.
I then tried to run the Python sample yolov3_onnx, without success.
Environment
TensorRT Version: 10.5.0.18
NVIDIA GPU: Quadro T1000
NVIDIA Driver Version: 553.50
CUDA Version: 12.2
CUDNN Version: 8.9
Operating System: Windows 11 Pro 23H2
Python Version (if applicable): 3.11.9
Relevant Files
TensorRT download link: https://developer.nvidia.com/downloads/compute/machine-learning/tensorrt/10.5.0/zip/TensorRT-10.5.0.18.Windows.win10.cuda-12.6.zip
Steps To Reproduce
I set up a Python virtual environment and installed all the requirements.
I managed to convert the .weights file into ONNX format with `python .\yolov3_to_onnx.py -d .`.
When launching the command `python .\onnx_to_tensorrt.py -d .`, an error occurs for which I could not find a solution. The traceback is shown below:
```
[01/14/2025-11:25:28] [TRT] [E] createInferBuilder: Error Code 6: API Usage Error (Unable to load library: nvinfer_builder_resource_10.dll)
Traceback (most recent call last):
  File "E:\Libraries\TensorRT-10.5.0.18\samples\python\yolov3_onnx\onnx_to_tensorrt.py", line 211, in <module>
    main()
  File "E:\Libraries\TensorRT-10.5.0.18\samples\python\yolov3_onnx\onnx_to_tensorrt.py", line 148, in main
    with get_engine(
         ^^^^^^^^^^^
  File "E:\Libraries\TensorRT-10.5.0.18\samples\python\yolov3_onnx\onnx_to_tensorrt.py", line 124, in get_engine
    return build_engine()
           ^^^^^^^^^^^^^^
  File "E:\Libraries\TensorRT-10.5.0.18\samples\python\yolov3_onnx\onnx_to_tensorrt.py", line 77, in build_engine
    with trt.Builder(TRT_LOGGER) as builder, builder.create_network(
         ^^^^^^^^^^^^^^^^^^^^^^^
TypeError: pybind11::init(): factory function returned nullptr
```
I also tried to copy nvinfer_builder_resource_10.dll into the same directory as the Python files, but nothing changed.
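For reference, here is the kind of check I ran while debugging. It is only a sketch: since Python 3.8 on Windows, dependent DLLs are not resolved through `PATH` alone, so a directory may need to be registered explicitly via `os.add_dll_directory` before importing `tensorrt`. The install root below is taken from the traceback; the helper name is mine.

```python
import os
import sys
from pathlib import Path

def register_tensorrt_dlls(install_root):
    """Return True if nvinfer_builder_resource_10.dll is found under
    <install_root>/lib; on Windows, also add that directory to the
    DLL search path so dependent DLLs can be resolved (Python 3.8+)."""
    lib_dir = Path(install_root) / "lib"
    dll = lib_dir / "nvinfer_builder_resource_10.dll"
    if not dll.is_file():
        # The builder resource DLL is missing from the expected location.
        return False
    if sys.platform == "win32":
        # PATH is no longer searched for dependent DLLs since Python 3.8;
        # the directory must be registered explicitly.
        os.add_dll_directory(str(lib_dir))
    return True

# Called before "import tensorrt", e.g.:
# register_tensorrt_dlls(r"E:\Libraries\TensorRT-10.5.0.18")
```

In my case the DLL is present in the lib directory, so the check passes, yet the builder still fails to load it at runtime.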
Can you please help me?
Thank you,
Davide