Description
I attempted to build TensorRT 10.12 (and 10.11) against onnx 1.18.0. I used a patch to link against my system onnx instead of the bundled one, but I don't believe that caused the failure. In onnx 1.18.0, default symbol visibility has been set to hidden for almost all symbols. I spent the last couple of days trying to patch the code to export what was needed, but the number of symbols became overwhelming, and I ran into protobuf-generated code that I couldn't directly apply the necessary ONNX_API visibility fix to. In the end, I had to modify the CMakeLists.txt to use default visibility instead of hidden, and with that I was finally able to compile onnx-graphsurgeon. I don't think the onnx folks are going to want that as an answer, so I figured I'd let y'all take a look at onnx-graphsurgeon and see if you can determine the best way to put a real fix together.
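For reference, the workaround amounted to forcing default visibility in the CMake build. A rough sketch of the change (the variables are standard CMake; exactly where onnx sets them is an assumption on my part):

```cmake
# Sketch of the workaround: build with default visibility instead of hidden.
# (Standard CMake variables; the precise location in onnx's CMakeLists.txt
# where the hidden preset is applied is an assumption.)
set(CMAKE_CXX_VISIBILITY_PRESET default)
set(CMAKE_VISIBILITY_INLINES_HIDDEN OFF)
```

This just re-exports everything and sidesteps the ONNX_API annotations, which is why I don't consider it a real fix.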
Environment
TensorRT Version: 10.12
NVIDIA GPU: RTX 4060
NVIDIA Driver Version: 570.153.02
CUDA Version: 12.9.1
CUDNN Version: 9.10.1.14
Operating System: Slackware -current
Python Version (if applicable): 3.12.11
Tensorflow Version (if applicable): N/A
PyTorch Version (if applicable): 2.7.1
Baremetal or Container (if so, version): N/A
Relevant Files
Model link: N/A
Steps To Reproduce
1. Download and unpack the TensorRT tarball from this repo.
2. Rename the directory to match the GA version from NVIDIA (TensorRT-10.12.0.36 in my case).
3. Download and unpack the GA tarball into the same directory (TensorRT-10.12.0.36).
4. Create a build directory in that same directory and change to it (TensorRT-10.12.0.36/build).
5. cmake .. (with appropriate build flags)
6. make
7. ../python/build.sh (with appropriate environment variables)
8. cd ../tools/onnx-graphsurgeon
9. python -m build --wheel --no-isolation
The build will fail due to an undefined symbol in onnx_cpp2py_export.so (onnx_cpp2py_export.cpython-312-x86_64-linux-gnu.so on my machine).
Patching onnx to export that symbol using ONNX_API just leads to another build failure with a different undefined symbol, and so on.
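The failure mode can be reproduced outside TensorRT with two tiny C files (a minimal sketch with hypothetical file names, not the actual sources): a shared library built with -fvisibility=hidden exports nothing that isn't explicitly annotated, so anything linking against it hits undefined symbols.

```shell
# Minimal sketch of the failure mode (hypothetical files, not TensorRT/onnx code).
cat > dep.c <<'EOF'
int answer(void) { return 42; }   /* no visibility attribute, like un-annotated onnx symbols */
EOF
cat > user.c <<'EOF'
extern int answer(void);
int main(void) { return answer() == 42 ? 0 : 1; }
EOF

# Built with hidden visibility, 'answer' never reaches the dynamic symbol table:
gcc -shared -fPIC -fvisibility=hidden -o libdep.so dep.c
nm -D libdep.so | grep answer || echo "answer is not exported"

# So linking against the library fails with an undefined reference,
# analogous to the onnx_cpp2py_export failure above:
gcc user.c -L. -ldep -o user 2>&1 | grep -c "undefined reference"
```

Marking the function with __attribute__((visibility("default"))) (what ONNX_API expands to on GCC-like compilers) fixes one symbol at a time, which is the whack-a-mole described above.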
Commands or scripts: N/A
Have you tried the latest release?: Yes
Can this model run on other frameworks? For example run ONNX model with ONNXRuntime (polygraphy run <model.onnx> --onnxrt): N/A