Here are the steps I followed to export the .onnx file to TensorRT .engine format. I presume this will work for all TensorRT 10 versions.
- Git clone or download the TensorRT 10 release from the repo.
- Copy the common and vc folders located inside the TensorRT plugin folder into FastFlowNet's tensorrt_workspace/TensorRT/plugin folder.
- Replace the correlation plugin, the gridsampler plugin, and the CMakeLists.txt with these files.
- Build the plugins.
- Use the trtexec command with the built .so plugins argument to build the engine from the .onnx file (a sketch of this command follows the example below).
Note - I generated the .onnx file using the Docker environment mentioned in the project's README. I don't think the export works with the latest onnx version, so try it with onnx==1.8.1 and onnx-graphsurgeon==0.3.20. The compatible Python version is 3.6.9.
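A minimal sketch of setting up that export environment, assuming pip is available inside the container (the pinned versions are the ones quoted above; the version-check line is just illustrative):
# inside the Docker container from the README (Python 3.6.9)
pip install onnx==1.8.1 onnx-graphsurgeon==0.3.20
# confirm the pinned versions took effect before exporting
python -c "import onnx, onnx_graphsurgeon; print(onnx.__version__, onnx_graphsurgeon.__version__)"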
Example
# working dir -> FastFlowNet (repo root)
cd tensorrt_workspace/TensorRT/plugin
# clone the TensorRT 10 release
git clone -b release/10.9 https://github.com/NVIDIA/TensorRT.git
# copy the common and vc folders (cp -r, since these are directories)
cp -r TensorRT/plugin/common .
cp -r TensorRT/plugin/vc .
# Replace the existing files (Step 3)
# build the plugins
mkdir build && cd build
cmake ..
make
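To finish step 5, here is a hedged sketch of the trtexec invocation. The file names (libfastflownet_plugins.so, fastflownet.onnx, fastflownet.engine) are placeholders, since the actual .so name depends on the CMakeLists you built with; also note that older trtexec builds take --plugins, while TensorRT 10's trtexec uses --staticPlugins.
# working dir -> the build folder from the previous step
# load the built plugin library (placeholder name) while parsing the .onnx file
trtexec --onnx=fastflownet.onnx \
        --staticPlugins=./libfastflownet_plugins.so \
        --saveEngine=fastflownet.engine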