
TensorRT export with version 10.9 #30

@rohankhaire-work

Description

Here are the steps I followed to export the .onnx file to the TensorRT .engine format. I expect this to work for all TensorRT 10.x versions.

  1. Clone or download a TensorRT 10 release from NVIDIA's TensorRT repository.
  2. Copy the common and vc folders from the cloned repo's plugin folder into FastFlowNet's tensorrt_workspace/TensorRT/plugin folder.
  3. Replace the correlation plugin, the gridSampler plugin, and the CMakeLists.txt with these files.
  4. Build the plugins.
  5. Run trtexec, passing the built .so plugins as arguments, to build the engine from the .onnx file.

Note: I generated the .onnx file using the Docker environment mentioned in the project's README. Export does not seem to work with the latest onnx version; try onnx==1.8.1 and onnx-graphsurgeon==0.3.20 on Python 3.6.9.
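The pinned environment from the note can be set up roughly like this (a sketch, assuming pip is available inside the Python 3.6.9 environment; depending on the onnx-graphsurgeon version, you may first need NVIDIA's package index via nvidia-pyindex):

# Inside the Python 3.6.9 environment (e.g. the project's Docker image)
pip install onnx==1.8.1 onnx-graphsurgeon==0.3.20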

Example

# working dir: FastFlowNet
cd tensorrt_workspace/TensorRT/plugin

# clone the TensorRT 10.9 release
git clone -b release/10.9 https://github.com/NVIDIA/TensorRT.git

# copy the common and vc folders (-r is required since these are directories)
cp -r TensorRT/plugin/common .
cp -r TensorRT/plugin/vc .

# Replace the existing files (Step 3)

# build the plugins
mkdir build && cd build
cmake ..
make
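Step 5 can then be sketched as follows. The .onnx path, the plugin .so names, and the output engine name are assumptions; check your build output for the actual library names. Recent TensorRT 10 versions of trtexec load plugin libraries with --staticPlugins (older versions used --plugins):

# working dir: FastFlowNet/tensorrt_workspace/TensorRT/plugin/build
# Build the engine from the .onnx file, loading the freshly built plugins
trtexec --onnx=../../fastflownet.onnx \
        --staticPlugins=./libcorrelation_plugin.so \
        --staticPlugins=./libgridsampler_plugin.so \
        --saveEngine=fastflownet.engine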
