
I get different results using TensorRT 10 and TensorRT 8 for inference. What is the reason for this? #4539

@hanliliy

Description


On a Windows 10 system, I converted the same ONNX model to TensorRT engines using TensorRT 10.12 and TensorRT 8.6, then ran inference on the same image with trtexec.exe, but the results are different. What is the reason for this?

Convert: trtexec.exe --onnx=iter.onnx --saveEngine=iter_static.engine

Inference: trtexec.exe --loadEngine=iter_static.engine --loadInputs='input':input.bin --exportOutput=trtexec-result.json
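Small numerical differences between engines built by different TensorRT versions are common, because each version may select different kernels/tactics and fuse layers differently. A first diagnostic step is to compare the two `--exportOutput` JSON dumps with a tolerance rather than bit-exactly. A minimal sketch, assuming the typical trtexec layout of a JSON list of tensors each carrying a flattened `values` array (the file names and example values below are hypothetical):

```python
import json

def max_diffs(a, b):
    """Return (max absolute diff, max relative diff) of two equal-length vectors."""
    assert len(a) == len(b), "output lengths differ"
    abs_diff = [abs(x - y) for x, y in zip(a, b)]
    rel_diff = [d / max(abs(x), 1e-12) for d, x in zip(abs_diff, a)]
    return max(abs_diff), max(rel_diff)

def load_trtexec_output(path):
    # Assumed layout: trtexec --exportOutput writes a JSON list of
    # tensors, each with a flattened "values" array.
    with open(path) as f:
        data = json.load(f)
    return data[0]["values"]

if __name__ == "__main__":
    # Hypothetical values standing in for the two engines' outputs;
    # in practice load them via load_trtexec_output(...).
    trt8_out = [0.101, 0.502, 0.397]
    trt10_out = [0.102, 0.501, 0.397]
    abs_d, rel_d = max_diffs(trt8_out, trt10_out)
    print(f"max abs diff {abs_d:.6f}, max rel diff {rel_d:.6f}")
```

If the differences stay within a few units in the last place of FP32 (or the expected tolerance of FP16/INT8 if those modes were enabled), the engines are numerically equivalent and the discrepancy is not a bug.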

Metadata

Labels: Module:Accuracy (Output mismatch between TensorRT and other frameworks), triaged (Issue has been triaged by maintainers)
