On a Windows 10 system, I converted the same ONNX model to TensorRT engines with TensorRT 10.12 and TensorRT 8.6, then ran inference on the same image with trtexec.exe, but the two results are different. What could cause this?
Convert: trtexec.exe --onnx=iter.onnx --saveEngine=iter_static.engine
Inference: trtexec.exe --loadEngine=iter_static.engine --loadInputs='input':input.bin --exportOutput=trtexec-result.json
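To make the question concrete, here is a small sketch for quantifying how far apart the two runs actually are. It assumes the `--exportOutput` JSON is a list of entries shaped like `{"name": ..., "dimensions": ..., "values": [...]}` (what recent trtexec builds emit); the file names and key names are placeholders, so adjust them for your setup.

```python
# compare_outputs.py - compare two trtexec --exportOutput dumps.
# Assumption: each JSON file is a list of {"name", "dimensions", "values"} entries.
import json
import sys

import numpy as np


def load_outputs(path):
    """Return {tensor_name: flat float32 array} from a trtexec --exportOutput file."""
    with open(path) as f:
        entries = json.load(f)
    return {e["name"]: np.asarray(e["values"], dtype=np.float32).ravel() for e in entries}


def main(path_a, path_b):
    a, b = load_outputs(path_a), load_outputs(path_b)
    for name in sorted(set(a) & set(b)):
        x, y = a[name], b[name]
        if x.shape != y.shape:
            print(f"{name}: shape mismatch {x.shape} vs {y.shape}")
            continue
        diff = np.abs(x - y)
        denom = float(np.linalg.norm(x) * np.linalg.norm(y))
        cosine = float(np.dot(x, y)) / denom if denom else float("nan")
        print(f"{name}: max_abs_diff={diff.max():.6g} "
              f"mean_abs_diff={diff.mean():.6g} cosine={cosine:.6f}")


if __name__ == "__main__":
    main(sys.argv[1], sys.argv[2])
```

Example usage (hypothetical file names for the TensorRT 8.6 and 10.12 runs):

```
python compare_outputs.py trt86-result.json trt1012-result.json
```

Reporting the max/mean absolute difference and cosine similarity per output tensor makes it easier to tell whether this is small floating-point drift between versions or a genuinely wrong result.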