This repository was archived by the owner on Mar 14, 2025. It is now read-only.

Is the inference for int8 the same as for fp16? #36

@WEIZHIHONG720

Description


Hello, thank you for your work!

[screenshot of the user's int8 inference code]

Is there a problem with running int8-engine inference this way? I loaded the engine file directly but didn't use the calibration table, and I'm a bit uncertain.
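For context: in TensorRT, the calibration table is only consumed while the engine is being built. The computed int8 scales are baked into the serialized plan, so deserializing and running an int8 engine looks exactly like running an fp16 one. A minimal sketch using TensorRT's Python API (the function names and paths here are illustrative, not taken from this repository):

```python
# Sketch of the two phases, assuming TensorRT's Python API (TensorRT 8+).
# The calibration table matters only at build time; a serialized int8
# engine already contains its quantization scales, so the inference-side
# loading code is identical for int8 and fp16 engines.

def build_int8_engine(onnx_path, calibrator):
    """Build time: the ONLY place the calibration table is used."""
    import tensorrt as trt  # imported lazily so the sketch parses without TensorRT

    logger = trt.Logger(trt.Logger.WARNING)
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    )
    parser = trt.OnnxParser(network, logger)
    with open(onnx_path, "rb") as f:
        parser.parse(f.read())

    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.INT8)
    config.int8_calibrator = calibrator  # reads/writes the calibration table
    return builder.build_serialized_network(network, config)


def load_engine(plan_path):
    """Inference time: no calibrator needed; same code for int8 or fp16."""
    import tensorrt as trt

    logger = trt.Logger(trt.Logger.WARNING)
    runtime = trt.Runtime(logger)
    with open(plan_path, "rb") as f:
        return runtime.deserialize_cuda_engine(f.read())
```

Under that assumption, loading the serialized engine file directly is expected behavior; the calibration table would only come into play again if the engine were rebuilt from the network definition.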
