SAM3 ONNX inference: no result #234

@demo-telo

Description

Here is my process:

  • First, I used the transformers library to convert sam3.pt (the official SAM3 checkpoint) into the transformers format. The result looks like this:

    (screenshot of the converted checkpoint directory)
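To rule out an incomplete conversion, here is a quick check I can run on the converted directory (a minimal sketch; config.json is only an assumed file name based on the usual Hugging Face layout, while tokenizer.json is the file I pass to inference_v2.py later):

```python
# Sanity check: list the converted checkpoint directory and confirm the
# files I expect are actually there. The expected names are assumptions.
from pathlib import Path

ckpt_dir = Path("/home/ubuntu/program/usls-main/checkpoints/converted_sam3_model")
expected = ["config.json", "tokenizer.json"]  # assumed minimum; the weights file name may differ

print("files in checkpoint dir:", sorted(p.name for p in ckpt_dir.iterdir()))
for name in expected:
    print(f"{name}: {'OK' if (ckpt_dir / name).exists() else 'MISSING'}")
```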

  • Second, I used the "exportv2.py" script you provided to export the ONNX models; the parameters follow the ones in your example:

    --all --model-path /home/ubuntu/program/usls-main/checkpoints/converted_sam3_model --output-dir /home/ubuntu/program/usls-main/onnx-models-v2 --device cuda --image-height 1008 --image-width 1008

    Result (no errors reported):

    (screenshot of the export output)
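To confirm the export itself is usable before looking at "inference_v2.py", I can also run a quick onnxruntime check over the output directory (a minimal sketch; it only assumes the exported files end in .onnx):

```python
# Load every exported .onnx file in onnxruntime and print its input/output
# signatures, to confirm the export step produced valid models.
from pathlib import Path
import onnxruntime as ort

out_dir = Path("/home/ubuntu/program/usls-main/onnx-models-v2")
for model_path in sorted(out_dir.glob("*.onnx")):
    sess = ort.InferenceSession(str(model_path), providers=["CPUExecutionProvider"])
    print(f"\n{model_path.name}")
    for inp in sess.get_inputs():
        print("  input :", inp.name, inp.shape, inp.type)
    for out in sess.get_outputs():
        print("  output:", out.name, out.shape, out.type)
```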

  • But when I use "inference_v2.py" to test the ONNX models, it gets no result on kid.jpg with the "shoe" prompt. I have also noticed that the ONNX model file names used in "inference_v2.py" are not the same as the ones produced by "exportv2.py", like this:

    (screenshot of the mismatched model file names)

    My command is:
"python3 /home/ubuntu/program/usls-main/scripts/sam3-image/inference_v2.py --image ../../assets --text kids --model-dir /home/ubuntu/program/usls-main/onnx-models-v2 --tokenizer /home/ubuntu/program/usls-main/checkpoints/converted_sam3_model/tokenizer.json --output output-text-v2.png --device cpu --image-height 1008 --image-width 1008 "

Besides, I have also tried the v1 process, but it didn't work either.

So, where might the problem lie? I hope you can guide me. Thank you.
