❓ How to save metadata and use the torch inferencer #1633
-
**What is the motivation for this task?**
Provide a simple guideline for using the TorchInferencer.

**Describe the solution you'd like**
Load a model and sample images.

**Additional context**
No response
Answered by blaz-r on Nov 17, 2023
-
You need to set `export_mode` in the YAML config to `torch`:

```yaml
optimization:
  export_mode: torch # options: torch, onnx, openvino
```

Then you use the inferencer like this:

```shell
# To get help about the arguments, run:
python tools/inference/torch_inference.py --help

# Example Torch inference command:
python tools/inference/torch_inference.py \
    --weights results/padim/mvtec/bottle/run/weights/torch/model.pt \
    --input datasets/MVTec/bottle/test/broken_large/000.png \
    --output results/padim/mvtec/bottle/images
```
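If you prefer to call the inferencer from Python rather than the CLI script, a minimal sketch looks like the following. This assumes anomalib's `TorchInferencer` from `anomalib.deploy` (the class the CLI script wraps); the helper function name and the exact result fields are illustrative, so check them against your installed anomalib version.

```python
def run_torch_inference(weights_path, image_path):
    """Load a torch-exported anomalib model and run it on one image.

    Sketch only: requires anomalib to be installed and a model exported
    with export_mode: torch, as shown in the YAML snippet above.
    """
    # Deferred import so this file can be read without anomalib installed.
    from anomalib.deploy import TorchInferencer

    # The path points at the model.pt produced under .../weights/torch/.
    inferencer = TorchInferencer(path=weights_path)

    # predict() returns a result object carrying the anomaly map,
    # prediction score, and related visualization data.
    return inferencer.predict(image=image_path)


# Example call, using the same run layout as the CLI command above:
# result = run_torch_inference(
#     "results/padim/mvtec/bottle/run/weights/torch/model.pt",
#     "datasets/MVTec/bottle/test/broken_large/000.png",
# )
```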
Answer selected by samet-akcay