spaCy inference without pytorch dependency? #10271
-
Hi, is there any way to use spaCy pretrained models for inference (specifically lemmatization and part-of-speech tagging) without the PyTorch dependency (e.g. via ONNX models or TorchScript)? I have a containerized app built from a Docker image; torch and its related dependencies push the image to about 1.8 GB, and if I could drop PyTorch it would be roughly 200 MB. I searched discussions and issues for jit, torchscript, onnx, and pytorch dependency and found nothing. Thank you!
Replies: 1 comment
-
Torch is only required if you're using spacy-transformers (the trf pipelines). If you use the sm/md/lg models, you don't need it.
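For example, a minimal sketch assuming `en_core_web_sm` is installed (via `pip install spacy` and `python -m spacy download en_core_web_sm`); the sm/md/lg pipelines run on spaCy's own thinc/numpy stack, so installing them does not pull in torch:

```python
import spacy

# Load the small English pipeline; no PyTorch required for sm/md/lg models.
nlp = spacy.load("en_core_web_sm")

doc = nlp("The striped bats were hanging on their feet.")
for token in doc:
    # Part-of-speech tag and lemma per token.
    print(token.text, token.pos_, token.lemma_)
```

If you only need lemmatization and POS tagging, you can also disable the other components, e.g. `spacy.load("en_core_web_sm", disable=["parser", "ner"])`, to speed things up.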