Disabling CUDA during inference for transformer NER model #9925
-
I have trained a spaCy model with the following components [sentencizer, transformers, ner] in Azure ML Studio using a GPU. I assumed that when I load and run the model locally, I could use it for inference without any GPU (I don't have one) and without any changes. Instead, my code stops running when loading the model. I already tried calling spacy.require_cpu() before spacy.load(model), as suggested in discussion #7622, but this does not affect the error at all.
I'm using: …
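For reference, this is roughly what I'm running (the model path and input text are placeholders, not my actual values):

```python
# Sketch of the workaround I tried: spacy.require_cpu() switches
# spaCy/thinc to CPU ops, but the error still occurs, presumably because
# it cannot fix a torch build that fails to initialize without CUDA.
import spacy

spacy.require_cpu()                # must be called before spacy.load
nlp = spacy.load("path/to/model")  # placeholder path to the trained pipeline
doc = nlp("Some example text.")    # placeholder input
print([(ent.text, ent.label_) for ent in doc.ents])
```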
-
This is probably just related to how torch was installed. Try uninstalling all packages that start with …
-
Following this post, I downgraded torch from 1.10.1 to 1.9.0, which made the error go away.
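For anyone hitting the same problem, a sketch of the fix in a pip-based environment (the CPU wheel index URL is PyTorch's official one; adjust versions to your setup):

```shell
# Remove the existing (CUDA-enabled) torch build first.
pip uninstall -y torch

# Either pin the version that worked for me...
pip install torch==1.9.0
# ...or install a CPU-only build from PyTorch's CPU wheel index:
# pip install torch --index-url https://download.pytorch.org/whl/cpu
```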