Transformer inference on CPU #10543
As you know, the documentation recommends having a GPU and running spacy.prefer_gpu(). I was wondering if it is possible to run transformer inference on CPU only.
Replies: 1 comment 2 replies
Transformers should run on CPU, and if they don't that could be a bug, though in most cases they will be too slow to be useful.
To be clear, are you getting that `ValueError` when running `prefer_gpu()` on a machine with no GPU? Or when running `spacy.load()`? Please provide the actual code / configuration you are using so we can understand the issue.
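For reference, here is a minimal sketch of what CPU-only usage could look like, assuming `spacy` and `spacy-transformers` are installed and using `en_core_web_trf` as an example transformer pipeline (not taken from the original question):

```python
import spacy

# prefer_gpu() simply returns False on a machine with no GPU; it does not raise.
# require_gpu() is the call that raises a ValueError when no GPU is available.
uses_gpu = spacy.prefer_gpu()
print("Running on GPU:", uses_gpu)

# Loading a transformer pipeline also works on CPU, just much more slowly.
# Assumes the example model has been installed, e.g. via:
#   python -m spacy download en_core_web_trf
nlp = spacy.load("en_core_web_trf")

doc = nlp("Transformer inference also runs on CPU, only slower.")
print([(ent.text, ent.label_) for ent in doc.ents])
```

If that sketch raises an error for you, please paste the exact traceback and your config so we can see where it differs.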