Answers from previous posts (a minimal CPU-loading sketch follows the quotes):

  1. Originally posted by @polm in #10543 (comment)

    Transformers should run on CPU, and if they don't that could be a bug, though in most cases they will be too slow to be useful.

  2. Originally posted by @spatiebalk in #9925

    I have trained a spaCy model with the components [sentencizer, transformers, ner] in Azure ML Studio using a GPU. When I load and run the model locally, I can use it for inference without a GPU (I don't have one) and without any changes.

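Putting the two together: a transformer pipeline trained on a GPU can be loaded for CPU inference as-is, because spaCy only uses the GPU when you explicitly request it with `spacy.prefer_gpu()` or `spacy.require_gpu()`. A minimal sketch, where `my_trf_model` is a placeholder path for your own trained pipeline:

```python
import spacy

# No spacy.prefer_gpu()/spacy.require_gpu() call, so the pipeline runs
# on the CPU even if it was trained on a GPU (e.g. in Azure ML Studio).
# "my_trf_model" is a hypothetical path to your exported pipeline.
nlp = spacy.load("my_trf_model")

# Inference works unchanged; transformer components are just slower on CPU.
doc = nlp("spaCy transformer pipelines can run on CPU, only more slowly.")
print([(ent.text, ent.label_) for ent in doc.ents])
```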