
Hey, sorry for the delayed reply on this. The short answer is no: Natural Language Generation-type tasks are considered out of scope for spaCy.

You could, however, do something by wrapping it as a custom pipeline component. The English transformer model is already based on RoBERTa; have a look at the source of spacy-transformers for more details on how the difference in tokenization is handled.
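
For reference, here's a minimal sketch of what a custom pipeline component wrapping an external model could look like. The `generator` component name, the `generated_text` extension, and the `my_generate` helper are placeholders for illustration, not spaCy or spacy-transformers APIs:

```python
import spacy
from spacy.language import Language
from spacy.tokens import Doc

# Custom extension to hold the output of the external model.
Doc.set_extension("generated_text", default=None)

def my_generate(text: str) -> str:
    # Placeholder: call your own generation model here
    # (e.g. a Hugging Face text-generation pipeline).
    return text

@Language.component("generator")
def generate_component(doc: Doc) -> Doc:
    # Run the external model and store the result on the Doc.
    doc._.generated_text = my_generate(doc.text)
    return doc

# en_core_web_trf is the RoBERTa-based English transformer pipeline.
nlp = spacy.load("en_core_web_trf")
nlp.add_pipe("generator", last=True)

doc = nlp("spaCy focuses on NLP analysis, not generation.")
print(doc._.generated_text)
```

The component itself stays stateless here; for anything that needs configuration or model loading, a `@Language.factory` that returns a callable class would be the more idiomatic choice.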
