If you intend to continue pre-training your models on unannotated data, I would suggest using transformers instead; unfortunately, we do not support this at the moment.

Answer selected by vinbo8
Labels: feat / tok2vec (Feature: Token-to-vector layer and pretraining), feat / transformer (Feature: Transformer)