
> It is supposed to be the case that the same config file can be used for pretraining and training, no?

No, the expectation is that you use a different config file for pretraining than you do for training. It's true this might not be clear enough in the documentation.

Even in pretraining, you can initialize your tok2vec from another model, and that has a different meaning at pretraining time than it does at training time.
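
For illustration only, here is a minimal sketch of what keeping the two files separate might look like with spaCy v3's config schema. The file names, paths, and the pretrained-weights filename below are placeholder assumptions, not something from this thread.

```ini
# pretrain.cfg: used only with `python -m spacy pretrain pretrain.cfg ./pretrain_output`
# (a [pretraining] block can be generated with `spacy init fill-config base.cfg pretrain.cfg --pretraining`)
[paths]
raw_text = "assets/raw_text.jsonl"   # placeholder: unlabelled text to pretrain on

[pretraining]
component = "tok2vec"                # pretrain the tok2vec layer of this component
# objective, batcher, optimizer, etc. are configured under this block
```

The training config then points `[initialize]` at the weights that `spacy pretrain` wrote out:

```ini
# train.cfg: used only with `python -m spacy train train.cfg --output ./training`
[paths]
init_tok2vec = "pretrain_output/model4.bin"   # placeholder: a checkpoint written by `spacy pretrain`

[initialize]
init_tok2vec = ${paths.init_tok2vec}          # load the pretrained tok2vec weights before training
```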
