Pretraining requires its own output to initialize #9626
I can get around this error by setting …
No, the expectation is that you use a different file for pretraining than you do for training. It's true this might not be clear enough in the documentation.
Even in pretraining, you can initialize your tok2vec from another model, and that has a different meaning at pretraining time than it does at training time.
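As a minimal sketch of that split (the directory and file names below are placeholders, not taken from this thread): the pretraining run writes weights into its own output directory, and the training config's `[initialize]` block then points at one of the files that earlier run already produced, rather than at anything the training run itself will write.

```ini
# Step 1: pretraining writes its weights into a separate directory, e.g.
#   python -m spacy pretrain config.cfg ./pretrain_output

# Step 2: in the *training* config, load one of those already-written files
# (placeholder paths; pick whichever epoch's weights you want to use)
[paths]
init_tok2vec = "pretrain_output/model4.bin"

[initialize]
init_tok2vec = ${paths.init_tok2vec}
```

The point is that pretraining and training never read and write the same file: `init_tok2vec` refers to the output of a pretraining run that has already finished.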