ValueError("[E203] If the tok2vec embedding layer is not updated during training, make sure to include it in 'annotating components'") #12822
-
After the second batch of my data, I'm getting this error. My output:

```
=========================== Initializing pipeline ===========================
============================= Training pipeline =============================
0 0 0.00 69.53 10.17 7.89 14.29 0.10
=========================== Initializing pipeline ===========================
============================= Training pipeline =============================
ERROR:main:Failed to execute task: [E203] If the tok2vec embedding layer is not updated during training, make sure to include it in 'annotating components'
```
Replies: 2 comments 4 replies
-
Hi! I moved your post to the discussions forum. From the output, there's a warning that you should pay attention to:

And, from the exception message:

If that doesn't resolve things, can you post your full config file? Please also use proper markdown formatting in your posts so they're easier to read on our end.
-
Hi, I'm getting the same problem. I've added
I would highly recommend switching to a command-line approach to train your models, using the `spacy train` command, because that command takes care of a lot of additional details that you'll have to code for manually if you want to call `nlp.update` directly.

That said, the error message in your original post refers to a `tok2vec` component, not a `transformer` component. You must have a `Tok2VecListener` in your pipeline that requires the annotations of a `tok2vec` component that has not been added to `annotating_components`.
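A minimal sketch of the relevant config change, assuming a config-driven `spacy train` setup with a shared `tok2vec` component feeding an `ner` listener (the pipeline names here are illustrative):

```ini
[nlp]
pipeline = ["tok2vec","ner"]

[training]
# Components listed here run and set their annotations on the Doc
# during training, so downstream listeners can use the tok2vec
# predictions even when its weights are frozen / not updated:
annotating_components = ["tok2vec"]
```

With this in place, training via `python -m spacy train config.cfg` should no longer raise E203, since the listener can read the `tok2vec` annotations produced at each step.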