Transformers/Tok2Vec and listeners #8681
Unanswered
BramVanroy
asked this question in
Help: Other Questions
Replies: 1 comment
-
As an update on the last question: I found the answer in the docs. It's there if you scroll down a bit. If I'm correct, you can do something like the following.
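A minimal sketch of what the docs describe, i.e. sourcing a trained tok2vec component in the training config (the `frozen_components` setting is my assumption for keeping the sourced layer as-is rather than updating it):

```ini
# Source the tok2vec component from a trained pipeline instead of
# initializing it from scratch.
[components.tok2vec]
source = "en_core_web_sm"

# Assumed: freeze it so training does not update its weights.
[training]
frozen_components = ["tok2vec"]
```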
This should load the tok2vec layer from the en_core_web_sm model and use it as-is.
-
Hi

I am looking into training my own models and I have a question about the interaction between the Tok2Vec component and its listeners. I have read the documentation here.
The Tok2Vec component is always connected to its listener(s), including in the backward pass. If we train a tok2vec layer from scratch, that makes sense, because the Tok2Vec layer needs to be learned. But with pretrained Transformers this is less straightforward, and it is not clear to me to what extent the Transformer operates within such a training pipeline.
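For context, the wiring being asked about looks roughly like this in a training config. This is an abbreviated sketch based on spaCy's default configs; the tagger component and the architecture versions are illustrative assumptions, and the embed/encode sublayers of the tok2vec model are omitted for brevity:

```ini
# Shared embedding component; its output is broadcast to listeners.
[components.tok2vec]
factory = "tok2vec"

# Downstream component that does not embed tokens itself.
[components.tagger]
factory = "tagger"

# The tagger's model contains a listener that receives the shared
# tok2vec output in the forward pass and sends gradients back to
# the tok2vec component in the backward pass.
[components.tagger.model.tok2vec]
@architectures = "spacy.Tok2VecListener.v1"
width = ${components.tok2vec.model.encode.width}
upstream = "*"
```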
Thanks! (And apologies in advance for other questions that I will undoubtedly have.)