Entity Linker with Transformer Listener #12329
-
I have a pretrained pipeline composed of a transformer and an NER component, and I am trying to create an Entity Linker that uses the embedding representations produced by the transformer rather than a CNN tok2vec:

```python
entity_linker = nlp.add_pipe("entity_linker", config={
    "labels_discard": [],
    "n_sents": 0,
    "incl_prior": True,
    "incl_context": True,
    "model": {
        "@architectures": "spacy.EntityLinker.v2",
        "tok2vec": {
            "@architectures": "spacy-transformers.TransformerListener.v1",
            "pooling": {"@layers": "reduce_mean.v1"},
            "upstream": "*",
        },
        "nO": 768,
    },
    "entity_vector_length": 768,
    "get_candidates": {"@misc": "spacy.CandidateGenerator.v1"},
    "threshold": None,
})
```

But I receive a "dimension unset" error:

```
File "/home/blanco/wiki_kb/.venv/lib/python3.9/site-packages/spacy_transformers/layers/listener.py", line 64, in forward
    width = model.get_dim("nO")
File "/home/blanco/wiki_kb/.venv/lib/python3.9/site-packages/thinc/model.py", line 175, in get_dim
    raise ValueError(err)
ValueError: Cannot get dimension 'nO' for model 'transformer-listener': value unset
```

Looking online, I found that this error typically occurs when you try to fine-tune a later component in the pipeline while freezing the transformer, but in my case I am not at that point yet. My final aim is a linker that uses transformer embeddings for the entity and its context (perhaps learning a linear layer to project them to a lower dimension, or with no training at all) and compares them against embeddings from a knowledge base whose entries were created with the same pretrained pipeline.

Thanks in advance to everybody who replies and helps me.
-
It's hard to know for sure without seeing your full config, but if you want to use a frozen transformer as input to another component, be sure that you're using spacy v3.5.0+ and spacy-transformers v1.2.2+ due to some bug fixes, and be sure that you've added it to `[training.annotating_components]`. If that doesn't help, can you provide the full config and training steps that you're using?
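In config terms, that means listing the frozen transformer under `annotating_components` so it still runs during training and its output is available to downstream listeners. A sketch of the relevant training block, assuming the transformer and NER are the frozen, pretrained components:

```ini
[training]
# Frozen components are not updated during training, but components
# listed as annotating still run in the forward pass, so the
# entity_linker's TransformerListener receives the transformer output.
frozen_components = ["transformer", "ner"]
annotating_components = ["transformer"]
```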
-
@rmitsch This sounds very interesting. I've had success with the current NEL model, and I'm interested in whether you have documented the intended plans here, or any literature or other resources that the new work is inspired by?