This discussion seems like it might fit in the HuggingFace forums, depending on how you approach the problem. Keep in mind that the actual Transformers implementation in spaCy is just HuggingFace Transformers, and spacy-transformers is a wrapper around that.

If you're using a CNN tok2vec in spaCy, you can specify the token position as one of the attributes in the model config. Because the actual BERT implementation is inside the HuggingFace library, I don't think there's a similar option to control how that's handled in the spaCy config.
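For reference, here's a minimal sketch of where that lives, assuming a blank English pipeline and spaCy's default MultiHashEmbed settings; the `attrs` list is where the embedded token attributes are declared (the exact attribute set for position handling would depend on your setup):

```python
import spacy

# Assumption: starting from a blank English pipeline for illustration.
nlp = spacy.blank("en")
nlp.add_pipe(
    "tok2vec",
    config={
        "model": {
            "@architectures": "spacy.Tok2Vec.v2",
            "embed": {
                "@architectures": "spacy.MultiHashEmbed.v2",
                "width": 96,
                # Each entry is a token attribute with its own embedding table;
                # these are spaCy's defaults, shown as a starting point.
                "attrs": ["NORM", "PREFIX", "SUFFIX", "SHAPE"],
                "rows": [5000, 1000, 2500, 2500],
                "include_static_vectors": False,
            },
            "encode": {
                "@architectures": "spacy.MaxoutWindowEncoder.v2",
                "width": 96,
                "depth": 4,
                "window_size": 1,
                "maxout_pieces": 3,
            },
        }
    },
)
```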

If you want to add context, take a look at how the EntityLinker is implemented; it uses context to disambiguate entities. When adding context, what you d…
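As a rough illustration (assuming `en_core_web_sm` is installed; the model name and text are just placeholders), the sentence surrounding an entity is easy to pull out, and that's the kind of context the EntityLinker draws on when scoring candidates:

```python
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Apple is looking at buying a U.K. startup for $1 billion.")
for ent in doc.ents:
    # ent.sent is the sentence containing the entity; the EntityLinker
    # uses the surrounding text like this to disambiguate mentions.
    print(ent.text, ent.label_, "| context:", ent.sent.text)
```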
