With the pre-release of spaCy v3, it should be much easier to integrate transformers into your pipeline; see the documentation here: https://nightly.spacy.io/usage/embeddings-transformers. The integration works via https://github.com/explosion/spacy-transformers, which essentially acts as an interface to the HuggingFace Transformers model hub. You'll then be able to use a transformer as the first layer of your neural network models in spaCy by "swapping out" the standard tok2vec layer. This should hopefully give you a nice performance boost, similar to what we obtained with our own pretrained models.
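To make the "swapping out" concrete, here is a minimal sketch of what the relevant pieces of a spaCy v3 `config.cfg` might look like. This assumes the `spacy-transformers` package is installed; the architecture version strings (`TransformerModel.v1`, `TransformerListener.v1`) and the `roberta-base` model name are illustrative and should be checked against the docs linked above for your installed version:

```ini
# A transformer component replaces the standard tok2vec embedding layer.
[components.transformer]
factory = "transformer"

[components.transformer.model]
@architectures = "spacy-transformers.TransformerModel.v1"
# Any model name from the HuggingFace hub (assumed example):
name = "roberta-base"

# Downstream components (e.g. NER) listen to the transformer's output
# instead of a standard tok2vec layer.
[components.ner.model.tok2vec]
@architectures = "spacy-transformers.TransformerListener.v1"
grad_factor = 1.0
```

The listener pattern lets several pipeline components share one transformer, so the (expensive) forward pass runs only once per document.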

I hope that answers your question? If not, let me know!

Answer selected by ines