Sorry for the delayed reply on this! Unfortunately, our standard answer here is that spacy-transformers simply doesn't support task-specific heads. You can use the Transformer component as a source of features and train spaCy's native NER layers on top.
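
For example, here's a minimal sketch of that setup in spaCy v3. The checkpoint name and hyperparameters below are just placeholders, not recommendations:

```python
import spacy

nlp = spacy.blank("en")

# The transformer runs first and stores its output on the Doc.
# "roberta-base" is just an example checkpoint.
nlp.add_pipe("transformer", config={"model": {"name": "roberta-base"}})

# The NER component doesn't get its own head on the transformer.
# Instead, its tok2vec sublayer *listens* to the transformer component's
# output and backpropagates into it during training.
nlp.add_pipe(
    "ner",
    config={
        "model": {
            "@architectures": "spacy.TransitionBasedParser.v2",
            "state_type": "ner",
            "extra_state_tokens": False,
            "hidden_width": 64,
            "use_upper": False,
            "tok2vec": {
                "@architectures": "spacy-transformers.TransformerListener.v1",
                "grad_factor": 1.0,
                "pooling": {"@layers": "reduce_mean.v1"},
            },
        }
    },
)

# Train as usual, e.g. nlp.initialize() + nlp.update(), or via `spacy train`.
```

Running `spacy init config` with the `--gpu` flag generates essentially this arrangement as a `config.cfg`, which is usually the more convenient route for training.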

It may be possible to implement a workaround by wrapping the model in Thinc, though I think it'd be pretty involved and I'm not sure anyone has done that before.
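
To give a rough idea of what that could look like, here's an untested sketch using Thinc's `PyTorchWrapper` around a Hugging Face token-classification model. The checkpoint name is just an example, and the genuinely involved part, aligning wordpiece logits back to spaCy tokens and turning this into a trainable pipe, is left out:

```python
from thinc.api import ArgsKwargs, PyTorchWrapper, torch2xp, xp2torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

name = "dslim/bert-base-NER"  # example: any checkpoint with a token-classification head
tokenizer = AutoTokenizer.from_pretrained(name)
pt_model = AutoModelForTokenClassification.from_pretrained(name)

def convert_inputs(model, batch, is_train):
    # Pass the tokenizer's dict (input_ids, attention_mask, ...) as kwargs.
    return ArgsKwargs(args=(), kwargs=dict(batch)), lambda dX: []

def convert_outputs(model, inputs_outputs, is_train):
    _, hf_output = inputs_outputs
    logits = hf_output.logits  # scores from the task-specific head

    def backprop(d_logits):
        # Arguments for torch.autograd.backward on the way back.
        return ArgsKwargs(args=(logits,), kwargs={"grad_tensors": (xp2torch(d_logits),)})

    return torch2xp(logits), backprop

wrapped = PyTorchWrapper(
    pt_model, convert_inputs=convert_inputs, convert_outputs=convert_outputs
)

batch = tokenizer(["spaCy was developed by Explosion."], return_tensors="pt")
scores, backprop = wrapped(batch, is_train=False)
print(scores.shape)  # (batch size, sequence length, number of labels)
```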

Another thing you can do is keep the Transformer model separate from spaCy, and just use the annotations from it to create Docs manually. That's pretty inefficient and loses a lot of the benefits of spaCy, so we don't generally recommend it, but it can still be worthwhile if you have a lot of postprocessing to do.
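
A sketch of that approach, running a Hugging Face NER pipeline and copying its predictions onto a fresh Doc (the model name is again just an example):

```python
import spacy
from spacy.util import filter_spans
from transformers import pipeline

nlp = spacy.blank("en")  # tokenizer only, no trained components needed
ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")

text = "Apple is looking at buying a U.K. startup for $1 billion."
doc = nlp(text)

spans = []
for pred in ner(text):
    # Character offsets from the HF pipeline won't always line up with
    # spaCy's token boundaries, hence alignment_mode="expand".
    span = doc.char_span(
        pred["start"], pred["end"], label=pred["entity_group"], alignment_mode="expand"
    )
    if span is not None:
        spans.append(span)

# doc.ents must be non-overlapping, so filter before assigning.
doc.ents = filter_spans(spans)
print([(ent.text, ent.label_) for ent in doc.ents])
```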
