Wrapping an independently trained PyTorch model with Thinc #9734
-
To use this in spaCy, you would need to write a custom pipeline component that can load this model and set the right annotations on the doc from the model predictions. There's a similar example in a demo project (it's for NER instead of POS, but the basics of the pipeline component should be similar): https://github.com/explosion/projects/tree/v3/tutorials/ner_pytorch_medical
Look at the custom functions in that project.
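Very roughly, and not taken from that project, a component along these lines could wrap the model with `PyTorchWrapper` and write the predicted tags back onto the tokens. The factory name, label set, featurization, and output shape here are all placeholder assumptions you would need to adapt to how your model was actually trained:

```python
# Sketch only: "pytorch_pos_tagger", TAG_LABELS, and the hash featurizer are
# placeholders, not part of spaCy or the linked project.
import numpy
import torch
from spacy.language import Language
from thinc.api import PyTorchWrapper

TAG_LABELS = ["NOUN", "VERB", "ADJ", "ADV"]  # hypothetical label set


class PyTorchPOSTagger:
    def __init__(self, model):
        self.model = model  # Thinc model wrapping the PyTorch module

    def __call__(self, doc):
        # How tokens become model inputs depends entirely on your training setup;
        # this hash-based featurizer is only a stand-in.
        token_ids = numpy.array([[hash(t.text) % 10000 for t in doc]], dtype="int64")
        # Assumes the model returns one row of label scores per token: (1, n_tokens, n_labels).
        scores = self.model.predict(token_ids)
        for token, row in zip(doc, scores[0]):
            token.tag_ = TAG_LABELS[int(row.argmax())]
        return doc


@Language.factory("pytorch_pos_tagger", default_config={"model_path": "./models/model_pt"})
def create_pytorch_pos_tagger(nlp, name, model_path: str):
    # Assumes the whole nn.Module was saved with torch.save(model, path).
    pytorch_model = torch.load(model_path)
    return PyTorchPOSTagger(PyTorchWrapper(pytorch_model))
```

You could then add it with `nlp.add_pipe("pytorch_pos_tagger", config={"model_path": "./models/model_pt"})`. The sketch writes to `token.tag_`, which accepts arbitrary strings; if your labels are Universal POS tags, you could set `token.pos_` instead.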
-
Oh, I see, thank you! I have a couple more questions.
How do I add this into the config file? When I call the config file with only this, I get the following error:
I tried to fill the file with
but I get the error
Could you please help me understand what I am doing wrong?
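For context, a custom factory like the hypothetical `pytorch_pos_tagger` sketched above would be referenced from the config roughly as in the excerpt below. The names and path are placeholders rather than values from this thread, and the file that registers the factory also has to be passed to `spacy train` with `--code functions.py` so spaCy can find it:

```ini
# Excerpt only; a full config also needs [paths], [training], [corpora], etc.
[nlp]
lang = "ko"
pipeline = ["pytorch_pos_tagger"]

[components.pytorch_pos_tagger]
factory = "pytorch_pos_tagger"
model_path = "./models/model_pt"
```

If the component should not be updated during training, it can also be listed under `frozen_components` in the `[training]` block.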
-
I trained a POS tagging model for Korean in PyTorch, separately from spaCy. Is it possible to wrap the trained model with PyTorchWrapper and add it to a custom pipeline?
For now, I did something like this:

```python
import torch
from thinc.api import PyTorchWrapper

# Load the trained torch.nn.Module first; PyTorchWrapper wraps a module, not a path.
pytorch_model = torch.load("./models/model_pt")
tagger_model = PyTorchWrapper(pytorch_model)
```
Can you please advise me on what to do next from here? How do I add this model to the pipeline? Or should I re-train my model from scratch with spaCy?