Loading and using a HuggingFace model in code #10579
-
Wasn't sure if this was worth attaching to #7027 or discussing here: I was having issues trying to load a non-default Transformer model in code and manipulate the attributes (e.g. use …).

An example based on the workaround (my edit) does work, though it seems to behave strangely compared to when you just load something like … Would it make sense to update the docs? Is there something I'm missing here? This all works like a dream using the config + CLI workflow, but I'm specifically doing some experimentation in code, so it'd be easier if I could just load up a HF model.
Replies: 2 comments 3 replies
-
You can assemble a transformer pipeline with `spacy assemble` just from a config, or programmatically it's equivalent to what you posted:

```python
import spacy

nlp = spacy.blank("en")
nlp.add_pipe("transformer")  # set your transformer config in the kwarg `config`
nlp.initialize()
```

Then `doc._.trf_data` should contain the transformer output, similar to the `trf` pipelines, which you can use with user hooks for things like similarity. (Although at the doc level, in many cases other alternatives like sentence-transformers can be better.)

We can think about how to update the examples here, because I can see how it's a bit confusing. It's intended to be more of a sketch to show how that individual method is used internally rather than a complete example, which would always require initialization.
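For reference, the `spacy assemble` route uses the same component expressed as a config file. A minimal sketch is below; the model name (`roberta-base`) and the `TransformerModel.v3` architecture version are illustrative and depend on your spacy-transformers version, and you may need `spacy init fill-config` to fill in the remaining defaults before assembling:

```ini
[nlp]
lang = "en"
pipeline = ["transformer"]

[components.transformer]
factory = "transformer"

[components.transformer.model]
@architectures = "spacy-transformers.TransformerModel.v3"
name = "roberta-base"

[components.transformer.model.tokenizer_config]
use_fast = true
```

You would then build the pipeline with something like `spacy assemble config.cfg ./my_pipeline` and load it via `spacy.load("./my_pipeline")`.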
-
There is an FAQ post about the right way to do this now: #10768