Running jina-embeddings-v2-base-en embedding model LOCALLY #31551
-
There isn't a built-in way to use JinaEmbeddings locally in LangChain without an API key. JinaEmbeddings is designed to work with the remote Jina API and always requires a key, with no documented support for local or offline use in the official code or docs (source). However, since you can run the model locally with SentenceTransformer, you can wrap your local model with the HuggingFaceEmbeddings or SentenceTransformerEmbeddings integration in LangChain, just like you would with any other HuggingFace-compatible embedding model. This approach doesn't require an API key and works entirely offline. If you want a LangChain Embeddings interface for your local model, you can do something like:

from langchain_community.embeddings import HuggingFaceEmbeddings

embeddings = HuggingFaceEmbeddings(
    model_name="jinaai/jina-embeddings-v2-base-en",
    model_kwargs={"trust_remote_code": True},
)

Or, if you want more control, you can subclass the Embeddings interface and wrap your SentenceTransformer instance directly. Let me know if you want a code example for that!
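For reference, a minimal sketch of that subclass approach is below. It assumes the sentence-transformers package is installed; the class name LocalJinaEmbeddings is made up for this example, while embed_documents and embed_query are the two methods LangChain's Embeddings base class requires.

from typing import List

from langchain_core.embeddings import Embeddings
from sentence_transformers import SentenceTransformer


class LocalJinaEmbeddings(Embeddings):
    """Hypothetical wrapper around a locally loaded SentenceTransformer model."""

    def __init__(self, model_name: str = "jinaai/jina-embeddings-v2-base-en"):
        # trust_remote_code is needed because the Jina model ships custom modeling code
        self.model = SentenceTransformer(model_name, trust_remote_code=True)

    def embed_documents(self, texts: List[str]) -> List[List[float]]:
        # encode returns a numpy array; LangChain expects plain Python lists
        return self.model.encode(texts).tolist()

    def embed_query(self, text: str) -> List[float]:
        return self.model.encode(text).tolist()

You could then pass an instance of this class anywhere LangChain accepts an Embeddings object, for example to a vector store, and everything runs offline once the model weights are cached locally.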
-
I want to use the Jina embedding model jina-embeddings-v2-base-en LOCALLY. I can run it locally in this form:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("jinaai/jina-embeddings-v2-base-code", trust_remote_code=True)

but using the JinaEmbeddings API requires an API key, meaning it will not run locally on my machine. I was wondering how I can use this embedding model locally, the way I do with JinaEmbeddings or the HuggingFace API.