In my previous LangChain and LangGraph projects, I connected to local LLM services through OpenAI-compatible APIs, so I never had to deal with the differences between LLM inference engines (llama.cpp, vLLM, and MindIE can all expose a locally running LLM through an OpenAI-compatible API).
I hope open_deep_research can support this as well. Is this possible, and how should it be done?
from langchain_openai import ChatOpenAI

# base_url points ChatOpenAI at the local OpenAI-compatible endpoint;
# local servers usually accept any non-empty api_key.
llm = ChatOpenAI(base_url="http://10.31.13.16:8088/v1", api_key="123321", model="Qwen3-14B")
return_object = llm.invoke("Hello, please introduce yourself")
print(return_object.content)
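One possible route, assuming open_deep_research resolves its models through LangChain's init_chat_model (as many LangGraph projects do; the project's actual configuration fields may differ): select the "openai" provider and override base_url so the request goes to the local server instead of api.openai.com. A minimal sketch, where the URL, key, and model name are placeholders for a local inference server:

import os
from langchain.chat_models import init_chat_model

# Assumption: the graph builds its models via init_chat_model, so an
# OpenAI-compatible local server (llama.cpp / vLLM / MindIE) can be
# reached by picking the "openai" provider and overriding base_url.
llm = init_chat_model(
    model="Qwen3-14B",
    model_provider="openai",
    base_url="http://10.31.13.16:8088/v1",  # local endpoint, placeholder
    api_key="123321",                        # local servers rarely validate this
)
print(llm.invoke("Hello, please introduce yourself").content)

Alternatively, since the underlying openai client honors the OPENAI_BASE_URL and OPENAI_API_KEY environment variables when no explicit base_url/api_key is passed, exporting those two variables (e.g. via os.environ or the shell) should redirect every OpenAI-provider model in the project to the local server without touching the graph code, leaving only the model name to change in the configuration.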