How to connect LLM via OpenAI compatible API #195

@murarduino

Description

In my previous LangChain and LangGraph projects I was used to connecting to local LLM services through OpenAI-compatible APIs, which spared me the differences between inference engines (llama.cpp, vLLM, and MindIE all expose a locally running LLM through an OpenAI-compatible API).
I hope open_deep_research can support this as well. Is that possible, and how should it be done? For reference, this is how I connect today:

from langchain_openai import ChatOpenAI

# Any OpenAI-compatible endpoint works here (llama.cpp, vLLM, MindIE, ...).
llm = ChatOpenAI(
    base_url="http://10.31.13.16:8088/v1",  # local inference server
    api_key="123321",
    model="Qwen3-14B",
)

response = llm.invoke("Hello, please introduce yourself.")
print(response.content)
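
If open_deep_research builds its models through LangChain's init_chat_model (a common pattern in LangGraph projects; I have not verified that this repo does), I imagine the same endpoint could be wired in like this. The URL, key, and model name below are just the placeholders from my snippet above:

from langchain.chat_models import init_chat_model

# Sketch, assuming a "provider:model" string is accepted somewhere in the
# project's configuration; extra kwargs are forwarded to ChatOpenAI.
llm = init_chat_model(
    "openai:Qwen3-14B",
    base_url="http://10.31.13.16:8088/v1",  # placeholder local endpoint
    api_key="123321",                        # placeholder key
)
print(llm.invoke("Hello, please introduce yourself.").content)

Alternatively, since ChatOpenAI also picks up OPENAI_API_BASE and OPENAI_API_KEY from the environment, exporting those two variables might be enough to route every OpenAI-backed model the project creates to the local server without any code changes.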
