Add bind tools to custom BaseChatModel #26146
Replies: 4 comments 9 replies
-
To bind tools to your custom `BaseChatModel`, you can override `bind_tools` to convert the tools into the OpenAI tool format and bind them as invocation kwargs. Here's how you can implement it:

```python
from langchain_core.pydantic_v1 import BaseModel, Field
from langchain_core.language_models.chat_models import BaseChatModel
from langchain_core.utils.function_calling import convert_to_openai_tool

# Define your tools
class GetWeather(BaseModel):
    '''Get the current weather in a given location'''
    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")

class GetPopulation(BaseModel):
    '''Get the current population in a given location'''
    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")

# Your custom chat model
class GPT4oCustomChatModel(BaseChatModel):
    # ... (your existing implementation)

    def bind_tools(self, tools, **kwargs):
        formatted_tools = [convert_to_openai_tool(tool) for tool in tools]
        # BaseChatModel.bind_tools is not implemented, so bind the formatted
        # tools as invocation kwargs instead of calling super().bind_tools().
        return self.bind(tools=formatted_tools, **kwargs)

# Initialize your custom chat model
llm = GPT4oCustomChatModel(
    client_id="your_client_id",
    client_secret="your_client_secret",
    apim_Subscription_Key="your_subscription_key",
    api_url="https://api.yourservice.com/v1",
    model_name="gpt-4o",
    token_manager=TokenManager(),
    temperature=0.7,
    top_p=0.95,
    max_tokens=8000,
)

# Bind the tools to the chat model
tools = [GetWeather, GetPopulation]
llm_with_tools = llm.bind_tools(tools)

# Example usage
ai_msg = llm_with_tools.invoke("Which city is hotter today and which is bigger: LA or NY?")
print(ai_msg.tool_calls)
```

This code defines two tools, `GetWeather` and `GetPopulation`, and binds them to the custom chat model. For more advanced use cases, such as creating agents that interact with multiple tools and handle more complex workflows, you may want to explore LangChain's agent tooling.
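Note that binding alone is not enough: `bind()` only stores the kwargs on a runnable wrapper, and your custom `_generate` still has to forward the bound `tools` into the request payload you send to your REST API. Here is a minimal, self-contained sketch of that mechanism (plain Python with hypothetical names, not the actual LangChain classes):

```python
# Hypothetical sketch of why a custom chat model can "ignore" bound tools:
# bind() merely stores kwargs; _generate must forward them to the API payload.
class SketchChatModel:
    def _generate(self, messages, **kwargs):
        payload = {"model": "gpt-4o", "messages": messages}
        # Without this forwarding step, the API never sees the tool
        # definitions and the model never emits tool calls.
        if "tools" in kwargs:
            payload["tools"] = kwargs["tools"]
        return payload

    def bind(self, **bound_kwargs):
        outer = self

        class _Bound:
            def invoke(self, messages, **kwargs):
                # Bound kwargs are merged into every call, which is what
                # LangChain's RunnableBinding does under the hood.
                return outer._generate(messages, **{**bound_kwargs, **kwargs})

        return _Bound()


tool_schema = {"type": "function", "function": {"name": "GetWeather"}}
bound = SketchChatModel().bind(tools=[tool_schema])
payload = bound.invoke([{"role": "user", "content": "Weather in SF?"}])
```

If your `_generate` builds its request body without consulting `kwargs`, the bound tools silently vanish, which matches the symptom described in the replies below.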
-
Hi @santiagovasquez1, I'm facing the same issue: my custom chat model class does not recognize the presence of any tools after binding. Can you please let me know how you solved this problem? Any thoughts?
-
Hi Santiago, how is it going? Did you finally get your custom chat model working with tool binding? Is it public anywhere? I'm getting the same issues as you. Thanks in advance!
-
Copy a provider-specific `bind_tools` implementation into your custom model. For example, `BaseChatOpenAI` implements it like this (imports added for context):

```python
from typing import Any, Callable, Dict, Literal, Optional, Sequence, Type, Union

from langchain_core.language_models import BaseChatModel, LanguageModelInput
from langchain_core.messages import BaseMessage
from langchain_core.runnables import Runnable
from langchain_core.tools import BaseTool
from langchain_core.utils.function_calling import convert_to_openai_tool


class BaseChatOpenAI(BaseChatModel):
    ...

    def bind_tools(
        self,
        tools: Sequence[Union[Dict[str, Any], Type, Callable, BaseTool]],
        *,
        tool_choice: Optional[
            Union[dict, str, Literal["auto", "none", "required", "any"], bool]
        ] = None,
        strict: Optional[bool] = None,
        **kwargs: Any,
    ) -> Runnable[LanguageModelInput, BaseMessage]:
        """Bind tool-like objects to this chat model.

        Assumes model is compatible with OpenAI tool-calling API.

        Args:
            tools: A list of tool definitions to bind to this chat model.
                Supports any tool definition handled by
                :meth:`langchain_core.utils.function_calling.convert_to_openai_tool`.
            tool_choice: Which tool to require the model to call. Options are:

                - str of the form ``"<<tool_name>>"``: calls <<tool_name>> tool.
                - ``"auto"``: automatically selects a tool (including no tool).
                - ``"none"``: does not call a tool.
                - ``"any"`` or ``"required"`` or ``True``: force at least one
                  tool to be called.
                - dict of the form ``{"type": "function", "function": {"name": <<tool_name>>}}``:
                  calls <<tool_name>> tool.
                - ``False`` or ``None``: no effect, default OpenAI behavior.
            strict: If True, model output is guaranteed to exactly match the
                JSON Schema provided in the tool definition, and the input
                schema will be validated according to
                https://platform.openai.com/docs/guides/structured-outputs/supported-schemas.
                If False, neither the input schema nor the model output will be
                validated. If None, the ``strict`` argument will not be passed
                to the model.
            kwargs: Any additional parameters are passed directly to
                :meth:`~langchain_openai.chat_models.base.ChatOpenAI.bind`.

        .. versionchanged:: 0.1.21

            Support for ``strict`` argument added.
        """  # noqa: E501
        formatted_tools = [
            convert_to_openai_tool(tool, strict=strict) for tool in tools
        ]
        if tool_choice:
            if isinstance(tool_choice, str):
                # tool_choice is a tool/function name
                if tool_choice not in ("auto", "none", "any", "required"):
                    tool_choice = {
                        "type": "function",
                        "function": {"name": tool_choice},
                    }
                # 'any' is not natively supported by OpenAI API.
                # We support 'any' since other models use this instead of 'required'.
                if tool_choice == "any":
                    tool_choice = "required"
            elif isinstance(tool_choice, bool):
                tool_choice = "required"
            elif isinstance(tool_choice, dict):
                tool_names = [
                    formatted_tool["function"]["name"]
                    for formatted_tool in formatted_tools
                ]
                if not any(
                    tool_name == tool_choice["function"]["name"]
                    for tool_name in tool_names
                ):
                    raise ValueError(
                        f"Tool choice {tool_choice} was specified, but the only "
                        f"provided tools were {tool_names}."
                    )
            else:
                raise ValueError(
                    f"Unrecognized tool_choice type. Expected str, bool or dict. "
                    f"Received: {tool_choice}"
                )
            kwargs["tool_choice"] = tool_choice
        return super().bind(tools=formatted_tools, **kwargs)
```
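The `tool_choice` normalization in that method can be exercised in isolation. Below is a standalone re-implementation (a sketch for illustration, not the library function itself) that maps the accepted str/bool/dict forms onto what the OpenAI API expects:

```python
def normalize_tool_choice(tool_choice, tool_names):
    """Standalone sketch of the tool_choice normalization in bind_tools."""
    if not tool_choice:
        # False or None: no effect, leave the default OpenAI behavior.
        return None
    if isinstance(tool_choice, str):
        if tool_choice not in ("auto", "none", "any", "required"):
            # A bare tool name becomes an explicit function selection.
            tool_choice = {"type": "function", "function": {"name": tool_choice}}
        if tool_choice == "any":
            # 'any' is not natively supported by the OpenAI API;
            # translate it to 'required'.
            tool_choice = "required"
    elif isinstance(tool_choice, bool):
        tool_choice = "required"
    elif isinstance(tool_choice, dict):
        if tool_choice["function"]["name"] not in tool_names:
            raise ValueError(
                f"Tool choice {tool_choice} was specified, but the only "
                f"provided tools were {tool_names}."
            )
    else:
        raise ValueError(
            f"Unrecognized tool_choice type. Expected str, bool or dict. "
            f"Received: {tool_choice}"
        )
    return tool_choice
```

This makes the mapping easy to verify: `"any"` and `True` both collapse to `"required"`, while an unrecognized string is treated as a tool name.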
-
Description
I have a custom `BaseChatModel` that calls GPT-4o mini through a REST API (via a URL). I need to bind tools to it; how can I do that?
System Info
langchain==0.2.14
langchain-cli==0.0.30
langchain-community==0.2.12
langchain-core==0.2.35
langchain-experimental==0.0.64
langchain-openai==0.1.22
langchain-text-splitters==0.2.2