I think you can use configuration for this:

```python
from langchain_core.tools import tool
from langchain.chat_models import init_chat_model
from langgraph.prebuilt import create_react_agent

# Declare temperature as a per-call configurable field.
llm = init_chat_model(
    "openai:gpt-4o-mini",
    configurable_fields=("temperature",),
)

@tool
def get_weather(location: str):
    """Get the weather."""
    return "It's sunny."

agent = create_react_agent(llm, [get_weather])

agent.invoke(
    {"messages": "What's the weather in Boston?"},
    {"configurable": {"temperature": 100}},  # raises BadRequestError
)
```

(OpenAI rejects `temperature=100`, so the `BadRequestError` shows the override actually reached the underlying model; a valid value such as `0.2` would be accepted.)
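For intuition about why this works, the `configurable_fields` mechanism can be sketched in plain Python. This is a simplified stand-in, not LangChain's actual implementation; `FakeChatModel` and `ConfigurableModel` are hypothetical names. The wrapper reads per-call overrides from the `configurable` section of the config dict and applies only the whitelisted fields before delegating:

```python
import copy


class FakeChatModel:
    """Stand-in chat model; echoes the temperature it was invoked with."""

    def __init__(self, temperature=1.0):
        self.temperature = temperature

    def invoke(self, prompt):
        return f"reply(temp={self.temperature})"


class ConfigurableModel:
    """Wraps a model and applies whitelisted per-call overrides from config."""

    def __init__(self, model, configurable_fields):
        self.model = model
        self.configurable_fields = set(configurable_fields)

    def invoke(self, prompt, config=None):
        overrides = (config or {}).get("configurable", {})
        model = copy.copy(self.model)  # leave the shared model untouched
        for field, value in overrides.items():
            if field in self.configurable_fields:
                setattr(model, field, value)
        return model.invoke(prompt)


llm = ConfigurableModel(FakeChatModel(), configurable_fields=("temperature",))
print(llm.invoke("hi", {"configurable": {"temperature": 0.2}}))  # reply(temp=0.2)
print(llm.invoke("hi"))  # reply(temp=1.0) -- override does not persist
```

Because the override travels in the `config` dict rather than as a keyword argument, any wrapper (such as the prebuilt agent) that forwards `config` to its inner runnables passes it through without needing to know about `temperature` at all.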
Unlike `ChatModel.invoke`, the prebuilt React agent's `.invoke()` does not accept additional kwargs that flow into the underlying `ChatModel`, so parameters such as `temperature` cannot be customized per call.
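To make the contrast concrete, here is a minimal pure-Python sketch (hypothetical names, not LangChain code): a model whose `invoke` accepts keyword overrides directly, wrapped by an agent whose `invoke` only takes an input and a config dict, so keyword overrides have no path to the model:

```python
class ChatModel:
    """Stand-in model: invoke() honors keyword overrides such as temperature."""

    def invoke(self, prompt, **kwargs):
        temp = kwargs.get("temperature", 1.0)
        return f"reply(temp={temp})"


class Agent:
    """Mimics the prebuilt agent: invoke() takes an input and a config only."""

    def __init__(self, model):
        self.model = model

    def invoke(self, inputs, config=None):
        # No **kwargs parameter here, so per-call overrides like temperature
        # cannot be passed the way ChatModel.invoke allows.
        return self.model.invoke(inputs["messages"])


model = ChatModel()
print(model.invoke("hi", temperature=0.2))  # reply(temp=0.2) -- override works
agent = Agent(model)
print(agent.invoke({"messages": "hi"}))     # reply(temp=1.0) -- no way to override
```

This is why the config-based `configurable_fields` approach in the reply above is the workable route: the agent already forwards the config dict, even though it does not forward keyword arguments.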