Bug in create_react_agent with response_format?? #3845
-
Hi,
The prompt asks the agent to pick one of ENUMs in I debugged and found that two calls were happening in Adding the system prompt to the front of the messages solves the issue. But somewhere along the way the system prompt given to the llm when using response_format disappear from the final llm call. This seems like a bug, doesn't it? |
-
It's not a bug, but I can see how this can be confusing. You can pass a dedicated prompt for the structured-output LLM call by passing a tuple
`(prompt, Schema)`
to `response_format`
-- see the API reference: https://langchain-ai.github.io/langgraph/reference/prebuilt/#langgraph.prebuilt.chat_agent_executor.create_react_agent
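To make the tuple form concrete, here is a minimal, self-contained sketch of the idea: the first element of the tuple acts as a system prompt that is prepended only for the final structured-output call, while the second element is the schema. Note this is an illustration of the pattern, not LangGraph's actual internals -- `build_structured_call_messages` and the `Schema` class here are hypothetical names for demonstration.

```python
from dataclasses import dataclass


@dataclass
class Schema:
    """Hypothetical structured-output schema (e.g. your enum-like category)."""
    category: str


STRUCTURED_PROMPT = "Pick exactly one category: billing, support, or sales."

# Tuple form: (dedicated prompt, schema), as accepted by response_format.
response_format = (STRUCTURED_PROMPT, Schema)


def build_structured_call_messages(response_format, history):
    """Sketch of how a (prompt, schema) tuple could be handled: the prompt
    is prepended as a system message for the structured-output call only."""
    if isinstance(response_format, tuple):
        prompt, schema = response_format
        return [("system", prompt), *history], schema
    # Bare schema: no extra system prompt is added for the final call.
    return list(history), response_format


messages, schema = build_structured_call_messages(
    response_format, [("user", "My invoice is wrong")]
)
```

In actual usage, you would pass the tuple directly, e.g. `create_react_agent(model, tools, response_format=(STRUCTURED_PROMPT, Schema))`, rather than relying on the agent-loop system prompt reaching the structured-output call.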