langgraph/how-tos/create-react-agent-structured-output/ #3574
Replies: 9 comments 9 replies
-
I got an error.
-
For some reason the agent responds well in the message output but returns something completely different in the structured output every time. Has anyone encountered this issue?
-
Is there a JS version to achieve this using 2 LLMs? Binding the output as a tool gives me invalid_tool_calls when streaming with streamEvents.
-
Are you guys aware you can have both tools and structured output with OpenAI models? There's no need to make it a tool.
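For reference, the prebuilt agent on the linked how-to page exposes this combination directly through its response_format parameter. A minimal sketch, assuming langchain-openai and langgraph are installed; get_weather and WeatherResponse are hypothetical names used only for illustration:

```python
from pydantic import BaseModel
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

# Hypothetical tool, only for illustration.
@tool
def get_weather(city: str) -> str:
    """Return the current weather for a city."""
    return f"It is always sunny in {city}."

# Hypothetical schema for the final structured response.
class WeatherResponse(BaseModel):
    city: str
    conditions: str

model = ChatOpenAI(model="gpt-4o")

# The agent keeps its regular tools during the loop and, once it is done,
# makes one more LLM call to coerce the final answer into WeatherResponse.
agent = create_react_agent(model, tools=[get_weather], response_format=WeatherResponse)

result = agent.invoke({"messages": [("user", "What's the weather in SF?")]})
print(result["structured_response"])
```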
-
Can anyone please suggest whether there is a way to overcome the context window limit when working with SQLDatabaseToolkit? The agent runs into an error with stop reason=length because the context length is exceeded when the database returns a large result. I'd appreciate your help, thank you.
-
Hi guys, I'm working with this prebuilt agent. Configuration for the agent executor: config_1 = {"configurable": {"thread_id": "1"}}. One of the tools I'm using is defined like this: def tool_plot_field_production_rates_wrapper(args: str) -> str: and tool registration is done as follows: tools = [
Question: Is there any possibility to stop the react agent from doing this? Any help is good!
-
I am working on structured data retrieval using create_react_agent. For the first user input, it generates the response. If the second user input is something like "regenerate the SQL query", it has to call the generate_sql_query tool and regenerate the SQL query.
-
Has anyone tried using local Hugging Face models? I tried multiple LLMs with the Hugging Face pipeline, but none of them successfully called the tools. Please see my code below:
from langchain_huggingface import ChatHuggingFace, HuggingFacePipeline

# Load the local model as a text-generation pipeline.
llm = HuggingFacePipeline.from_model_id(
    model_id=model_name,
    task="text-generation",
    device_map='auto',
    pipeline_kwargs=dict(
        max_new_tokens=512,
        # do_sample=False,
        temperature=0.2,
        repetition_penalty=1.03,
    ),
)

# Wrap the pipeline in a chat model interface so it can be passed to the agent.
model = ChatHuggingFace(llm=llm)
-
In the JS version, you can use ChatOpenAI.bind to add both a structured response and a tool call. The following code shows how to create an LLM that calls a tool and binds a response format to the LLM:
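A rough sketch of that approach, assuming a recent @langchain/openai release; the getWeather tool and the weather_response schema are hypothetical, and the exact call-option names should be checked against the ChatOpenAI docs:

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { tool } from "@langchain/core/tools";
import { z } from "zod";

// Hypothetical tool, only for illustration.
const getWeather = tool(
  async ({ city }) => `It is always sunny in ${city}.`,
  {
    name: "get_weather",
    description: "Return the current weather for a city.",
    schema: z.object({ city: z.string() }),
  }
);

// Bind both the tool and an OpenAI json_schema response format in one call.
const llm = new ChatOpenAI({ model: "gpt-4o" }).bind({
  tools: [getWeather],
  response_format: {
    type: "json_schema",
    json_schema: {
      name: "weather_response",
      schema: {
        type: "object",
        properties: {
          city: { type: "string" },
          conditions: { type: "string" },
        },
        required: ["city", "conditions"],
        additionalProperties: false,
      },
    },
  },
});

const result = await llm.invoke("What's the weather in SF?");
```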
More parameters can be set on the OpenAI LLM in LangChain, including modalities and reasoning.
-
langgraph/how-tos/create-react-agent-structured-output/
Build language agents as graphs
https://langchain-ai.github.io/langgraph/how-tos/create-react-agent-structured-output/