Replies: 1 comment
-
I was also looking for a way to display the tool name during streaming in my application. This is what worked for me, using LangGraph version 0.4.7 with a graph that has nodes named "chatbot" and "tools":

```python
from langchain_core.messages import HumanMessage

for msg, metadata in app.stream(inputs, config, stream_mode="messages"):
    # Check for a non-empty AI message
    if msg.content and not isinstance(msg, HumanMessage):
        # Adjust output depending on the node name
        node_name = metadata.get("langgraph_node", None)
        if node_name == "chatbot":
            # Stream the LLM response
            print(msg.content, end="", flush=True)
        elif node_name == "tools":
            # Check that this is the final output of a tool
            # (not intermediate LLM output from inside a tool)
            if msg.name is not None:
                # Print the tool's name
                print(f"*** Called tool: {msg.name} ***")
        else:
            print(f"Unexpected node name: {node_name}")
```
-
Extract Tool Metadata in `graph.stream`

When using the `graph.stream` function with the parameter `stream_mode="messages"`, is it possible to extract metadata associated with the tools being used? This would include details such as the tool name and other relevant attributes. I am using ToolNode to handle the LLM's tool calls.
Example Usage:
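For illustration, a minimal sketch of the kind of call in question, assuming `app` is a compiled graph that routes tool calls through a ToolNode; the input message and printed fields are placeholders:

```python
from langchain_core.messages import ToolMessage

inputs = {"messages": [("user", "What's the weather in Paris?")]}
config = {}

for msg, metadata in app.stream(inputs, config, stream_mode="messages"):
    # Each item is a (message, metadata) pair; the metadata dict identifies
    # the node that emitted the message under the "langgraph_node" key.
    node = metadata.get("langgraph_node")
    if isinstance(msg, ToolMessage):
        # ToolMessage carries the tool's name and its result
        print(f"[{node}] tool {msg.name!r} returned: {msg.content}")
    elif msg.content:
        # Token-by-token LLM output
        print(f"[{node}] {msg.content}")
```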
Common Use Cases: