config with thread_id error #36

@LizhongLiu-cims

Description

I tried to run the code here: examples/langgraph-python/langgraph_e2b_python/code_interpreter_tool.py

All my code is the same as the example, except that I added a MemorySaver so the LLM has chat memory for a consistent chat experience.


import uuid

from langgraph.graph import MessageGraph
from langgraph.checkpoint.memory import MemorySaver

workflow = MessageGraph()
workflow.add_node("agent", llm.bind_tools(tools))
workflow.add_node("action", lambda x: execute_tools(x, tool_map))

# Conditional edge: agent -> action OR agent -> END
workflow.add_conditional_edges(
    "agent",
    should_continue,
)
# Always transition action -> agent
workflow.add_edge("action", "agent")

workflow.set_entry_point("agent")
memory = MemorySaver()
app = workflow.compile(checkpointer=memory)

# 4. Invoke the app
config = {"configurable": {"thread_id": str(uuid.uuid4())}}
result = app.invoke(input=("user", "please generate a random line plots"), config=config)


I got the error message below when trying to add the config argument:


venv\lib\site-packages\langgraph\serde\jsonplus.py:72, in JsonPlusSerializer._default(self, obj)
     68         return self._encode_constructor_args(
     69             obj.__class__, kwargs={"node": obj.node, "arg": obj.arg}
     70         )
     71     else:
---> 72         raise TypeError(
     73             f"Object of type {obj.__class__.__name__} is not JSON serializable"
     74         )

TypeError: Object of type Result is not JSON serializable


I suppose the error occurs because the tool output is in a format that json.dumps() cannot handle. Is there any quick way to fix this, like some parser function maybe?
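A minimal sketch of the parser-function idea, assuming the failure is the e2b Result object being stored in graph state: coerce anything that json.dumps() rejects into a string before it leaves the action node (e.g. inside execute_tools). The Result class and to_serializable helper here are hypothetical stand-ins, not part of langgraph or e2b:

```python
import json


class Result:
    """Hypothetical stand-in for the non-serializable object in the traceback."""

    def __init__(self, text):
        self.text = text

    def __str__(self):
        return self.text


def to_serializable(value):
    """Return value unchanged if the checkpointer could JSON-encode it,
    otherwise fall back to its string form."""
    try:
        json.dumps(value)
        return value
    except TypeError:
        return str(value)


# A plain dict passes through untouched; a Result is coerced to a string.
print(to_serializable({"thread": "ok"}))
print(to_serializable(Result("plot generated")))
```

Applying this to each tool result before it is appended to the message list should let MemorySaver checkpoint the state, at the cost of losing the rich Result object.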
