I'm building an AI-powered WhatsApp chatbot using Flask, LangChain, LangGraph, a Postgres database (not a vector store), and WhatsApp's Cloud API. In the workflow I'm aiming for, the tool sends the media to the user via the WhatsApp Cloud API.

The tool is defined with LangChain's `@tool` decorator like this:
```python
import requests
from langchain_core.tools import tool

# conn, WHATSAPP_PHONE_NUMBER_ID, and WHATSAPP_ACCESS_TOKEN are defined elsewhere in the app.

@tool
def RespondWithMedia(media_type: str, media_description: str, caption: str = "") -> dict:
    """Send the user WhatsApp videos.

    Args:
        media_type: 'wedding', 'anniversary', or 'birthday'.
        media_description:
            - if media_type = 'wedding' → '2d sample' OR '3d sample with caricature'
            - if anniversary/birthday → '2d with caricature'
        caption: optional caption for the video
    """
    print("RespondWithMedia Tool Running!")  # <-- debug line; it never fires, suggesting the tool is never executed
    id_list = []
    with conn.cursor() as curr:
        sql_query = """SELECT media_id FROM sample_media_library
                       WHERE media_type = %s AND media_description = %s"""
        curr.execute(sql_query, (media_type.lower(), media_description.lower()))
        result = curr.fetchall()
        for row in result:
            id_list.append(row[0])

    user_ph = RunnableConfig.get_current().get("configurable", {}).get("thread_id")
    print("Sending to Phone: ", user_ph)

    url = f"https://graph.facebook.com/v23.0/{WHATSAPP_PHONE_NUMBER_ID}/messages"
    headers = {
        "Authorization": f"Bearer {WHATSAPP_ACCESS_TOKEN}",
        "Content-Type": "application/json",
    }
    responses = []
    for media_id in id_list:
        data = {
            "messaging_product": "whatsapp",
            "recipient_type": "individual",
            "to": f"{user_ph}",
            "type": "video",
            "video": {"id": media_id, "caption": caption},
        }
        try:
            resp = requests.post(url, headers=headers, json=data)
            responses.append({"media_id": media_id, "status": resp.status_code})
        except requests.RequestException as e:
            responses.append({"media_id": media_id, "status": "failed", "error": str(e)})
    return {"results": responses}
```
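For what it's worth, the per-media request body the tool builds can be factored into a small helper and checked in isolation; this is a hypothetical helper mirroring the fields above, not code from the app:

```python
def build_video_payload(user_ph: str, media_id: str, caption: str = "") -> dict:
    """Mirror the body posted to the Cloud API /messages endpoint for one media id."""
    return {
        "messaging_product": "whatsapp",
        "recipient_type": "individual",
        "to": f"{user_ph}",
        "type": "video",
        "video": {"id": media_id, "caption": caption},
    }

payload = build_video_payload("15551234567", "MID123", caption="Sample")
```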
```python
prompt_template = ChatPromptTemplate.from_messages([
    ("system", "{system_message}"),
    MessagesPlaceholder("messages"),
])

tool_node = ToolNode([RespondWithMedia])
llm = init_chat_model("openai:gpt-4.1")
llm_model = llm.bind_tools([RespondWithMedia])
agent = prompt_template | llm_model

with open("sys_prompt.txt") as f:
    SYSTEM_PROMPT = f.read()

def should_continue(state: State):
    messages = state["messages"]
    last = messages[-1]
    if hasattr(last, "tool_calls") and last.tool_calls:
        return "tool"
    return END

def chatbot(state: State):
    ai_resp = agent.invoke({
        "system_message": SYSTEM_PROMPT,
        "messages": state["messages"]
    })
    print(ai_resp)
    return {"messages": [ai_resp]}

graph_builder = StateGraph(State)
graph_builder.add_node("chatbot", chatbot)
graph_builder.add_node("tool", tool_node)
graph_builder.add_edge(START, "chatbot")
graph_builder.add_conditional_edges("chatbot", should_continue, ["tool", END])
graph_builder.add_edge("tool", "chatbot")
graph = graph_builder.compile(checkpointer=checkpointer)

def stream_graph_updates(user_ph: int, user_input: str) -> dict:
    for event in graph.stream(
        {"messages": [{"role": "user", "content": user_input}]},
        config={"configurable": {"thread_id": user_ph}},
    ):
        for value in event.values():
            print(value["messages"][-1])
            ai_resp = value["messages"][-1].content
            metadata = value["messages"][-1].usage_metadata
            resp = {
                "content": ai_resp,
                "metadata": metadata,
            }
    return resp
```
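The routing condition can be exercised outside the graph with stand-in messages, just to confirm the check itself behaves; here `SimpleNamespace` stands in for an `AIMessage` and the string `"__end__"` for langgraph's `END` sentinel:

```python
from types import SimpleNamespace

END = "__end__"  # stand-in for langgraph's END sentinel

def should_continue(state):
    last = state["messages"][-1]
    if hasattr(last, "tool_calls") and last.tool_calls:
        return "tool"
    return END

with_call = {"messages": [SimpleNamespace(tool_calls=[{"name": "RespondWithMedia"}])]}
no_call = {"messages": [SimpleNamespace(tool_calls=[])]}
route_a = should_continue(with_call)  # should route to the tool node
route_b = should_continue(no_call)    # should end the turn
```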
Values like `user_ph` and `user_input` are passed into `stream_graph_updates()` from the payload received on the Flask webhook.

When I ask the bot on WhatsApp to send me sample files for a wedding, it does nothing, but on the backend the logs suggest that the bot does generate the correct arguments for the tool.
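For context, the webhook handler extracts those values from the Cloud API payload roughly like this — a hedged sketch using the field names from Meta's webhook format, not the app's actual handler:

```python
def extract_message(payload: dict):
    """Pull sender phone and message text out of a Cloud API webhook payload."""
    value = payload["entry"][0]["changes"][0]["value"]
    msg = value["messages"][0]
    return msg["from"], msg.get("text", {}).get("body", "")

# Minimal example payload in the Cloud API webhook shape.
sample = {
    "entry": [{
        "changes": [{
            "value": {
                "messages": [{
                    "from": "15551234567",
                    "type": "text",
                    "text": {"body": "send me wedding samples"},
                }]
            }
        }]
    }]
}
user_ph, user_input = extract_message(sample)
```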
The arguments perfectly match the values in the database, so I don't suspect that the query fails.
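To rule out the lookup independently of LangGraph, the same query can be replayed against an in-memory SQLite stand-in (note `sqlite3` uses `?` placeholders where psycopg uses `%s`; the table contents here are made up):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE sample_media_library (media_id TEXT, media_type TEXT, media_description TEXT)"
)
db.execute("INSERT INTO sample_media_library VALUES ('MID123', 'wedding', '2d sample')")

# Same lookup the tool performs, with lower-cased arguments.
curr = db.execute(
    "SELECT media_id FROM sample_media_library WHERE media_type = ? AND media_description = ?",
    ("Wedding".lower(), "2D Sample".lower()),
)
id_list = [row[0] for row in curr.fetchall()]
```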