Is compiled graph thread-safe in Langgraph? #23630
-
Example Code

    from fastapi import FastAPI
    from langchain_community.retrievers import TavilySearchAPIRetriever
    from langchain.chat_models import ChatOpenAI
    from your_module import create_react_agent  # Import your create_react_agent function
    from pydantic import BaseModel

    app = FastAPI()

    class Question(BaseModel):
        question: str

    # Global variable to store the chatbot
    chatbot = None

    @app.on_event("startup")
    async def startup_event():
        global chatbot
        model = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
        chatbot = create_react_agent(model, [TavilySearchAPIRetriever()])

    @app.post("/ask")
    async def ask_question(question: Question):
        response = chatbot.invoke({"question": question.question})
        return {"answer": response}

    if __name__ == "__main__":
        import uvicorn
        uvicorn.run(app, host="0.0.0.0", port=8000)

Description

I just wanted to validate whether it's OK to initialize/compile the graph once and then use it to serve multiple parallel requests in a web application. In other words, is the shared state passed from node to node thread-safe?

System Info

pip install langgraph tavily-python
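One way to probe the question empirically (a sketch, not LangGraph internals — `StatelessGraph` here is a hypothetical stand-in for a compiled graph) is to hammer one shared instance from many threads and check that per-invocation state never leaks between calls:

```python
from concurrent.futures import ThreadPoolExecutor

class StatelessGraph:
    """Hypothetical stand-in for a compiled graph: configuration is
    read-only; all mutable state lives in the per-call `state` dict."""
    def __init__(self, suffix):
        self.suffix = suffix  # read-only after construction

    def invoke(self, state):
        # Work only on a copy of the caller's state; never write to self.
        local = dict(state)
        local["answer"] = local["question"] + self.suffix
        return local

graph = StatelessGraph("!")  # built once, shared by all threads

def ask(i):
    return graph.invoke({"question": f"q{i}"})["answer"]

with ThreadPoolExecutor(max_workers=32) as pool:
    results = list(pool.map(ask, range(1000)))

# Every call sees only its own state, so results are deterministic.
assert results == [f"q{i}!" for i in range(1000)]
print("no cross-talk between concurrent invocations")
```

If the shared object held mutable per-call state, this loop would produce mismatched answers under load; with state passed in per call, it cannot.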
Replies: 4 comments 2 replies
-
Hey @zhaidarbek! I'm here to help you with any questions or issues you have while waiting for a human maintainer. Let's tackle this together! The LangGraph framework does not explicitly mention thread safety for compiled graphs. However, based on the provided information and the general principles of concurrency, here are some considerations:
Given the example code you provided, here are some specific points:
In summary, while the LangGraph framework does not explicitly guarantee thread safety for compiled graphs, careful design and testing can help ensure that your application handles concurrent requests safely.
-
I had the same question and asked in langchain-ai/langgraph#1211 since I didn't see a response here.
-
From here: "It is entirely safe to share a graph between executions, whether they happen concurrently or not, whether in same thread or not. No state is ever stored on the graph instance, and the graph instance isn't ever mutated in any way during any execution of the graph"
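The guarantee above rests on a specific design property: the compiled graph holds only immutable configuration, and all mutable state is threaded through each call as an argument. For contrast, here is a sketch of the anti-pattern the quote rules out (`MutableGraph` is hypothetical, not LangGraph code) — storing per-call state on the shared instance, where concurrent callers can observe each other's writes:

```python
import time
import threading

class MutableGraph:
    """Anti-pattern: stashes per-call state on the shared instance."""
    def invoke(self, question):
        self.current = question  # shared attribute -> data race
        time.sleep(0.01)         # another thread can overwrite it here
        return self.current

g = MutableGraph()
seen = []

def call(q):
    seen.append((q, g.invoke(q)))

threads = [threading.Thread(target=call, args=(f"q{i}",)) for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# With state on the instance, an answer can belong to the wrong question.
mismatches = [pair for pair in seen if pair[0] != pair[1]]
print(f"{len(mismatches)} of {len(seen)} calls returned another caller's state")
```

Because LangGraph's compiled graph takes the opposite approach (no writes to the instance during execution), sharing one instance across requests avoids this failure mode by construction.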
-
What would happen if you hit it with multiple threads at the same instant?
Rupinder
Quoting Joshua Carroll's earlier reply:
Can you say more about the concern? I believe the global variable is only being initialized and written once during the FastAPI startup event, and then invoked whenever the route is called.
The statement from the LangChain team is that invocation itself should be thread-safe. So it's not obvious to me that it would ever be overwritten. But maybe I'm missing something.
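The ordering Joshua describes can be made concrete: FastAPI runs the startup hook to completion before any request handler executes, so the global is effectively write-once. A stdlib sketch of the same pattern (hypothetical names, no FastAPI involved — `ready` plays the role of "the server has started accepting requests"):

```python
import threading

chatbot = None
ready = threading.Event()

def startup():
    global chatbot
    chatbot = lambda q: f"answer to {q}"  # initialize exactly once
    ready.set()                           # only then let handlers proceed

def handler(q, out):
    ready.wait()            # handlers never run before startup finishes
    out.append(chatbot(q))  # read-only access to the shared global

out = []
workers = [threading.Thread(target=handler, args=(f"q{i}", out)) for i in range(4)]
for t in workers:
    t.start()
startup()
for t in workers:
    t.join()
print(sorted(out))
```

Since the global is written once before any reader touches it and never reassigned afterward, concurrent handlers cannot observe a partially initialized or overwritten value.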