Is a LangGraph compiled graph thread-safe / advised for concurrent use? #1211
-
I just wanted to validate that it's OK to initialize/compile the graph once and then use it to serve multiple parallel requests in a web application. In other words, is the shared state passed from node to node thread-safe? E.g. is it advisable to do something like this (kinda pseudo-code)? Or is there an advantage to re-compiling the graph for each new request?

```python
from fastapi import FastAPI
from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent
from pydantic import BaseModel

app = FastAPI()

class Question(BaseModel):
    question: str

# Global variable to store the chatbot
chatbot = None

@app.on_event("startup")
async def startup_event():
    global chatbot
    model = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
    chatbot = create_react_agent(model, [])

@app.post("/ask")
async def ask_question(question: Question):
    # ainvoke is a coroutine, so it must be awaited
    response = await chatbot.ainvoke({"messages": [HumanMessage(content=question.question)]})
    return {"answer": response}

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8000)
```

(Also asked in langchain-ai/langchain#23630)
-
It is entirely safe to share a graph between executions, whether they happen concurrently or not, and whether in the same thread or not. No state is ever stored on the graph instance, and the graph instance is never mutated in any way during any execution of the graph.
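To illustrate why this sharing is safe, here is a minimal stdlib-only sketch of the pattern the answer describes (the `CompiledGraph` class, node function, and state keys are hypothetical stand-ins, not the real LangGraph API): the "compiled graph" holds only immutable configuration, and each invocation's state is a local value of that call, so one instance can serve many concurrent requests.

```python
import asyncio

class CompiledGraph:
    """A toy model of a compiled graph: configuration is fixed at
    'compile' time and never mutated by an invocation."""

    def __init__(self, nodes):
        self._nodes = nodes  # read-only after construction

    async def ainvoke(self, state):
        # State flows node to node as a local variable of this call;
        # nothing is written back onto the graph instance.
        for node in self._nodes:
            state = await node(state)
        return state

async def answer_node(state):
    await asyncio.sleep(0)  # yield so concurrent invocations interleave
    return {**state, "answer": f"echo: {state['question']}"}

# "Compiled" once, then shared by every request.
graph = CompiledGraph([answer_node])

async def main():
    # Three concurrent invocations against the same shared instance;
    # each carries its own independent state dict.
    return await asyncio.gather(
        *(graph.ainvoke({"question": f"q{i}"}) for i in range(3))
    )

print(asyncio.run(main()))
```

Each result contains only its own question and answer, showing that concurrent invocations never leak state through the shared instance.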