Massive Parallel Graph Execution with Batch APIs #3595
EliezerIsrael started this conversation in Ideas
I've put together a LangGraph StateGraph that takes a foreign-language word and its context and iterates on dictionary tool calls until it determines the right dictionary entry. It works well, word by word.
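For concreteness, the graph looks roughly like this. The state fields, node bodies, and the dictionary tool are simplified stand-ins, not my actual implementation:

```python
from typing import TypedDict

from langgraph.graph import StateGraph, START, END


class WordState(TypedDict):
    word: str             # the foreign-language word
    context: str          # the sentence it appeared in
    lookups: list[dict]   # dictionary tool results gathered so far
    entry: dict | None    # the entry the model eventually settles on


def propose_lookup(state: WordState) -> dict:
    # Placeholder: the real node is an LLM call that either picks an entry
    # or requests another dictionary lookup.
    if len(state["lookups"]) >= 1:
        return {"entry": state["lookups"][-1]}
    return {}


def run_dictionary_tool(state: WordState) -> dict:
    # Placeholder for the dictionary tool call.
    return {"lookups": state["lookups"] + [{"word": state["word"], "sense": "stub"}]}


def is_resolved(state: WordState) -> str:
    return "done" if state.get("entry") else "lookup"


builder = StateGraph(WordState)
builder.add_node("propose", propose_lookup)
builder.add_node("lookup", run_dictionary_tool)
builder.add_edge(START, "propose")
builder.add_conditional_edges("propose", is_resolved, {"done": END, "lookup": "lookup"})
builder.add_edge("lookup", "propose")
graph = builder.compile()
```

`graph.invoke()` on a single initial state then resolves one word end to end.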
I want to run it against a huge corpus of many hundreds of thousands of words, and I'd like to use a batch API like the Anthropic Batch API, which can execute thousands of calls in parallel, but asynchronously.
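For reference, the batch flow I have in mind is roughly the following, using the Python SDK's Message Batches endpoint. The model name, IDs, and prompt are placeholders:

```python
import time

import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# One request per paused graph; custom_id lets me route each result back later.
batch = client.messages.batches.create(
    requests=[
        {
            "custom_id": "word-0",  # placeholder ID
            "params": {
                "model": "claude-3-5-sonnet-latest",  # placeholder model
                "max_tokens": 1024,
                "messages": [{"role": "user", "content": "Which entry fits here? ..."}],
            },
        },
    ]
)

# Batches complete asynchronously, so poll (or come back much later).
while client.messages.batches.retrieve(batch.id).processing_status != "ended":
    time.sleep(60)

# Stream the per-request results and key them by custom_id.
results = {}
for item in client.messages.batches.results(batch.id):
    if item.result.type == "succeeded":
        results[item.custom_id] = item.result.message
```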
Effectively, I'd like to run many thousands of graphs simultaneously: issue the current LLM calls across all of the graphs as one batch, collect the responses some time later, advance each graph to its next step, and repeat until they all resolve.
I can imagine a way to handle it: run each graph until it reaches an LLM call, batch the pending calls, resume execution when the batch completes, rinse and repeat.
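Concretely, something like the sketch below, assuming the graph's LLM node is rewritten to pause with LangGraph's `interrupt()` and the graph is compiled with a checkpointer. `build_prompt`, `parse_llm_reply`, `submit_batch`, and `fetch_batch_results` are hypothetical helpers (the last two would wrap the Batch API calls shown earlier), and `WordState`/`builder` are from the sketch above:

```python
from langgraph.checkpoint.memory import MemorySaver
from langgraph.types import Command, interrupt


def propose_lookup(state: WordState) -> dict:
    # Same node as in the earlier sketch, but instead of calling the model
    # inline it hands the prompt to the driver and pauses this graph until
    # the batched response comes back.
    reply = interrupt({"prompt": build_prompt(state)})  # build_prompt: hypothetical
    return parse_llm_reply(reply)                       # parse_llm_reply: hypothetical


# builder is the StateGraph from the earlier sketch, with this version of the
# LLM node registered; the checkpointer gives each word durable state.
graph = builder.compile(checkpointer=MemorySaver())


def run_corpus(initial_states: list[WordState]) -> None:
    configs = {
        f"word-{i}": {"configurable": {"thread_id": f"word-{i}"}}
        for i in range(len(initial_states))
    }

    # First pass: every graph runs until it hits interrupt() in the LLM node.
    for (wid, cfg), init in zip(configs.items(), initial_states):
        graph.invoke(init, config=cfg)

    while True:
        # Collect the prompt from every thread that is paused on an interrupt.
        pending = {}
        for wid, cfg in configs.items():
            for task in graph.get_state(cfg).tasks:
                for intr in task.interrupts:
                    pending[wid] = intr.value["prompt"]
        if not pending:
            break  # every graph has reached END

        batch_id = submit_batch(pending)         # hypothetical: wraps batches.create
        replies = fetch_batch_results(batch_id)  # hypothetical: polls until "ended"

        # Resume each paused graph with its own reply; it runs until the next
        # LLM call (another interrupt) or until its word is resolved.
        for wid, reply in replies.items():
            graph.invoke(Command(resume=reply), config=configs[wid])
```

The checkpointer is what would make this workable: with a persistent checkpointer instead of MemorySaver, each word's state would survive however long the batch takes, so submitting and collecting wouldn't even have to happen in the same process.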
I imagine this is a recurring pattern for large-scale async workloads. Is there any kind of scaffolding for this already built? I'm going to solve this one way or another; if I'm barking up the right tree here, I'm happy to take some architectural notes to make the solution more broadly usable.
Replies: 1 comment 1 reply

Hi, this is a question we think a bunch about.