Support for Batch Processing with Batch API on LiteLLM? #8958
-
Does LiteLLM support Batch Processing? Azure and OpenAI offer a Batch API that reduces inference costs by 50%. It is unclear from the LiteLLM docs whether this is supported, or whether Batching Completion is a different feature that merely has a similar name. I searched the issues and discussions, but given the naming overlap my search did not yield meaningful results.
-
Hi @rodrigobdz, the Batch API is supported: https://docs.litellm.ai/docs/batches
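For anyone landing here, a minimal sketch of what using the Batch API through LiteLLM can look like. This assumes a LiteLLM proxy running locally (the `base_url`, API key, and `gpt-4o-mini` model alias below are placeholders, not confirmed values); the upload/submit calls use the standard OpenAI client, since the proxy exposes an OpenAI-compatible Batch endpoint per the linked docs.

```python
import json

def make_batch_line(custom_id: str, model: str, user_message: str) -> str:
    """Build one JSONL line in the OpenAI Batch API request format."""
    request = {
        "custom_id": custom_id,            # your identifier, echoed in results
        "method": "POST",
        "url": "/v1/chat/completions",     # endpoint each request targets
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": user_message}],
        },
    }
    return json.dumps(request)

# Write the batch input file: one JSON request per line.
lines = [
    make_batch_line(f"req-{i}", "gpt-4o-mini", f"Summarize item {i}")
    for i in range(3)
]
with open("batch_input.jsonl", "w") as f:
    f.write("\n".join(lines) + "\n")

# Upload and submit via an OpenAI-compatible client pointed at the
# LiteLLM proxy (base_url and key are assumptions for illustration):
#
#   from openai import OpenAI
#   client = OpenAI(base_url="http://localhost:4000", api_key="sk-...")
#   batch_file = client.files.create(
#       file=open("batch_input.jsonl", "rb"), purpose="batch")
#   batch = client.batches.create(
#       input_file_id=batch_file.id,
#       endpoint="/v1/chat/completions",
#       completion_window="24h")
```

Results arrive asynchronously; you poll the batch until it completes and then download its output file, matching responses back to your `custom_id`s.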
-
Hi @krrishdholakia, is batch processing also supported on Anthropic? If not, are there any plans to support it? Thank you!
-
Is there support for using batch processing in the Python SDK? |