Description
What are you really trying to do?
Using the OpenAI Agents SDK with Temporal and LiteLLM against AWS Bedrock, for example with the Anthropic models.
Describe the bug
When AWS Bedrock rate-limits a request, LiteLLM raises a litellm.exceptions.ServiceUnavailableError.
For example:
litellm.ServiceUnavailableError: BedrockException - {"message":"Model is getting throttled. Try your request again."}
This is a retryable error and should trigger a retry of the activity. Instead, it results in this error:
temporalio.exceptions.ApplicationError: Non retryable OpenAI status code
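For reference, this is roughly the behaviour I would expect instead: a minimal sketch (not the SDK's actual code) in which the LiteLLM throttling error is surfaced as a retryable ApplicationError so the activity retry policy can back off and try again. The activity name, model id, and wrapper logic here are illustrative assumptions.

```python
import litellm
from litellm.exceptions import ServiceUnavailableError
from temporalio import activity
from temporalio.exceptions import ApplicationError


@activity.defn
async def invoke_model(prompt: str) -> str:
    """Illustrative activity wrapping a LiteLLM call to Bedrock."""
    try:
        response = await litellm.acompletion(
            # Illustrative Bedrock model id routed through LiteLLM.
            model="bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0",
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content or ""
    except ServiceUnavailableError as err:
        # Bedrock throttling (HTTP 503) is transient, so re-raise it as a
        # *retryable* ApplicationError and let the activity retry policy
        # handle the backoff, instead of failing the workflow.
        raise ApplicationError(
            str(err),
            type="ServiceUnavailableError",
            non_retryable=False,
        ) from err
```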
Minimal Reproduction
Set up one of the OpenAI Agents SDK Temporal samples and switch it to the LiteLLM provider. Use a model that is heavily throttled and wait until you get throttled/rate limited. A sketch of the agent wiring follows.
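For context, this is a minimal sketch of the agent setup involved, assuming the openai-agents LiteLLM extension; the Bedrock model id and the loop used to force throttling are illustrative. In the Temporal samples the agent runs inside a workflow, with the model call executed in an activity.

```python
import asyncio

from agents import Agent, Runner
from agents.extensions.models.litellm_model import LitellmModel

# Illustrative Bedrock model id routed through LiteLLM.
agent = Agent(
    name="bedrock-assistant",
    instructions="You are a helpful assistant.",
    model=LitellmModel(model="bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0"),
)


async def main() -> None:
    # Send enough requests to hit Bedrock's throttling limits; at that point
    # LiteLLM raises ServiceUnavailableError.
    for _ in range(50):
        result = await Runner.run(agent, "Say hello in one sentence.")
        print(result.final_output)


if __name__ == "__main__":
    asyncio.run(main())
```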
Environment/Versions
- Temporal Python SDK: 1.18.1