
Commit 93af8fd

[QA] E2E - Testing for bedrock batches api (#14525)

* add bedrock/batch-anthropic.claude-3-5-sonnet-20240620-v1:0
* test_bedrock_batches_api
* fix
* fix import
* test_bedrock_batches_api

1 parent 93d6e9b commit 93af8fd

File tree

7 files changed: +54 −184 lines changed

docs/my-website/docs/batches.md

Lines changed: 1 addition & 2 deletions

```diff
@@ -7,7 +7,7 @@ Covers Batches, Files
 
 | Feature | Supported | Notes |
 |-------|-------|-------|
-| Supported Providers | OpenAI, Azure, Vertex, Bedrock | - |
+| Supported Providers | OpenAI, Azure, Vertex | - |
 | ✨ Cost Tracking || LiteLLM Enterprise only |
 | Logging || Works across all logging integrations |
 
@@ -178,7 +178,6 @@ print("list_batches_response=", list_batches_response)
 ### [Azure OpenAI](./providers/azure#azure-batches-api)
 ### [OpenAI](#quick-start)
 ### [Vertex AI](./providers/vertex#batch-apis)
-### [Bedrock](./providers/bedrock_batches)
 
 
 ## How Cost Tracking for Batches API Works
```

docs/my-website/docs/providers/bedrock_batches.md

Lines changed: 0 additions & 180 deletions
This file was deleted.

docs/my-website/sidebars.js

Lines changed: 0 additions & 1 deletion

```diff
@@ -410,7 +410,6 @@ const sidebars = {
       items: [
         "providers/bedrock",
         "providers/bedrock_agents",
-        "providers/bedrock_batches",
         "providers/bedrock_vector_store",
       ]
     },
```

litellm/proxy/example_config_yaml/oai_misc_config.yaml

Lines changed: 13 additions & 0 deletions

```diff
@@ -18,6 +18,19 @@ model_list:
     litellm_params:
       model: "groq/*"
       api_key: os.environ/GROQ_API_KEY
+  - model_name: bedrock/batch-anthropic.claude-3-5-sonnet-20240620-v1:0
+    litellm_params:
+      model: bedrock/us.anthropic.claude-3-5-sonnet-20240620-v1:0
+      #########################################################
+      ########## batch specific params ########################
+      s3_bucket_name: litellm-proxy
+      s3_region_name: us-west-2
+      s3_access_key_id: os.environ/AWS_ACCESS_KEY_ID
+      s3_secret_access_key: os.environ/AWS_SECRET_ACCESS_KEY
+      aws_batch_role_arn: arn:aws:iam::888602223428:role/service-role/AmazonBedrockExecutionRoleForAgents_BB9HNW6V4CV
+    model_info:
+      mode: batch
+
 litellm_settings:
   # set_verbose: True # Uncomment this if you want to see verbose logs; not recommended in production
   drop_params: True
```
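Values of the form `os.environ/VAR_NAME` in the config above are references to environment variables rather than literal strings, so secrets like AWS keys never appear in the YAML itself. As an illustrative sketch (not LiteLLM's actual implementation), resolving such a value looks like this:

```python
import os


def resolve_secret(value):
    """If a config value uses the 'os.environ/VAR_NAME' convention,
    return the named environment variable's contents; otherwise
    return the value unchanged. Illustrative sketch only."""
    prefix = "os.environ/"
    if isinstance(value, str) and value.startswith(prefix):
        return os.environ[value[len(prefix):]]
    return value


# Literal values pass through untouched; prefixed values are looked up.
os.environ["AWS_ACCESS_KEY_ID"] = "AKIA-example-not-real"
print(resolve_secret("os.environ/AWS_ACCESS_KEY_ID"))  # prints the env var's value
print(resolve_secret("us-west-2"))                      # prints "us-west-2"
```

This is why the example config can be committed to the repo: only the references are checked in, and the real credentials are injected at deploy time.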

litellm/proxy/proxy_config.yaml

Lines changed: 1 addition & 1 deletion

```diff
@@ -10,4 +10,4 @@ model_list:
       s3_secret_access_key: os.environ/AWS_SECRET_ACCESS_KEY
       aws_batch_role_arn: arn:aws:iam::888602223428:role/service-role/AmazonBedrockExecutionRoleForAgents_BB9HNW6V4CV
     model_info:
-      mode: batch
+      mode: batch
```
Lines changed: 2 additions & 0 deletions (new file)

```diff
@@ -0,0 +1,2 @@
+{"custom_id": "request-1", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "bedrock/us.anthropic.claude-3-5-sonnet-20240620-v1:0", "messages": [{"role": "system", "content": "You are a helpful assistant."},{"role": "user", "content": "Hello world!"}],"max_tokens": 10}}
+{"custom_id": "request-2", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "bedrock/us.anthropic.claude-3-5-sonnet-20240620-v1:0", "messages": [{"role": "system", "content": "You are an unhelpful assistant."},{"role": "user", "content": "Hello world!"}],"max_tokens": 10}}
```
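Each line of the batch input file is a standalone JSON object in the OpenAI batch-request shape: a unique `custom_id`, the HTTP `method` and `url`, and the chat-completion `body`. A fixture like this can be generated programmatically; the helper below is a sketch (the function name and output path are illustrative, not part of the commit):

```python
import json


def batch_request(custom_id, system_prompt, user_prompt, max_tokens=10):
    """Build one batch-request line targeting the chat completions endpoint."""
    return {
        "custom_id": custom_id,
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": "bedrock/us.anthropic.claude-3-5-sonnet-20240620-v1:0",
            "messages": [
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": user_prompt},
            ],
            "max_tokens": max_tokens,
        },
    }


requests = [
    batch_request("request-1", "You are a helpful assistant.", "Hello world!"),
    batch_request("request-2", "You are an unhelpful assistant.", "Hello world!"),
]

# JSONL: one json.dumps() per line, newline-terminated.
with open("bedrock_batch_completions.jsonl", "w") as f:
    for req in requests:
        f.write(json.dumps(req) + "\n")
```

Keeping `custom_id` unique per line matters: it is the only key that lets you match results in the batch output file back to their originating requests.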
Lines changed: 37 additions & 0 deletions (new test file)

```python
from openai import OpenAI
import pytest

client = OpenAI(
    base_url="http://0.0.0.0:4000",
    api_key="sk-1234",
)


BEDROCK_BATCH_MODEL = "bedrock/batch-anthropic.claude-3-5-sonnet-20240620-v1:0"


@pytest.mark.asyncio
async def test_bedrock_batches_api():
    """
    Test bedrock batches api

    E2E Test Creating a File and a Batch on Bedrock
    """
    # Upload file
    batch_input_file = client.files.create(
        file=open("tests/openai_endpoints_tests/bedrock_batch_completions.jsonl", "rb"),
        purpose="batch",
        extra_body={"target_model_names": BEDROCK_BATCH_MODEL},
    )
    print(batch_input_file)

    # Create batch
    batch = client.batches.create(
        input_file_id=batch_input_file.id,
        endpoint="/v1/chat/completions",
        completion_window="24h",
        metadata={"description": "Test batch job"},
    )
    print(batch)

    assert batch.id is not None
```
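The test above only asserts that the batch was created (`batch.id is not None`); it does not wait for the batch to run, which makes sense for CI given the 24h completion window. A caller that needs the results would poll `client.batches.retrieve` until the batch reaches a terminal status. The helper below is a hedged sketch of that pattern, not part of the commit; the status names follow the OpenAI Batches API:

```python
import time

# Statuses after which an OpenAI-style batch will not change further.
TERMINAL_STATUSES = {"completed", "failed", "expired", "cancelled"}


def wait_for_batch(client, batch_id, poll_interval=30.0, timeout=24 * 3600):
    """Poll a batch until it reaches a terminal status or the timeout expires.

    `client` is any object exposing `batches.retrieve(batch_id)`, e.g. an
    `openai.OpenAI` client pointed at the proxy as in the test above.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        batch = client.batches.retrieve(batch_id)
        if batch.status in TERMINAL_STATUSES:
            return batch
        time.sleep(poll_interval)
    raise TimeoutError(f"batch {batch_id} did not finish within {timeout}s")
```

Once the returned batch's status is `"completed"`, its `output_file_id` can be downloaded via the files endpoint to recover per-request results keyed by `custom_id`.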
