Commit e76ed29

DennisTraub authored and monadierickx committed
Python: Bedrock Runtime document understanding examples (awsdocs#7446)
1 parent 28a58ad commit e76ed29

File tree

11 files changed: +475 −3 lines changed

.doc_gen/metadata/bedrock-runtime_metadata.yaml

Lines changed: 103 additions & 0 deletions
```diff
@@ -1530,6 +1530,109 @@ bedrock-runtime_InvokeModelWithResponseStream_TitanTextEmbeddings:
   services:
     bedrock-runtime: {InvokeModel}
 
+# Document understanding
+bedrock-runtime_DocumentUnderstanding_AmazonNova:
+  title: Send and process a document with Amazon Nova on &BR;
+  title_abbrev: "Document understanding"
+  synopsis: send and process a document with Amazon Nova on &BR;.
+  category: Amazon Nova
+  languages:
+    Python:
+      versions:
+        - sdk_version: 3
+          github: python/example_code/bedrock-runtime
+          excerpts:
+            - description: Send and process a document with Amazon Nova on &BR;.
+              snippet_tags:
+                - python.example_code.bedrock-runtime.DocumentUnderstanding_AmazonNovaText
+  services:
+    bedrock-runtime: {Converse}
+
+bedrock-runtime_DocumentUnderstanding_AnthropicClaude:
+  title: Send and process a document with Anthropic Claude on &BR;
+  title_abbrev: "Document understanding"
+  synopsis: send and process a document with Anthropic Claude on &BR;.
+  category: Anthropic Claude
+  languages:
+    Python:
+      versions:
+        - sdk_version: 3
+          github: python/example_code/bedrock-runtime
+          excerpts:
+            - description: Send and process a document with Anthropic Claude on &BR;.
+              snippet_tags:
+                - python.example_code.bedrock-runtime.DocumentUnderstanding_AnthropicClaude
+  services:
+    bedrock-runtime: {Converse}
+
+bedrock-runtime_DocumentUnderstanding_CohereCommand:
+  title: Send and process a document with Cohere Command models on &BR;
+  title_abbrev: "Document understanding"
+  synopsis: send and process a document with Cohere Command models on &BR;.
+  category: Cohere Command
+  languages:
+    Python:
+      versions:
+        - sdk_version: 3
+          github: python/example_code/bedrock-runtime
+          excerpts:
+            - description: Send and process a document with Cohere Command models on &BR;.
+              snippet_tags:
+                - python.example_code.bedrock-runtime.DocumentUnderstanding_CohereCommand
+  services:
+    bedrock-runtime: {Converse}
+
+bedrock-runtime_DocumentUnderstanding_DeepSeek:
+  title: Send and process a document with DeepSeek on &BR;
+  title_abbrev: "Document understanding"
+  synopsis: send and process a document with DeepSeek on &BR;.
+  category: DeepSeek
+  languages:
+    Python:
+      versions:
+        - sdk_version: 3
+          github: python/example_code/bedrock-runtime
+          excerpts:
+            - description: Send and process a document with DeepSeek on &BR;.
+              snippet_tags:
+                - python.example_code.bedrock-runtime.DocumentUnderstanding_DeepSeek
+  services:
+    bedrock-runtime: {Converse}
+
+bedrock-runtime_DocumentUnderstanding_MetaLlama:
+  title: Send and process a document with Llama on &BR;
+  title_abbrev: "Document understanding"
+  synopsis: send and process a document with Llama on &BR;.
+  category: Meta Llama
+  languages:
+    Python:
+      versions:
+        - sdk_version: 3
+          github: python/example_code/bedrock-runtime
+          excerpts:
+            - description: Send and process a document with Llama on &BR;.
+              snippet_tags:
+                - python.example_code.bedrock-runtime.DocumentUnderstanding_MetaLlama
+  services:
+    bedrock-runtime: {Converse}
+
+bedrock-runtime_DocumentUnderstanding_Mistral:
+  title: Send and process a document with Mistral models on &BR;
+  title_abbrev: "Document understanding"
+  synopsis: send and process a document with Mistral models on &BR;.
+  category: Mistral AI
+  languages:
+    Python:
+      versions:
+        - sdk_version: 3
+          github: python/example_code/bedrock-runtime
+          excerpts:
+            - description: Send and process a document with Mistral models on &BR;.
+              snippet_tags:
+                - python.example_code.bedrock-runtime.DocumentUnderstanding_Mistral
+  services:
+    bedrock-runtime: {Converse}
+
 # Tool use scenarios
 bedrock-runtime_Scenario_ToolUseDemo_AmazonNova:
   title: "A tool use demo illustrating how to connect AI models on &BR; with a custom tool or API"
```
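Each `snippet_tags` value in the metadata above must correspond to a `snippet-start:[…]` marker in exactly one source file, or the docs build has a dangling reference. A minimal sanity-check sketch (the function names and the idea of running this as a repository check are assumptions, not part of this commit):

```python
import re

# Matches markers such as:
# snippet-start:[python.example_code.bedrock-runtime.DocumentUnderstanding_DeepSeek]
SNIPPET_START = re.compile(r"snippet-start:\[([^\]]+)\]")


def extract_snippet_tags(source: str) -> set:
    """Collect every snippet-start tag declared in a source file's text."""
    return set(SNIPPET_START.findall(source))


def missing_tags(metadata_tags: set, sources: list) -> set:
    """Return metadata snippet_tags that no source file declares."""
    declared = set()
    for text in sources:
        declared |= extract_snippet_tags(text)
    return metadata_tags - declared
```

Run over the repository, `missing_tags` would flag any metadata entry whose tag has no matching marker before the docs pipeline does.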

python/example_code/bedrock-runtime/README.md

Lines changed: 9 additions & 0 deletions
```diff
@@ -60,6 +60,7 @@ functions within the same service.
 
 - [Converse](models/amazon_nova/amazon_nova_text/converse.py#L4)
 - [ConverseStream](models/amazon_nova/amazon_nova_text/converse_stream.py#L4)
+- [Document understanding](models/amazon_nova/amazon_nova_text/document_understanding.py#L4)
 
 ### Amazon Nova Canvas
 
@@ -88,6 +89,7 @@ functions within the same service.
 
 - [Converse](models/anthropic_claude/converse.py#L4)
 - [ConverseStream](models/anthropic_claude/converse_stream.py#L4)
+- [Document understanding](models/anthropic_claude/document_understanding.py#L4)
 - [InvokeModel](models/anthropic_claude/invoke_model.py#L4)
 - [InvokeModelWithResponseStream](models/anthropic_claude/invoke_model_with_response_stream.py#L4)
 - [Scenario: Tool use with the Converse API](cross-model-scenarios/tool_use_demo/tool_use_demo.py)
@@ -96,23 +98,30 @@ functions within the same service.
 
 - [Converse](models/cohere_command/converse.py#L4)
 - [ConverseStream](models/cohere_command/converse_stream.py#L4)
+- [Document understanding](models/cohere_command/document_understanding.py#L4)
 - [InvokeModel: Command R and R+](models/cohere_command/command_r_invoke_model.py#L4)
 - [InvokeModel: Command and Command Light](models/cohere_command/command_invoke_model.py#L4)
 - [InvokeModelWithResponseStream: Command R and R+](models/cohere_command/command_r_invoke_model_with_response_stream.py#L4)
 - [InvokeModelWithResponseStream: Command and Command Light](models/cohere_command/command_invoke_model_with_response_stream.py#L4)
 - [Scenario: Tool use with the Converse API](cross-model-scenarios/tool_use_demo/tool_use_demo.py)
 
+### DeepSeek
+
+- [Document understanding](models/deepseek/document_understanding.py#L4)
+
 ### Meta Llama
 
 - [Converse](models/meta_llama/converse.py#L4)
 - [ConverseStream](models/meta_llama/converse_stream.py#L4)
+- [Document understanding](models/meta_llama/document_understanding.py#L4)
 - [InvokeModel](models/meta_llama/llama3_invoke_model.py#L4)
 - [InvokeModelWithResponseStream](models/meta_llama/llama3_invoke_model_with_response_stream.py#L4)
 
 ### Mistral AI
 
 - [Converse](models/mistral_ai/converse.py#L4)
 - [ConverseStream](models/mistral_ai/converse_stream.py#L4)
+- [Document understanding](models/mistral_ai/document_understanding.py#L4)
 - [InvokeModel](models/mistral_ai/invoke_model.py#L4)
 - [InvokeModelWithResponseStream](models/mistral_ai/invoke_model_with_response_stream.py#L4)
```
python/example_code/bedrock-runtime/example-data/amazon-nova-service-cards.pdf

Binary file not shown.
python/example_code/bedrock-runtime/models/amazon_nova/amazon_nova_text/document_understanding.py

Lines changed: 54 additions & 0 deletions

```python
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
# SPDX-License-Identifier: Apache-2.0

# snippet-start:[python.example_code.bedrock-runtime.DocumentUnderstanding_AmazonNovaText]
# Send and process a document with Amazon Nova on Amazon Bedrock.

import boto3
from botocore.exceptions import ClientError

# Create a Bedrock Runtime client in the AWS Region you want to use.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Set the model ID, e.g. Amazon Nova Lite.
model_id = "amazon.nova-lite-v1:0"

# Load the document
with open("example-data/amazon-nova-service-cards.pdf", "rb") as file:
    document_bytes = file.read()

# Start a conversation with a user message and the document
conversation = [
    {
        "role": "user",
        "content": [
            {"text": "Briefly compare the models described in this document"},
            {
                "document": {
                    # Available formats: html, md, pdf, doc/docx, xls/xlsx, csv, and txt
                    "format": "pdf",
                    "name": "Amazon Nova Service Cards",
                    "source": {"bytes": document_bytes},
                }
            },
        ],
    }
]

try:
    # Send the message to the model, using a basic inference configuration.
    response = client.converse(
        modelId=model_id,
        messages=conversation,
        inferenceConfig={"maxTokens": 500, "temperature": 0.3},
    )

    # Extract and print the response text.
    response_text = response["output"]["message"]["content"][0]["text"]
    print(response_text)

except (ClientError, Exception) as e:
    print(f"ERROR: Can't invoke '{model_id}'. Reason: {e}")
    exit(1)

# snippet-end:[python.example_code.bedrock-runtime.DocumentUnderstanding_AmazonNovaText]
```
python/example_code/bedrock-runtime/models/anthropic_claude/document_understanding.py

Lines changed: 54 additions & 0 deletions

```python
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
# SPDX-License-Identifier: Apache-2.0

# snippet-start:[python.example_code.bedrock-runtime.DocumentUnderstanding_AnthropicClaude]
# Send and process a document with Anthropic Claude on Amazon Bedrock.

import boto3
from botocore.exceptions import ClientError

# Create a Bedrock Runtime client in the AWS Region you want to use.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Set the model ID, e.g. Claude 3 Haiku.
model_id = "anthropic.claude-3-haiku-20240307-v1:0"

# Load the document
with open("example-data/amazon-nova-service-cards.pdf", "rb") as file:
    document_bytes = file.read()

# Start a conversation with a user message and the document
conversation = [
    {
        "role": "user",
        "content": [
            {"text": "Briefly compare the models described in this document"},
            {
                "document": {
                    # Available formats: html, md, pdf, doc/docx, xls/xlsx, csv, and txt
                    "format": "pdf",
                    "name": "Amazon Nova Service Cards",
                    "source": {"bytes": document_bytes},
                }
            },
        ],
    }
]

try:
    # Send the message to the model, using a basic inference configuration.
    response = client.converse(
        modelId=model_id,
        messages=conversation,
        inferenceConfig={"maxTokens": 500, "temperature": 0.3},
    )

    # Extract and print the response text.
    response_text = response["output"]["message"]["content"][0]["text"]
    print(response_text)

except (ClientError, Exception) as e:
    print(f"ERROR: Can't invoke '{model_id}'. Reason: {e}")
    exit(1)

# snippet-end:[python.example_code.bedrock-runtime.DocumentUnderstanding_AnthropicClaude]
```
python/example_code/bedrock-runtime/models/cohere_command/document_understanding.py

Lines changed: 54 additions & 0 deletions

```python
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
# SPDX-License-Identifier: Apache-2.0

# snippet-start:[python.example_code.bedrock-runtime.DocumentUnderstanding_CohereCommand]
# Send and process a document with Cohere Command models on Amazon Bedrock.

import boto3
from botocore.exceptions import ClientError

# Create a Bedrock Runtime client in the AWS Region you want to use.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Set the model ID, e.g. Command R+.
model_id = "cohere.command-r-plus-v1:0"

# Load the document
with open("example-data/amazon-nova-service-cards.pdf", "rb") as file:
    document_bytes = file.read()

# Start a conversation with a user message and the document
conversation = [
    {
        "role": "user",
        "content": [
            {"text": "Briefly compare the models described in this document"},
            {
                "document": {
                    # Available formats: html, md, pdf, doc/docx, xls/xlsx, csv, and txt
                    "format": "pdf",
                    "name": "Amazon Nova Service Cards",
                    "source": {"bytes": document_bytes},
                }
            },
        ],
    }
]

try:
    # Send the message to the model, using a basic inference configuration.
    response = client.converse(
        modelId=model_id,
        messages=conversation,
        inferenceConfig={"maxTokens": 500, "temperature": 0.3},
    )

    # Extract and print the response text.
    response_text = response["output"]["message"]["content"][0]["text"]
    print(response_text)

except (ClientError, Exception) as e:
    print(f"ERROR: Can't invoke '{model_id}'. Reason: {e}")
    exit(1)

# snippet-end:[python.example_code.bedrock-runtime.DocumentUnderstanding_CohereCommand]
```
python/example_code/bedrock-runtime/models/deepseek/document_understanding.py

Lines changed: 62 additions & 0 deletions

```python
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
# SPDX-License-Identifier: Apache-2.0

# snippet-start:[python.example_code.bedrock-runtime.DocumentUnderstanding_DeepSeek]
# Send and process a document with DeepSeek on Amazon Bedrock.

import boto3
from botocore.exceptions import ClientError

# Create a Bedrock Runtime client in the AWS Region you want to use.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Set the model ID, e.g. DeepSeek-R1
model_id = "us.deepseek.r1-v1:0"

# Load the document
with open("example-data/amazon-nova-service-cards.pdf", "rb") as file:
    document_bytes = file.read()

# Start a conversation with a user message and the document
conversation = [
    {
        "role": "user",
        "content": [
            {"text": "Briefly compare the models described in this document"},
            {
                "document": {
                    # Available formats: html, md, pdf, doc/docx, xls/xlsx, csv, and txt
                    "format": "pdf",
                    "name": "Amazon Nova Service Cards",
                    "source": {"bytes": document_bytes},
                }
            },
        ],
    }
]

try:
    # Send the message to the model, using a basic inference configuration.
    response = client.converse(
        modelId=model_id,
        messages=conversation,
        inferenceConfig={"maxTokens": 2000, "temperature": 0.3},
    )

    # Extract and print the reasoning and response text.
    reasoning, response_text = "", ""
    for item in response["output"]["message"]["content"]:
        for key, value in item.items():
            if key == "reasoningContent":
                reasoning = value["reasoningText"]["text"]
            elif key == "text":
                response_text = value

    print(f"\nReasoning:\n{reasoning}")
    print(f"\nResponse:\n{response_text}")

except (ClientError, Exception) as e:
    print(f"ERROR: Can't invoke '{model_id}'. Reason: {e}")
    exit(1)

# snippet-end:[python.example_code.bedrock-runtime.DocumentUnderstanding_DeepSeek]
```
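The per-model scripts in this commit differ only in the model ID, the token budget, and (for DeepSeek) the reasoning extraction; the Converse request shape is identical. As a sketch of how the shared pattern could be factored out (not part of this commit; the helper names are hypothetical):

```python
def build_document_message(prompt, document_bytes, name, fmt="pdf"):
    """Build a Converse API user message pairing a text prompt with a document."""
    return {
        "role": "user",
        "content": [
            {"text": prompt},
            {
                "document": {
                    # Available formats: html, md, pdf, doc/docx, xls/xlsx, csv, and txt
                    "format": fmt,
                    "name": name,
                    "source": {"bytes": document_bytes},
                }
            },
        ],
    }


def send_document(model_id, prompt, document_path, name, max_tokens=500):
    """Send a local document to a Converse-capable model and return the reply text."""
    # Imported here so build_document_message stays usable without the AWS SDK.
    import boto3
    from botocore.exceptions import ClientError

    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    with open(document_path, "rb") as f:
        document_bytes = f.read()
    try:
        response = client.converse(
            modelId=model_id,
            messages=[build_document_message(prompt, document_bytes, name)],
            inferenceConfig={"maxTokens": max_tokens, "temperature": 0.3},
        )
        return response["output"]["message"]["content"][0]["text"]
    except ClientError as e:
        raise RuntimeError(f"Can't invoke '{model_id}'") from e
```

With that helper, each model variant reduces to one call, e.g. `send_document("amazon.nova-lite-v1:0", "Briefly compare the models described in this document", "example-data/amazon-nova-service-cards.pdf", "Amazon Nova Service Cards")`. The repo keeps the scripts separate on purpose (each is an independently runnable, snippet-tagged example), so this is a trade-off, not a correction.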
