Commit 383ebea

Merge pull request #374 from Portkey-AI/fix/inference-profiles
fix policy statement for inference profiles
2 parents c259fd4 + 2ec85ef commit 383ebea

File tree

1 file changed: +8 −1 lines changed

integrations/llms/bedrock/aws-bedrock.mdx

Lines changed: 8 additions & 1 deletion
````diff
@@ -635,8 +635,12 @@ Note that you will have to set [`strict_open_ai_compliance=False`](/product/ai-g
 
 [Inference profiles](https://docs.aws.amazon.com/bedrock/latest/userguide/inference-profiles.html) are a resource in Amazon Bedrock that define a model and one or more Regions to which the inference profile can route model invocation requests.
 
-To use inference profiles, your IAM role needs to have the following permissions:
+To use inference profiles, your IAM role needs to additionally have the following permissions:
 ```json
+{
+  "Version": "2012-10-17",
+  "Statement": [
+    {
       "Effect": "Allow",
       "Action": [
         "bedrock:GetInferenceProfile"
@@ -645,6 +649,9 @@ To use inference profiles, your IAM role needs to have the following permissions
         "arn:aws:bedrock:*:*:inference-profile/*",
         "arn:aws:bedrock:*:*:application-inference-profile/*"
       ]
+    }
+  ]
+}
 ```
 This is a pre-requisite for using inference profiles, as the gateway needs to fetch the foundation model to process the request.
 
````
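Assembled into a single document, the corrected policy from this commit looks like the sketch below. The `Effect`, `Action`, and `Resource` values are taken from the diff; the `Resource` key name itself is implied by the surrounding diff context rather than shown, and the round-trip check is purely illustrative.

```python
import json

# Sketch of the complete IAM policy after this fix. Values for Effect/Action/
# Resource come from the diff; the "Resource" key name is an assumption based
# on standard IAM policy structure (the diff elides that line).
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["bedrock:GetInferenceProfile"],
            "Resource": [
                "arn:aws:bedrock:*:*:inference-profile/*",
                "arn:aws:bedrock:*:*:application-inference-profile/*",
            ],
        }
    ],
}

# Round-trip through JSON to confirm the document is well-formed.
serialized = json.dumps(policy, indent=2)
parsed = json.loads(serialized)
print(serialized)
```

The point of the fix is visible here: the original doc showed only the inner statement object, which on its own is not a valid policy document; wrapping it in the `Version`/`Statement` envelope makes it one.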

Comments (0)