Note that you will have to set [`strict_open_ai_compliance=False`](/product/ai-g)
```
</CodeGroup>
## Inference Profiles
[Inference profiles](https://docs.aws.amazon.com/bedrock/latest/userguide/inference-profiles.html) are a resource in Amazon Bedrock that defines a model and one or more Regions to which it can route model invocation requests.
To use inference profiles, your IAM role needs to have the following permissions:
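The exact permission list is not preserved in this excerpt. As a hedged sketch based on AWS's published guidance for inference profiles (verify the actions and resource ARNs against the AWS documentation for your account), a policy along these lines grants the needed access:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream",
        "bedrock:GetInferenceProfile",
        "bedrock:ListInferenceProfiles"
      ],
      "Resource": [
        "arn:aws:bedrock:*::foundation-model/*",
        "arn:aws:bedrock:*:*:inference-profile/*"
      ]
    }
  ]
}
```

Scoping `Resource` to the specific profile and foundation-model ARNs you use is preferable to the wildcards shown here.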
Portkey uses the [AWS Converse API](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_Converse.html) internally for making chat completions requests.
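Once the permissions are in place, an inference profile is typically used by passing its ID (or ARN) where a model ID would normally go. The sketch below is illustrative and not from the original docs: the cross-region profile ID, environment variable names, and virtual key are placeholder assumptions, and it assumes the `portkey-ai` Python SDK is installed.

```python
import os

# Example cross-region inference profile ID (the "us." prefix denotes a
# profile that routes across US Regions); substitute your own profile
# ID or ARN. This value and the env var names below are placeholders.
INFERENCE_PROFILE_ID = "us.anthropic.claude-3-5-sonnet-20240620-v1:0"

if os.environ.get("PORTKEY_API_KEY"):
    from portkey_ai import Portkey  # assumes the portkey-ai SDK is installed

    portkey = Portkey(
        api_key=os.environ["PORTKEY_API_KEY"],
        virtual_key=os.environ.get("BEDROCK_VIRTUAL_KEY"),
    )
    # Pass the inference profile ID where a model ID would normally go.
    response = portkey.chat.completions.create(
        model=INFERENCE_PROFILE_ID,
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(response.choices[0].message.content)
```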