@@ -1,9 +1,9 @@
 ---
-title: Deploying your OpenAI Application to AWS Bedrock
+title: Deploy OpenAI Apps to AWS Bedrock
 sidebar_position: 50
 ---
 
-# Deploying Your OpenAI Application to AWS Bedrock
+# Deploy OpenAI Apps to AWS Bedrock
 
 Let's assume you have an app that uses an OpenAI client library and you want to deploy it to the cloud on **AWS Bedrock**.
 
@@ -50,7 +50,7 @@ Add **Defang's [openai-access-gateway](https://github.com/DefangLabs/openai-acce
 - The container image is based on [aws-samples/bedrock-access-gateway](https://github.com/aws-samples/bedrock-access-gateway), with enhancements.
 - `x-defang-llm: true` signals to **Defang** that this service should be configured to use target platform AI services.
 - New environment variables:
-  - `REGION` is the zone where the services runs (for AWS this is the equvilent of AWS_REGION)
+  - `REGION` is the zone where the service runs (for AWS, this is the equivalent of `AWS_REGION`)
 
 :::tip
 **OpenAI Key**
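The bullets above describe fields of a compose service for the gateway. A minimal sketch of what such a service block might look like (the service name, image tag, and values are illustrative assumptions, not taken from this diff):

```yaml
# Hypothetical compose.yaml fragment -- names and values are illustrative.
services:
  llm:                                      # service name: assumed
    image: defangio/openai-access-gateway   # image: assumed from the repo name
    x-defang-llm: true                      # lets Defang wire up the platform's AI services
    environment:
      - REGION=us-west-2                    # equivalent of AWS_REGION
      - MODEL=anthropic.claude-3-sonnet-20240229-v1:0
```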
@@ -110,9 +110,9 @@ Choose the correct `MODEL` depending on which cloud provider you are using.
 - For **AWS Bedrock**, use a Bedrock model ID (e.g., `anthropic.claude-3-sonnet-20240229-v1:0`). [See available Bedrock models](https://docs.aws.amazon.com/bedrock/latest/userguide/models-supported.html).
 :::
 
-Alternatively, Defang supports model mapping through the openai-access-gateway. This takes a model with a Docker naming convention (e.g. `ai/lama3.3`) and maps it to
-the closest equilavent on the target platform. If no such match can be found a fallback can be defined to use a known existing model (e.g. ai/mistral). These environment
-variables are USE_MODEL_MAPPING (default to true) and FALLBACK_MODEL (no default), respectively.
+Alternatively, Defang supports model mapping through the [openai-access-gateway](https://github.com/DefangLabs/openai-access-gateway). This takes a model with a Docker naming convention (e.g. `ai/llama3.3`) and maps it to
+the closest equivalent on the target platform. If no such match can be found, a fallback can be defined to use a known existing model (e.g. `ai/mistral`). These environment
+variables are `USE_MODEL_MAPPING` (defaults to `true`) and `FALLBACK_MODEL` (no default), respectively.
 
 
 :::info
@@ -151,7 +151,7 @@ services:
 | Variable | AWS Bedrock |
 |--------------------|-------------|
 | `REGION`           | Required    |
-| `MODEL`            | Bedrock model ID or Docker model name, for example `meta.llama3-3-70b-instruct-v1:0` or `ai/lama3.3` |
+| `MODEL`            | Bedrock model ID or Docker model name, for example `meta.llama3-3-70b-instruct-v1:0` or `ai/llama3.3` |
 
 ---
 