
Commit f69cbcd

edit readme
1 parent c761ead commit f69cbcd

1 file changed (+3 -1 lines changed)

samples/managed-llm/README.md

Lines changed: 3 additions & 1 deletion
@@ -10,6 +10,8 @@ You can configure the `MODEL` and `ENDPOINT_URL` for the LLM separately for loca
 * The `MODEL` is the LLM Model ID you are using.
 * The `ENDPOINT_URL` is the bridge that provides authenticated access to the LLM model.
 
+Ensure you have the necessary permissions to access the model you intend to use. For example, if you are using AWS Bedrock, verify that your account has [model access](https://docs.aws.amazon.com/bedrock/latest/userguide/model-access-modify.html).
+
 ### Defang OpenAI Access Gateway
 
 In the `compose.yaml` file, the `llm` service is used to route requests to the LLM API model. This is known as the Defang OpenAI Access Gateway.
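
The added paragraph points readers at AWS Bedrock model access. As a rough sketch (not part of the commit), one way to confirm from a shell that the example model ID is even listed in your region is the AWS CLI; the region and the JMESPath filter below are assumptions, and the access grant itself is still managed per the linked Bedrock documentation.

```bash
# Sketch only: list the Bedrock catalog entry for the example model ID.
# Assumes the AWS CLI is configured; us-east-1 is just an example region.
# Access *grants* are still managed in Bedrock per the linked docs;
# this only shows whether the model appears in the region's catalog.
aws bedrock list-foundation-models \
  --region us-east-1 \
  --query "modelSummaries[?modelId=='anthropic.claude-3-haiku-20240307-v1:0']"
```
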
@@ -37,7 +39,7 @@ For this sample, you will need to provide the following [configuration](https://
 > Note that if you are using the 1-click deploy option, you can set these values as secrets in your GitHub repository and the action will automatically deploy them for you.
 
 ### `MODEL`
-The Model ID of the LLM you are using for your application. For example, `anthropic.claude-3-5-haiku-20241022-v1:0`.
+The Model ID of the LLM you are using for your application. For example, `anthropic.claude-3-haiku-20240307-v1:0`.
 ```bash
 defang config set MODEL
 ```
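
For context, a minimal sketch of supplying this configuration with the Defang CLI follows. It assumes the `NAME=VALUE` form is accepted (only the interactive `defang config set MODEL` form appears in the hunk above), that `ENDPOINT_URL` is set the same way (not shown in this diff), and that `defang config ls` is available to list values.

```bash
# Minimal sketch, assuming `defang config set` accepts NAME=VALUE;
# otherwise run `defang config set MODEL` and enter the value when prompted.
defang config set MODEL=anthropic.claude-3-haiku-20240307-v1:0

# Assumption: ENDPOINT_URL is configured the same way (not shown in this hunk).
defang config set ENDPOINT_URL=<your-gateway-endpoint-url>

# Assumption: `defang config ls` lists the values that have been set.
defang config ls
```
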
