
Commit 7997f63

edit readme
1 parent ac78978 commit 7997f63


1 file changed (+1, −1)


samples/managed-llm/README.md

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ This sample application demonstrates the use of OpenAI-compatible Managed LLMs (
 
 > Note: Using Docker Model Provider? See our [*Managed LLM with Docker Model Provider*](https://github.com/DefangLabs/samples/tree/main/samples/managed-llm-provider) sample.
 
-The OpenAI-compatible managed LLM feature, provided by the Defang OpenAI Access Gateway, allows users to use AWS Bedrock or Google Cloud Vertex AI with an OpenAI compatible SDK. This enables switching from OpenAI to one of these cloud-native platforms without modifying your application code.
+Using the Defang OpenAI Access Gateway, the Managed LLM feature `x-defang-llm: true` allows users to use AWS Bedrock or Google Cloud Vertex AI models with an OpenAI-compatible SDK. This enables switching from OpenAI to one of these cloud-native platforms without modifying your application code. This feature is available in the Defang Playground and Defang BYOC.
 
 You can configure the `MODEL` and `ENDPOINT_URL` for the LLM separately for local development and production environments.
 * The `MODEL` is the LLM Model ID you are using.
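For reference, a minimal sketch of how a compose service might opt into this feature, assuming a hypothetical service named `app`, a placeholder image, and placeholder values for the `MODEL` and `ENDPOINT_URL` environment variables mentioned in the diff above; the sample's actual `compose.yaml` may differ.

```yaml
# Hypothetical sketch, not the sample's actual compose file.
services:
  app:
    image: example/app:latest   # placeholder image
    x-defang-llm: true          # route this service's LLM calls through the Defang OpenAI Access Gateway
    environment:
      MODEL: your-model-id                      # LLM Model ID the application uses
      ENDPOINT_URL: https://example.invalid/v1  # OpenAI-compatible endpoint the app targets
```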
