
Commit 7013021

reword intro paragraph
1 parent 2c935ff commit 7013021

1 file changed: +2 −2 lines changed

samples/managed-llm/README.md

Lines changed: 2 additions & 2 deletions
@@ -7,9 +7,9 @@ This sample application demonstrates the use of OpenAI-compatible Managed LLMs (
 
 > Note: Using Docker Model Provider? See our [*Managed LLM with Docker Model Provider*](https://github.com/DefangLabs/samples/tree/main/samples/managed-llm-provider) sample.
 
-Using the Defang OpenAI Access Gateway, the Managed LLM feature `x-defang-llm: true` allows users to use AWS Bedrock or Google Cloud Vertex AI models with an OpenAI-compatible SDK.
+Using the [Defang OpenAI Access Gateway](#defang-openai-access-gateway), the feature `x-defang-llm: true` enables you to use Managed LLMs on the Defang Playground or with BYOC providers (such as AWS Bedrock or GCP Vertex AI) with an OpenAI-compatible SDK.
 
-This enables switching from OpenAI to one of these cloud-native platforms without modifying your application code. This feature is available in the Defang Playground and Defang BYOC.
+This allows switching from OpenAI to the Managed LLMs on supported cloud platforms without modifying your application code.
 
 You can configure the `MODEL` and `ENDPOINT_URL` for the LLM separately for local development and production environments.
 * The `MODEL` is the LLM Model ID you are using.
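For context on the terms this diff touches, here is a minimal, hypothetical compose sketch of how `x-defang-llm: true`, `MODEL`, and `ENDPOINT_URL` could fit together. The service name, flag placement, and placeholder values are assumptions for illustration, not taken from the sample's actual compose file:

```yaml
# Hypothetical sketch only -- not the sample's actual compose.yaml.
# `x-defang-llm: true` marks a service for Defang's OpenAI Access Gateway;
# MODEL and ENDPOINT_URL are the variables the README says you configure
# separately for local development and production.
services:
  app:                                        # assumed service name
    build:
      context: .
    x-defang-llm: true
    environment:
      MODEL: "<your-model-id>"                # e.g. a Bedrock or Vertex AI model ID
      ENDPOINT_URL: "<openai-compatible-endpoint-url>"
    ports:
      - "3000:3000"
```

Presumably, pointing the same two variables at a local endpoint during development and at the gateway-backed cloud model in production is what lets the application code stay unchanged, as the reworded intro describes.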
