
Commit b382aca

llm gateway updates
1 parent a1ff99f commit b382aca

File tree

2 files changed: +11 −11 lines


docs/concepts/managed-llms/managed-language-models.md

Lines changed: 9 additions & 9 deletions

@@ -8,6 +8,15 @@ sidebar_position: 3000

 Each cloud provider offers its own managed Large Language Model service. AWS offers Bedrock, GCP offers Vertex AI, and DigitalOcean offers its GenAI platform. Defang makes it easy to leverage these services in your projects.

+## Current Support
+
+| Provider | Managed Language Models |
+| --- | --- |
+| [Playground](/docs/providers/playground#managed-large-language-models) ||
+| [AWS Bedrock](/docs/providers/aws#managed-large-language-models) ||
+| [DigitalOcean GenAI](/docs/providers/digitalocean#future-improvements) ||
+| [GCP Vertex AI](/docs/providers/gcp#managed-large-language-models) ||
+
 ## Usage

 To leverage cloud-native managed language models from your Defang services, all you need to do is add the `x-defang-llm` extension to the service config; Defang will configure the appropriate roles and permissions for you.
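The usage described above can be sketched in a compose file. This is a minimal, hedged example: the service name and image are hypothetical, and only `x-defang-llm` comes from the docs being changed here.

```yaml
services:
  app:
    # Hypothetical web service that calls the provider's managed LLM via its SDK
    image: example/llm-app:latest
    ports:
      - "8080:8080"
    # This extension tells Defang to configure the appropriate
    # roles and permissions for the cloud's managed LLM service.
    x-defang-llm: true
```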
@@ -27,12 +36,3 @@ Assume you have a web service like the following, which uses the cloud-native SDK

 ## Deploying OpenAI-compatible apps

 If you already have an OpenAI-compatible application, Defang makes it easy to deploy it on your favourite cloud's managed LLM service. See our [OpenAI Access Gateway](/docs/concepts/managed-llms/openai-access-gateway).
-
-## Current Support
-
-| Provider | Managed Language Models |
-| --- | --- |
-| [Playground](/docs/providers/playground#managed-large-language-models) ||
-| [AWS Bedrock](/docs/providers/aws#managed-large-language-models) ||
-| [DigitalOcean GenAI](/docs/providers/digitalocean#future-improvements) ||
-| [GCP Vertex AI](/docs/providers/gcp#managed-large-language-models) ||

docs/tutorials/deploying-openai-apps-aws-bedrock-gcp-vertex.mdx

Lines changed: 2 additions & 2 deletions

@@ -3,7 +3,7 @@ title: Deploying your OpenAI Application to AWS Bedrock or GCP Vertex AI
 sidebar_position: 50
 ---

-# Deploying your OpenAI Application to AWS Bedrock or GCP Vertex AI
+# Deploying Your OpenAI Application to AWS Bedrock or GCP Vertex AI

 Let's assume you have an app that uses an OpenAI client library and you want to deploy it to the cloud, either on **AWS Bedrock** or **GCP Vertex AI**.

@@ -154,7 +154,7 @@ services:
 |--------------------|-------------|---------------|
 | `GCP_PROJECT_ID` | _(not used)_ | Required |
 | `REGION` | Required | Required |
-| `MODEL` | Bedrock model ID | Vertex model path |
+| `MODEL` | Bedrock model ID / Docker model name | Vertex model / Docker model name |

 ---
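Based on the variable table in this hunk, the gateway's environment might look like the following sketch. All values are placeholders, not real model IDs; only the variable names (`GCP_PROJECT_ID`, `REGION`, `MODEL`) come from the table.

```yaml
services:
  gateway:
    # OpenAI Access Gateway (service name hypothetical)
    environment:
      # AWS Bedrock target:
      REGION: us-west-2                  # required for both targets
      MODEL: <bedrock-model-id>          # placeholder: Bedrock model ID or Docker model name
      # For GCP Vertex AI, you would instead set:
      # GCP_PROJECT_ID: <your-project>   # required on GCP, not used on AWS
      # REGION: us-central1
      # MODEL: <vertex-model-or-docker-model-name>
```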
