
Commit 7778339

merge updates to links
1 parent f752b41 commit 7778339

File tree

6 files changed: +17 -25 lines changed


blog/2025-04-11-mar-product-updates.md

Lines changed: 1 addition & 1 deletion
@@ -25,7 +25,7 @@ Wow - another month has gone by, time flies when you're having fun!
 
 Let us share some important updates regarding what we achieved at Defang in March:
 
-**Managed LLMs:** One of the coolest features we have released in a bit is [support for Managed LLMs (such as AWS Bedrock) through the `x-defang-llm` compose service extension](https://docs.defang.io/docs/concepts/managed-llms/managed-language-models). When coupled with the `defang/openai-access-gateway` service image, Defang offers the easiest way to [migrate your OpenAI-compatible application to cloud-native managed LLMs](https://docs.defang.io/docs/tutorials/deploying-openai-apps) without making any changes to your code. Support for GCP and DigitalOcean coming soon.
+**Managed LLMs:** One of the coolest features we have released in a bit is [support for Managed LLMs (such as AWS Bedrock) through the `x-defang-llm` compose service extension](https://docs.defang.io/docs/concepts/managed-llms/managed-language-models). When coupled with the `defang/openai-access-gateway` service image, Defang offers the easiest way to [migrate your OpenAI-compatible application to cloud-native managed LLMs](https://docs.defang.io/docs/tutorials/deploy-openai-apps) without making any changes to your code. Support for GCP and DigitalOcean coming soon.
 
 **Defang Pulumi Provider:** Last month, we announced a preview of the [Defang Pulumi Provider](https://github.com/DefangLabs/pulumi-defang), and this month we are excited to announce that V1 is now available in the [Pulumi Registry](https://www.pulumi.com/registry/packages/defang/). As much as we love Docker, we realize there are many real-world apps that have components that (currently) cannot be described completely in a Compose file. With the Defang Pulumi Provider, you can now leverage [the declarative simplicity of Defang with the imperative power of Pulumi](https://docs.defang.io/docs/concepts/pulumi#when-to-use-the-defang-pulumi-provider).
 
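The blog paragraph above refers to the `x-defang-llm` compose service extension together with the `defang/openai-access-gateway` image. As a minimal sketch of what that looks like in a Compose file (the service name and the `true` value are assumptions based on the linked managed-LLMs docs, not part of this commit):

```yaml
# Hypothetical sketch of the x-defang-llm compose service extension.
services:
  llm:
    # The gateway image named in the blog post above.
    image: defang/openai-access-gateway
    # Service-level extension asking Defang to provision the managed
    # LLM integration (e.g. AWS Bedrock) for this service.
    x-defang-llm: true
```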

docs/concepts/managed-llms/openai-access-gateway.md

Lines changed: 1 addition & 1 deletion
@@ -9,7 +9,7 @@ sidebar_position: 3000
 Defang makes it easy to deploy on your favourite cloud's managed LLM service with our [OpenAI Access Gateway](https://github.com/DefangLabs/openai-access-gateway). This service sits between your application and the cloud service and acts as a compatibility layer.
 It handles incoming OpenAI requests, translates those requests to the appropriate cloud-native API, handles the native response, and re-constructs an OpenAI-compatible response.
 
-See [our tutorial](/docs/tutorials/deploying-openai-apps) which describes how to configure the OpenAI Access Gateway for your application
+See [our tutorial](/docs/tutorials/deploy-openai-apps) which describes how to configure the OpenAI Access Gateway for your application
 
 ## Docker Provider Services
 
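Because the gateway exposes an OpenAI-compatible surface, the usual wiring is to point the application's OpenAI base URL at the gateway service rather than at api.openai.com. A rough sketch, assuming the app honours the standard `OPENAI_BASE_URL` variable and that the gateway answers on an `/api/v1/` path of a service named `llm` (both assumptions; the linked tutorial has the exact values):

```yaml
# Hypothetical app-side wiring; the exact URL, path, and port depend on
# how the openai-access-gateway service is configured in your project.
services:
  app:
    build: ./app
    environment:
      # Point the OpenAI client at the gateway instead of api.openai.com;
      # most OpenAI SDKs let you override the base URL (often via OPENAI_BASE_URL),
      # so requests reach the gateway without any code changes.
      OPENAI_BASE_URL: http://llm/api/v1/
    depends_on:
      - llm
```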

File renamed without changes.

docs/tutorials/deploying-openai-apps-gcp-vertex.mdx renamed to docs/tutorials/deploy-openai-apps-gcp-vertex.mdx

Lines changed: 0 additions & 8 deletions
@@ -112,12 +112,7 @@ To do this, you can check your [AWS Bedrock model access](https://docs.aws.amazo
 :::info
 **Choosing the Right Model**
 
-<<<<<<< HEAD:docs/tutorials/deploying-openai-apps-aws-bedrock-gcp-vertex.mdx
-- For **AWS Bedrock**, use a Bedrock model ID (e.g., `anthropic.claude-3-sonnet-20240229-v1:0`) [See available Bedrock models](https://docs.aws.amazon.com/bedrock/latest/userguide/models-supported.html).
-- For **GCP Vertex AI**, use a full model path (e.g., `google/gemini-2.5-pro-preview-03-25`) [See available Vertex models](https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/call-vertex-using-openai-library#client-setup).
-=======
 - For **GCP Vertex AI**, use a full model path (e.g., `google/gemini-2.5-pro-preview-03-25`) [See available Vertex models](https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/call-vertex-using-openai-library#client-setup)
->>>>>>> eric-update-for-play-ground:docs/tutorials/deploying-openai-apps-gcp-vertex.mdx
 :::
 
 Alternatively, Defang supports model mapping through the openai-access-gateway. This takes a model with a Docker naming convention (e.g. ai/lama3.3) and maps it to

@@ -172,8 +167,5 @@ You now have a single app that can:
 - Talk to **GCP Vertex AI**
 - Use the same OpenAI-compatible client code
 - Easily switch cloud providers by changing a few environment variables
-<<<<<<< HEAD:docs/tutorials/deploying-openai-apps-aws-bedrock-gcp-vertex.mdx
-=======
 :::
 
->>>>>>> eric-update-for-play-ground:docs/tutorials/deploying-openai-apps-gcp-vertex.mdx
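The last bullet in that diff, switching providers via environment variables, amounts to swapping the model identifier without touching client code. A hedged sketch follows; the `MODEL` variable name is purely illustrative, while the two example model identifiers come from the tutorial text quoted above:

```yaml
# Illustrative only: the MODEL variable name is an assumption; the two
# model identifiers are the ones given in the tutorial text above.
services:
  app:
    build: ./app
    environment:
      # AWS Bedrock: a Bedrock model ID
      # MODEL: anthropic.claude-3-sonnet-20240229-v1:0
      # GCP Vertex AI: a full model path
      MODEL: google/gemini-2.5-pro-preview-03-25
```
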
Lines changed: 15 additions & 0 deletions
@@ -0,0 +1,15 @@
+---
+title: Deploy your OpenAI Apps
+sidebar_position: 45
+---
+
+# Deploy Your OpenAI Apps
+
+Defang currently supports managed LLMs using AWS Bedrock and GCP Vertex AI. Follow the link below for your specific platform.
+
+- [AWS Bedrock](/docs/tutorials/deploy-openai-apps-aws-bedrock/)
+- [GCP Vertex AI](/docs/tutorials/deploy-openai-apps-gcp-vertex/)
+
+
+
+

docs/tutorials/deploying-openai-apps.mdx

Lines changed: 0 additions & 15 deletions
This file was deleted.

0 commit comments
