diff --git a/blog/2025-04-11-mar-product-updates.md b/blog/2025-04-11-mar-product-updates.md
index 1aa7727d7..6acd1afa0 100644
--- a/blog/2025-04-11-mar-product-updates.md
+++ b/blog/2025-04-11-mar-product-updates.md
@@ -26,7 +26,7 @@
 
 Wow - another month has gone by, time flies when you're having fun! Let us share some important updates regarding what we achieved at Defang in March:
 
-**Managed LLMs:** One of the coolest features we have released in a bit is [support for Managed LLMs (such as AWS Bedrock) through the `x-defang-llm` compose service extension](https://docs.defang.io/docs/concepts/managed-llms/managed-language-models). When coupled with the `defang/openai-access-gateway` service image, Defang offers the easiest way to [migrate your OpenAI-compatible application to cloud-native managed LLMs](https://docs.defang.io/docs/tutorials/deploying-openai-apps-aws-bedrock) without making any changes to your code. Support for GCP and DigitalOcean coming soon.
+**Managed LLMs:** One of the coolest features we have released in a bit is [support for Managed LLMs (such as AWS Bedrock) through the `x-defang-llm` compose service extension](https://docs.defang.io/docs/concepts/managed-llms/managed-language-models). When coupled with the `defang/openai-access-gateway` service image, Defang offers the easiest way to [migrate your OpenAI-compatible application to cloud-native managed LLMs](https://docs.defang.io/docs/tutorials/deploying-openai-apps) without making any changes to your code. Support for GCP and DigitalOcean coming soon. **[Update: April 30, 2025 - GCP support is now enabled!]**
 
 **Defang Pulumi Provider:** Last month, we announced a preview of the [Defang Pulumi Provider](https://github.com/DefangLabs/pulumi-defang), and this month we are excited to announce that V1 is now available in the [Pulumi Registry](https://www.pulumi.com/registry/packages/defang/). As much as we love Docker, we realize there are many real-world apps that have components that (currently) cannot be described completely in a Compose file. With the Defang Pulumi Provider, you can now leverage [the declarative simplicity of Defang with the imperative power of Pulumi](https://docs.defang.io/docs/concepts/pulumi#when-to-use-the-defang-pulumi-provider).
 
diff --git a/docs/concepts/managed-llms/openai-access-gateway.md b/docs/concepts/managed-llms/openai-access-gateway.md
index e6c940e5c..f9be0eccd 100644
--- a/docs/concepts/managed-llms/openai-access-gateway.md
+++ b/docs/concepts/managed-llms/openai-access-gateway.md
@@ -8,7 +8,7 @@ sidebar_position: 3000
 
 Defang makes it easy to deploy on your favourite cloud's managed LLM service with our [OpenAI Access Gateway](https://github.com/DefangLabs/openai-access-gateway). This service sits between your application and the cloud service and acts as a compatibility layer. It handles incoming OpenAI requests, translates those requests to the appropriate cloud-native API, handles the native response, and re-constructs an OpenAI-compatible response.
 
-See [our tutorial](/docs/tutorials/deploying-openai-apps-aws-bedrock-gcp-vertex/) which describes how to configure the OpenAI Access Gateway for your application
+See [our tutorial](/docs/tutorials/deploying-openai-apps/) which describes how to configure the OpenAI Access Gateway for your application
 
 ## Current Support
 
diff --git a/docs/tutorials/deploying-openai-apps-aws-bedrock-gcp-vertex.mdx b/docs/tutorials/deploying-openai-apps.mdx
similarity index 100%
rename from docs/tutorials/deploying-openai-apps-aws-bedrock-gcp-vertex.mdx
rename to docs/tutorials/deploying-openai-apps.mdx
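For context on what the relinked tutorial and docs page describe: the `x-defang-llm` compose extension and the `defang/openai-access-gateway` image are combined in a Compose file so the application keeps speaking the OpenAI API while the gateway translates requests to the cloud-native managed LLM. The snippet below is a minimal sketch of that wiring, not taken from the tutorial itself; only the image name and the `x-defang-llm` extension come from the text above, while the service names, port, environment variables, and URL path are illustrative assumptions.

```yaml
# Hypothetical sketch of a Compose setup using the OpenAI Access Gateway.
# Only `defang/openai-access-gateway` and `x-defang-llm` come from the diff above;
# service names, the port, MODEL, and the base-URL path are assumptions.
services:
  llm:
    image: defang/openai-access-gateway   # compatibility layer: OpenAI requests in, cloud-native LLM API out
    x-defang-llm: true                    # marks the service for Defang's managed LLM handling
    ports:
      - "8000:80"                         # assumed container port; see the tutorial for the real value
    environment:
      - MODEL                             # assumed: model ID forwarded to the managed LLM service

  app:
    build: ./app
    depends_on:
      - llm
    environment:
      # assumed: the unmodified OpenAI-compatible app is simply pointed at the gateway
      - OPENAI_BASE_URL=http://llm:8000/api/v1
```

Either way, the point made in the blog entry and the docs page stands: the application code is unchanged, and the gateway service does the cloud-specific translation.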