Commit 0f7bfed

Merge pull request #3691 from santiagxf/santiagxf/deepseek-guide
Improvements to DeepSeek-R1 guidelines
2 parents ae8c38a + 36e0315 commit 0f7bfed

File tree

2 files changed: +10 additions, −6 deletions

Binary image file (812 KB)
articles/ai-foundry/model-inference/tutorials/get-started-deepseek-r1.md

Lines changed: 10 additions & 6 deletions
@@ -35,15 +35,19 @@ Azure AI model inference is a capability in Azure AI Services resources in Azure

 :::image type="content" source="../media/quickstart-get-started-deepseek-r1/resources-architecture.png" alt-text="A diagram showing the high level architecture of the resources created in the tutorial." lightbox="../media/quickstart-get-started-deepseek-r1/resources-architecture.png":::

-To create an Azure AI project that supports model inference for DeepSeek-R1, follow these steps:
-
-> [!TIP]
-> You can also create the resources using [Azure CLI](../how-to/quickstart-create-resources.md?pivots=programming-language-cli) or [infrastructure as code with Bicep](../how-to/quickstart-create-resources.md?pivots=programming-language-bicep).
+To create an Azure AI project that supports model inference for DeepSeek-R1, follow these steps. You can also create the resources using [Azure CLI](../how-to/quickstart-create-resources.md?pivots=programming-language-cli) or [infrastructure as code with Bicep](../how-to/quickstart-create-resources.md?pivots=programming-language-bicep).

 1. Go to [Azure AI Foundry portal](https://ai.azure.com) and log in with your account.

 2. On the landing page, select **Create project**.

+> [!TIP]
+> **Are you using Azure OpenAI service?** When you are connected to Azure AI Foundry portal using an Azure OpenAI service resource, only Azure OpenAI models show up in the catalog. To view the full list of models, including DeepSeek-R1, use the top **Announcements** section and locate the card with the option **Explore more models**.
+>
+> :::image type="content" source="../media/quickstart-get-started-deepseek-r1/explore-more-models.png" alt-text="Screenshot showing the card with the option to explore all the models from the catalog." lightbox="../media/quickstart-get-started-deepseek-r1/explore-more-models.png":::
+>
+> A new window shows up with the full list of models. Select **DeepSeek-R1** from the list and select **Deploy**. The wizard asks to create a new project.
+
 3. Give the project a name, for example "my-project".

 4. In this tutorial, we create a brand new project under a new AI hub, hence, select **Create new hub**. Hubs are containers for multiple projects and allow you to share resources across all the projects.
@@ -135,7 +139,7 @@ You can use the Azure AI Inference package to consume the model in code:

 [!INCLUDE [code-chat-reasoning](../includes/code-create-chat-reasoning.md)]

-Reasoning may generate longer responses and consume a larger amount of tokens. You can see the [rate limits](../quotas-limits.md) that apply to DeepSeek-R1 models. Consider having a retry strategy to handle rate limits being applied. You can also [request increases to the default limits](../quotas-limits.md#request-increases-to-the-default-limits).
+Reasoning may generate longer responses and consume a larger number of tokens. You can see the [rate limits](../quotas-limits.md) that apply to DeepSeek-R1 models. Consider having a retry strategy to handle rate limits being applied. You can also [request increases to the default limits](../quotas-limits.md#request-increases-to-the-default-limits).

 ### Reasoning content

@@ -184,4 +188,4 @@ In general, reasoning models don't support the following parameters you can find

 * [Use chat reasoning models](../how-to/use-chat-reasoning.md)
 * [Use image embedding models](../how-to/use-image-embeddings.md)
-* [Azure AI Model Inference API](.././reference/reference-model-inference-api.md)
+* [Azure AI Model Inference API](.././reference/reference-model-inference-api.md)
