Commit 75f3a0b

fix
1 parent 8237e51 commit 75f3a0b

File tree

1 file changed: +11 −2 lines changed


articles/ai-foundry/model-inference/tutorials/get-started-deepseek-r1.md

Lines changed: 11 additions & 2 deletions
@@ -37,6 +37,9 @@ Azure AI model inference is a capability in Azure AI Services resources in Azure

 To create an Azure AI project that supports model inference for DeepSeek-R1, follow these steps:

+> [!TIP]
+> You can also create the resources using [Azure CLI](../how-to/quickstart-create-resources.md?pivots=programming-language-cli) or [infrastructure as code with Bicep](../how-to/quickstart-create-resources.md?pivots=programming-language-bicep).
+
 1. Go to [Azure AI Foundry portal](https://ai.azure.com) and log in with your account.

 2. On the landing page, select **Create project**.
@@ -86,7 +89,7 @@ Let's now create a new model deployment for DeepSeek-R1:

 5. The wizard shows the model's terms and conditions for DeepSeek-R1, which is offered as a Microsoft first party consumption service. You can review our privacy and security commitments under [Data, privacy, and Security](../../../ai-studio/how-to/concept-data-privacy.md).

 > [!TIP]
-> You can also review the pricing details for the model by seeing [Pricing and terms](https://aka.ms/DeepSeekPricing) tab.
+> Review the pricing details for the model by selecting [Pricing and terms](https://aka.ms/DeepSeekPricing).

 6. Accept the terms on those cases by selecting **Subscribe and deploy**.

@@ -96,7 +99,7 @@ Let's now create a new model deployment for DeepSeek-R1:

 8. We automatically select an Azure AI Services connection depending on your project. Use the **Customize** option to change the connection based on your needs. DeepSeek-R1 is currently offered under the **Global Standard** deployment type which offers higher throughput and performance.

-9. Select **Deploy**.
+9. Select **Deploy**.

 :::image type="content" source="../media/quickstart-get-started-deepseek-r1/model-deploy.png" alt-text="Screenshot showing how to deploy the model." lightbox="../media/quickstart-get-started-deepseek-r1/model-deploy.png":::

@@ -122,6 +125,12 @@ You can get started by using the model in the playground to have an idea of the

 ## Use the model in code

+Use the Azure AI model inference endpoint and credentials to connect to the model:
+
+:::image type="content" source="../media/overview/overview-endpoint-and-key.png" alt-text="Screenshot showing how to get the URL and key associated with the resource." lightbox="../media/overview/overview-endpoint-and-key.png":::
+
+You can use the Azure AI Inference package to consume the model in code:
+
 [!INCLUDE [code-create-chat-client](../includes/code-create-chat-client.md)]

 [!INCLUDE [code-chat-reasoning](../includes/code-create-chat-reasoning.md)]
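The included snippets carry the full client code, which is not reproduced in this diff. As a rough, stdlib-only sketch of what that consumption looks like, the example below builds the chat-completions REST request that an inference client would send to the endpoint shown above. The endpoint URL, key, and `api-version` value are placeholders and assumptions, not values from this commit; substitute the URL and key from your own resource's overview page.

```python
import json
import urllib.request

# Placeholders: copy the real values from your resource's overview page.
ENDPOINT = "https://my-resource.services.ai.azure.com/models"
API_KEY = "<your-api-key>"

def build_chat_request(endpoint, key, model, messages):
    """Build a chat-completions POST request for the model inference endpoint.

    The api-version query parameter is an assumption for illustration;
    check the current value in the Azure AI model inference docs.
    """
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url=f"{endpoint}/chat/completions?api-version=2024-05-01-preview",
        data=body,
        headers={"Content-Type": "application/json", "api-key": key},
        method="POST",
    )

req = build_chat_request(
    ENDPOINT,
    API_KEY,
    "DeepSeek-R1",
    [{"role": "user", "content": "How many languages are in the world?"}],
)

# Sending the request requires a live deployment, so it is left commented out:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The Azure AI Inference package wraps this same call in a typed client, so in practice you would use the included snippets rather than raw HTTP; the sketch only shows where the endpoint and key from the screenshot end up.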
