articles/ai-studio/how-to/deploy-models-serverless.md (3 additions, 3 deletions)
@@ -117,7 +117,7 @@ For non-Microsoft models offered through the Azure Marketplace, you can deploy t

# [AI Studio](#tab/azure-ai-studio)

-1. On the model's **Details** page, select **Deploy** and then select **Serverless API with Azure AI Content Safety** to open the deployment wizard.
+1. On the model's **Details** page, select **Deploy** and then select **Serverless API with Azure AI Content Safety (preview)** to open the deployment wizard.

1. Select the project in which you want to deploy your models. To use the serverless API model deployment offering, your project must belong to one of the [regions that are supported for serverless deployment](deploy-models-serverless-availability.md) for the particular model.
@@ -244,14 +244,14 @@ Once you've created a subscription for a non-Microsoft model, you can deploy the

The serverless API endpoint provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance organizations need. This deployment option doesn't require quota from your subscription.

-In this article, you create an endpoint with the name **meta-llama3-8b-qwerty**.
+In this section, you create an endpoint with the name **meta-llama3-8b-qwerty**.

1. Create the serverless endpoint

# [AI Studio](#tab/azure-ai-studio)

1. To deploy a Microsoft model that doesn't require subscribing to a model offering:

-1. Select **Deploy** and then select **Serverless API with Azure AI Content Safety** to open the deployment wizard.
+1. Select **Deploy** and then select **Serverless API with Azure AI Content Safety (preview)** to open the deployment wizard.

1. Select the project in which you want to deploy your model. Notice that not all the regions are supported.

1. Alternatively, for a non-Microsoft model that requires a model subscription, if you've just subscribed your project to the model offer in the previous section, continue to select **Deploy**. Alternatively, select **Continue to deploy** (if your deployment wizard had the note *You already have an Azure Marketplace subscription for this project*).
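For readers who prefer automation over the studio wizard described in this hunk, here is a minimal sketch of creating the **meta-llama3-8b-qwerty** endpoint with the Azure Machine Learning Python SDK (`azure-ai-ml`). The subscription, resource group, project/workspace name, and the catalog model ID are placeholder assumptions for illustration, not values taken from this diff.

```python
from azure.identity import DefaultAzureCredential
from azure.ai.ml import MLClient
from azure.ai.ml.entities import ServerlessEndpoint

# Placeholder scope values -- point the client at the project/workspace
# where the serverless deployment should live.
client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<project-or-workspace-name>",
)

# Assumed catalog model ID for the Meta-Llama-3-8B-Instruct example.
model_id = "azureml://registries/azureml-meta/models/Meta-Llama-3-8B-Instruct"

# Create the serverless API endpoint named in the article.
endpoint = ServerlessEndpoint(name="meta-llama3-8b-qwerty", model_id=model_id)
endpoint = client.serverless_endpoints.begin_create_or_update(endpoint).result()
print(endpoint.scoring_uri)
```

Note that for non-Microsoft offers, the Azure Marketplace subscription step covered later in this diff still has to happen before the endpoint can be created.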
In this article, you learn how to deploy a model from the model catalog as a serverless API with pay-as-you-go token-based billing.

-Certain models in the model catalog can be deployed as a serverless API with pay-as-you-go billing. This kind of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance that organizations need. This deployment option doesn't require quota from your subscription.
+[Certain models in the model catalog](concept-endpoint-serverless-availability.md) can be deployed as a serverless API with pay-as-you-go billing. This kind of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance that organizations need. This deployment option doesn't require quota from your subscription.

## Prerequisites
@@ -82,18 +83,15 @@ Certain models in the model catalog can be deployed as a serverless API with pay

You can use any compatible web browser to [deploy ARM templates](../azure-resource-manager/templates/deploy-portal.md) in the Microsoft Azure portal or using any of the deployment tools. This tutorial uses the [Azure CLI](/cli/azure/).

-## Subscribe your workspace to the model offering
-
-For models offered through the Azure Marketplace, you can deploy them to serverless API endpoints to consume their predictions. If it's your first time deploying the model in the workspace, you have to subscribe your workspace for the particular model offering from the Azure Marketplace. Each workspace has its own subscription to the particular Azure Marketplace offering of the model, which allows you to control and monitor spending.
-
-> [!NOTE]
-> Models offered through the Azure Marketplace are available for deployment to serverless API endpoints in specific regions. Check [Region availability for models in Serverless API endpoints](concept-endpoint-serverless-availability.md) to verify which regions are available. If the one you need is not listed, you can deploy to a workspace in a supported region and then [consume serverless API endpoints from a different workspace](how-to-connect-models-serverless.md).
+## Find your model and model ID in the model catalog

1. Sign in to [Azure Machine Learning studio](https://ml.azure.com)

-1. Ensure your account has the **Azure AI Developer** role permissions on the resource group, or that you meet the [permissions required to subscribe to model offerings](#permissions-required-to-subscribe-to-model-offerings).
+1. For models offered through the Azure Marketplace, ensure that your account has the **Azure AI Developer** role permissions on the resource group, or that you meet the [permissions required to subscribe to model offerings](#permissions-required-to-subscribe-to-model-offerings).
+
+Models that are offered by non-Microsoft providers (for example, Llama and Mistral models) are billed through the Azure Marketplace. For such models, you're required to subscribe your project to the particular model offering. Models that are offered by Microsoft (for example, Phi-3 models) don't have this requirement, as billing is done differently. For details about billing for serverless deployment of models in the model catalog, see [Billing for serverless APIs](concept-model-catalog.md#pay-for-model-usage-in-maas).

-1. Go to your workspace.
+1. Go to your workspace. To use the serverless API model deployment offering, your workspace must belong to one of the [regions that are supported for serverless deployment](concept-endpoint-serverless-availability.md) for the particular model you want to deploy.

1. Select **Model catalog** from the left sidebar and find the model card of the model you want to deploy. In this article, you select a **Meta-Llama-3-8B-Instruct** model.
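The renamed section above is about locating a model and its model ID in the catalog. As a hedged illustration, catalog models are published in Azure Machine Learning registries, so the ID can also be read programmatically; the registry name `azureml-meta` and the `latest` label below are assumptions based on the **Meta-Llama-3-8B-Instruct** example, not part of this diff.

```python
from azure.identity import DefaultAzureCredential
from azure.ai.ml import MLClient

# Client scoped to the registry that hosts the catalog model (assumed: azureml-meta).
registry = MLClient(credential=DefaultAzureCredential(), registry_name="azureml-meta")

# Fetch the catalog entry and print the model ID that the later subscription
# and deployment steps reference.
model = registry.models.get(name="Meta-Llama-3-8B-Instruct", label="latest")
print(model.id)
```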
@@ -104,12 +102,20 @@ For models offered through the Azure Marketplace, you can deploy them to serverl
The next section covers the steps for subscribing your project to a model offering. You can skip this section and go to [Deploy the model to a serverless API endpoint](#deploy-the-model-to-a-serverless-api-endpoint), if you're deploying a Microsoft model.
106
+
107
+
## Subscribe your project to the model offering
108
+
109
+
For non-Microsoft models offered through the Azure Marketplace, you can deploy them to serverless API endpoints to consume their predictions. If it's your first time deploying the model in the project, you have to subscribe your workspace for the particular model offering from the Azure Marketplace. Each workspace has its own subscription to the particular Azure Marketplace offering of the model, which allows you to control and monitor spending.
110
+
111
+
> [!NOTE]
112
+
> Models offered through the Azure Marketplace are available for deployment to serverless API endpoints in specific regions. Check [Region availability for models in serverless API endpoints](concept-endpoint-serverless-availability.md) to verify which models and regions are available. If the one you need is not listed, you can deploy to a workspace in a supported region and then [consume serverless API endpoints from a different workspace](how-to-connect-models-serverless.md).
107
113
108
114
1. Create the model's marketplace subscription. When you create a subscription, you accept the terms and conditions associated with the model offer.
109
115
110
116
# [Studio](#tab/azure-studio)
111
117
112
-
1. On the model's **Details** page, select **Deploy** and then select **Serverless API** to open the deployment wizard.
118
+
1. On the model's **Details** page, select **Deploy** and then select **Serverless API with Azure AI Content Safety (preview)** to open the deployment wizard.
113
119
114
120
1. Select the checkbox to acknowledge the Microsoft purchase policy.
115
121
@@ -194,7 +200,7 @@ For models offered through the Azure Marketplace, you can deploy them to serverl

    }
    ```

-1. Once you sign up the workspace for the particular Azure Marketplace offering, subsequent deployments of the same offering in the same workspace don't require subscribing again.
+1. Once you subscribe the workspace for the particular Azure Marketplace offering, subsequent deployments of the same offering in the same workspace don't require subscribing again.

1. At any point, you can see the model offers to which your workspace is currently subscribed:
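The marketplace-subscription step shown in this hunk through the studio wizard can also be sketched with the `azure-ai-ml` SDK. Everything below is illustrative: the model ID, the subscription name, and the workspace scope are placeholders, and accepting the offer's terms is implied by creating the subscription.

```python
from azure.identity import DefaultAzureCredential
from azure.ai.ml import MLClient
from azure.ai.ml.entities import MarketplaceSubscription

client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace-name>",
)

# Subscribe the workspace to the non-Microsoft model offer (placeholder model ID).
model_id = "azureml://registries/azureml-meta/models/Meta-Llama-3-8B-Instruct"
subscription = MarketplaceSubscription(model_id=model_id, name="Meta-Llama-3-8B-Instruct")
client.marketplace_subscriptions.begin_create_or_update(subscription).result()

# List the model offers the workspace is already subscribed to.
for existing in client.marketplace_subscriptions.list():
    print(existing.name, existing.model_id)
```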
@@ -236,15 +242,19 @@ For models offered through the Azure Marketplace, you can deploy them to serverl

## Deploy the model to a serverless API endpoint

-Once you've created a model's subscription, you can deploy the associated model to a serverless API endpoint. The serverless API endpoint provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance organizations need. This deployment option doesn't require quota from your subscription.
+Once you've created a subscription for a non-Microsoft model, you can deploy the associated model to a serverless API endpoint. For Microsoft models (such as Phi-3 models), you don't need to create a subscription.

-In this article, you create an endpoint with name **meta-llama3-8b-qwerty**.
+The serverless API endpoint provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance organizations need. This deployment option doesn't require quota from your subscription.
+
+In this section, you create an endpoint with the name **meta-llama3-8b-qwerty**.

1. Create the serverless endpoint

# [Studio](#tab/azure-studio)

-1. From the previous wizard, select **Deploy** (if you've just subscribed the workspace to the model offer in the previous section), or select **Continue to deploy** (if your deployment wizard had the note *You already have an Azure Marketplace subscription for this workspace*).
+1. To deploy a Microsoft model that doesn't require subscribing to a model offering, select **Deploy** and then select **Serverless API with Azure AI Content Safety (preview)** to open the deployment wizard.
+
+1. Alternatively, for a non-Microsoft model that requires a model subscription, if you've just subscribed your project to the model offer in the previous section, continue to select **Deploy**. Alternatively, select **Continue to deploy** (if your deployment wizard had the note *You already have an Azure Marketplace subscription for this workspace*).

:::image type="content" source="media/how-to-deploy-models-serverless/deploy-pay-as-you-go-subscribed-workspace.png" alt-text="A screenshot showing a workspace that is already subscribed to the offering." lightbox="media/how-to-deploy-models-serverless/deploy-pay-as-you-go-subscribed-workspace.png":::
@@ -422,11 +432,11 @@ In this article, you create an endpoint with name **meta-llama3-8b-qwerty**.

> [!TIP]
> If you're using prompt flow in the same workspace where the deployment was deployed, you still need to create the connection.

-## Using the serverless API endpoint
+## Use the serverless API endpoint

Models deployed in Azure Machine Learning and Azure AI studio in Serverless API endpoints support the [Azure AI Model Inference API](reference-model-inference-api.md) that exposes a common set of capabilities for foundational models and that can be used by developers to consume predictions from a diverse set of models in a uniform and consistent way.

-Read more about the [capabilities of this API](reference-model-inference-api.md#capabilities) and how [you can leverage it when building applications](reference-model-inference-api.md#getting-started).
+Read more about the [capabilities of this API](reference-model-inference-api.md#capabilities) and how [you can use it when building applications](reference-model-inference-api.md#getting-started).

## Delete endpoints and subscriptions
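Because the renamed section points at the Azure AI Model Inference API, a brief consumption sketch may help; it uses the `azure-ai-inference` Python package, with the endpoint URL and key left as placeholders to fill in from the deployment's details page or from the key-retrieval sketch above.

```python
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

# Placeholder endpoint URL and key for the serverless deployment.
client = ChatCompletionsClient(
    endpoint="https://<endpoint-name>.<region>.models.ai.azure.com",
    credential=AzureKeyCredential("<endpoint-key>"),
)

# Send a chat completion request through the common Azure AI Model Inference API.
response = client.complete(
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Summarize what a serverless API deployment is."),
    ],
    max_tokens=128,
)
print(response.choices[0].message.content)
```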
@@ -501,15 +511,22 @@ az resource delete --name <resource-name>

## Cost and quota considerations for models deployed as serverless API endpoints

-Models deployed as a serverless API endpoint are offered through the Azure Marketplace and integrated with Azure Machine Learning for use. You can find the Azure Marketplace pricing when deploying or fine-tuning the models.
+Quota is managed per deployment. Each deployment has a rate limit of 200,000 tokens per minute and 1,000 API requests per minute. However, we currently limit one deployment per model per workspace. Contact Microsoft Azure Support if the current rate limits aren't sufficient for your scenarios.
+
+#### Cost for Microsoft models
+
+You can find the pricing information on the __Pricing and terms__ tab of the deployment wizard when deploying Microsoft models (such as Phi-3 models) as serverless API endpoints.
+
+#### Cost for non-Microsoft models
+
+Non-Microsoft models deployed as serverless API endpoints are offered through the Azure Marketplace and integrated with Azure AI Studio for use. You can find the Azure Marketplace pricing when deploying or fine-tuning these models.

Each time a workspace subscribes to a given offer from the Azure Marketplace, a new resource is created to track the costs associated with its consumption. The same resource is used to track costs associated with inference and fine-tuning; however, multiple meters are available to track each scenario independently.

For more information on how to track costs, see [Monitor costs for models offered through the Azure Marketplace](../ai-studio/how-to/costs-plan-manage.md#monitor-costs-for-models-offered-through-the-azure-marketplace).

:::image type="content" source="media/how-to-deploy-models-serverless/costs-model-as-service-cost-details.png" alt-text="A screenshot showing different resources corresponding to different model offers and their associated meters." lightbox="media/how-to-deploy-models-serverless/costs-model-as-service-cost-details.png":::

-Quota is managed per deployment. Each deployment has a rate limit of 200,000 tokens per minute and 1,000 API requests per minute. However, we currently limit one deployment per model per workspace. Contact Microsoft Azure Support if the current rate limits aren't sufficient for your scenarios.

## Permissions required to subscribe to model offerings