## Deploying models
Deployment options vary depending on the model offering:
* **Azure OpenAI models:** The latest OpenAI models that have enterprise features from Azure, with flexible billing options.
* **Models-as-a-Service models:** These models don't require compute quota from your subscription and are billed per token in a pay-as-you-go fashion.
* **Open and custom models:** The model catalog offers access to a large variety of models across modalities, including models of open access. You can host open models in your own subscription with a managed infrastructure, virtual machines, and the number of instances for capacity management.
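To make the pay-as-you-go billing of serverless API deployments concrete, here is a minimal sketch that estimates a per-token bill from input and output token counts. The per-1K-token prices are hypothetical placeholders for illustration only, not real Azure rates.

```python
# Hypothetical per-1K-token prices; real rates vary by model and region.
PRICE_PER_1K_INPUT = 0.0005
PRICE_PER_1K_OUTPUT = 0.0015

def estimate_serverless_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate a pay-as-you-go bill: serverless API deployments charge
    per token consumed, with no compute quota reserved up front."""
    cost = (input_tokens / 1000) * PRICE_PER_1K_INPUT
    cost += (output_tokens / 1000) * PRICE_PER_1K_OUTPUT
    return round(cost, 6)

# Example: 120K prompt tokens and 40K completion tokens in a billing period.
print(estimate_serverless_cost(120_000, 40_000))  # → 0.12
```

Because billing is metered per token rather than per reserved instance, cost scales directly with usage, which is why this option suits workloads with variable or unpredictable traffic.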
Azure AI Foundry offers four different deployment options:
| Name | Azure OpenAI service | Azure AI model inference | Serverless API | Managed compute |
|--|--|--|--|--|
| Which models can be deployed? |[Azure OpenAI models](../../ai-services/openai/concepts/models.md)|[Azure OpenAI models and Models as a Service](../../ai-foundry/model-inference/concepts/models.md)|[Models as a Service](../how-to/model-catalog-overview.md#content-safety-for-models-deployed-via-serverless-apis)|[Open and custom models](../how-to/model-catalog-overview.md#availability-of-models-for-deployment-as-managed-compute)|
| Deployment resource | Azure OpenAI resource | Azure AI services resource | AI project resource | AI project resource |
| Requires Hubs/Projects | No | No | Yes | Yes |
| Data processing options | Regional, Data-zone, and Global | Global | Regional | Regional |
| Best suited when | You plan to use only OpenAI models. | You plan to take advantage of the flagship models in the Azure AI catalog, including OpenAI models. | You plan to use a single model from a specific provider (excluding OpenAI). | You plan to use open models and have enough compute quota available in your subscription. |
| Deployment instructions |[Deploy to Azure OpenAI Service](../how-to/deploy-models-openai.md)|[Deploy to Azure AI model inference](../model-inference/how-to/create-model-deployments.md)|[Deploy to Serverless API](../how-to/deploy-models-serverless.md)|[Deploy to Managed compute](../how-to/deploy-models-managed.md)|
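As a rough illustration of the table's "Best suited when" row, the following helper maps a few requirements onto one of the four deployment options. The flag names and the rule ordering are a simplification invented for this sketch, not an official API.

```python
def pick_deployment_option(openai_only: bool,
                           single_non_openai_provider: bool,
                           open_models_with_quota: bool) -> str:
    """Map the 'Best suited when' guidance from the comparison table
    onto one of the four Azure AI Foundry deployment options."""
    if openai_only:
        return "Azure OpenAI service"
    if single_non_openai_provider:
        return "Serverless API"
    if open_models_with_quota:
        return "Managed compute"
    # Default: flagship models across the catalog, including OpenAI.
    return "Azure AI model inference"

print(pick_deployment_option(True, False, False))   # → Azure OpenAI service
print(pick_deployment_option(False, True, False))   # → Serverless API
print(pick_deployment_option(False, False, True))   # → Managed compute
```

In practice the choice also depends on the other table rows, such as whether your workload needs hubs/projects or specific data processing options.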