
Commit 56dc742

Update deployments-overview.md
1 parent 9df29af commit 56dc742

File tree

1 file changed: +6 -4 lines changed


articles/ai-foundry/concepts/deployments-overview.md

Lines changed: 6 additions & 4 deletions
@@ -21,18 +21,20 @@ The model catalog in Azure AI Foundry portal is the hub to discover and use a wi

## Deploying models

-Deployment options vary depending on the model type:
+Deployment options vary depending on the model offering:

-* **Azure OpenAI models:** The latest OpenAI models that have enterprise features from Azure.
-* **Models as a Service models:** These models don't require compute quota from your subscription. This option allows you to deploy your Model as a Service (MaaS). You use a serverless API deployment and are billed per token in a pay-as-you-go fashion.
-* **Open and custom models:** The model catalog offers access to a large variety of models across modalities that are of open access. You can host open models in your own subscription with a managed infrastructure, virtual machines, and the number of instances for capacity management. There's a wide range of models from Azure OpenAI, Hugging Face, and NVIDIA.
+* **Azure OpenAI models:** The latest OpenAI models that have enterprise features from Azure with flexible billing options.
+* **Models-as-a-Service models:** These models don't require compute quota from your subscription and are billed per token in a pay-as-you-go fashion.
+* **Open and custom models:** The model catalog offers access to a large variety of models across modalities, including models of open access. You can host open models in your own subscription with a managed infrastructure, virtual machines, and the number of instances for capacity management.

Azure AI Foundry offers four different deployment options:

| Name | Azure OpenAI service | Azure AI model inference | Serverless API | Managed compute |
|------|----------------------|--------------------------|----------------|-----------------|
| Which models can be deployed? | [Azure OpenAI models](../../ai-services/openai/concepts/models.md) | [Azure OpenAI models and Models as a Service](../../ai-foundry/model-inference/concepts/models.md) | [Models as a Service](../how-to/model-catalog-overview.md#content-safety-for-models-deployed-via-serverless-apis) | [Open and custom models](../how-to/model-catalog-overview.md#availability-of-models-for-deployment-as-managed-compute) |
| Deployment resource | Azure OpenAI resource | Azure AI services resource | AI project resource | AI project resource |
+| Requires Hubs/Projects | No | No | Yes | Yes |
+| Data processing options | Regional, Data-zone, and Global | Global | Regional | Regional |
| Best suited when | You plan to use only OpenAI models. | You plan to take advantage of the flagship models in the Azure AI catalog, including OpenAI models. | You plan to use a single model from a specific provider (excluding OpenAI). | You plan to use open models and have enough compute quota available in your subscription. |
| Billing bases | Token usage & PTU | Token usage | Token usage<sup>1</sup> | Compute core hours<sup>2</sup> |
| Deployment instructions | [Deploy to Azure OpenAI Service](../how-to/deploy-models-openai.md) | [Deploy to Azure AI model inference](../model-inference/how-to/create-model-deployments.md) | [Deploy to Serverless API](../how-to/deploy-models-serverless.md) | [Deploy to Managed compute](../how-to/deploy-models-managed.md) |
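
Whichever option you pick, the resulting deployment is consumed over an HTTPS endpoint. As a rough illustrative sketch only, the snippet below calls a chat model behind an Azure AI model inference or serverless API deployment using the `azure-ai-inference` Python package; the endpoint URL, API key, and deployment name are placeholders, and the exact endpoint format depends on the deployment option you chose.

```python
# pip install azure-ai-inference
# Minimal sketch: call a chat model deployed in Azure AI Foundry.
# The endpoint, key, and deployment name are placeholders, not real values.
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint="https://<your-resource>.services.ai.azure.com/models",  # placeholder endpoint
    credential=AzureKeyCredential("<your-api-key>"),                  # placeholder key
)

response = client.complete(
    # "model" selects a deployment when the endpoint hosts several; it can be
    # omitted for a serverless API endpoint that serves a single model.
    model="<your-deployment-name>",
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Summarize the deployment options in Azure AI Foundry."),
    ],
)

print(response.choices[0].message.content)
```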
