
Commit 3506255

Merge pull request #1279 from msakande/ignite-deployment-overview-update
Ignite updates for deployment overview article
2 parents 9171952 + 0ffd96d


1 file changed: +10 -11 lines changed


articles/ai-studio/concepts/deployments-overview.md

Lines changed: 10 additions & 11 deletions
@@ -28,12 +28,12 @@ Deployment options vary depending on the model type:
 
 Azure AI studio offers four different deployment options:
 
-|Name | Azure OpenAI Service | Azure AI model inference service | Serverless API | Managed compute |
+|Name | Azure OpenAI service | Azure AI model inference service | Serverless API | Managed compute |
 |-------------------------------|----------------------|-------------------|----------------|-----------------|
 | Which models can be deployed? | [Azure OpenAI models](../../ai-services/openai/concepts/models.md) | [Azure OpenAI models and Models as a Service](../ai-services/model-inference.md#models) | [Models as a Service](../how-to/model-catalog-overview.md#content-safety-for-models-deployed-via-serverless-apis) | [Open and custom models](../how-to/model-catalog-overview.md#availability-of-models-for-deployment-as-managed-compute) |
-| Deployment resource | Azure OpenAI service | Azure AI services | AI project | AI project |
+| Deployment resource | Azure OpenAI resource | Azure AI services resource | AI project resource | AI project resource |
 | Best suited when | You are planning to use only OpenAI models | You are planning to take advantage of the flagship models in Azure AI catalog, including OpenAI. | You are planning to use a single model from a specific provider (excluding OpenAI). | If you plan to use open models and you have enough compute quota available in your subscription. |
-| Billing bases | Token usage | Token usage | Token usage<sup>1</sup> | Compute core hours<sup>2</sup> |
+| Billing bases | Token usage & PTU | Token usage | Token usage<sup>1</sup> | Compute core hours<sup>2</sup> |
 | Deployment instructions | [Deploy to Azure OpenAI Service](../how-to/deploy-models-openai.md) | [Deploy to Azure AI model inference](../ai-services/how-to/create-model-deployments.md) | [Deploy to Serverless API](../how-to/deploy-models-serverless.md) | [Deploy to Managed compute](../how-to/deploy-models-managed.md) |
 
 <sup>1</sup> A minimal endpoint infrastructure is billed per minute. You aren't billed for the infrastructure that hosts the model in pay-as-you-go. After you delete the endpoint, no further charges accrue.
@@ -51,19 +51,18 @@ Azure AI studio encourages customers to explore the deployment options and pick
 
 2. When you are looking to use a specific model:
 
-    1. When you are interested in OpenAI models, use the Azure OpenAI Service which offers a wide range of capabilities for them and it's designed for them.
+    1. When you are interested in Azure OpenAI models, use the Azure OpenAI Service which offers a wide range of capabilities for them and it's designed for them.
 
     2. When you are interested in a particular model from Models as a Service, and you don't expect to use any other type of model, use [Serverless API endpoints](../how-to/deploy-models-serverless.md). They allow deployment of a single model under a unique set of endpoint URL and keys.
 
-    3. When your model is not available in Models as a Service and you have compute quota available in your subscription, use [Managed Compute](../how-to/deploy-models-managed.md) which support deployment of open and custom models. It also allows high level of customization of the deployment inference server, protocols, and detailed configuration.
+    3. When your model is not available in Models as a Service and you have compute quota available in your subscription, use [Managed Compute](../how-to/deploy-models-managed.md) which support deployment of open and custom models. It also allows high level of customization of the deployment inference server, protocols, and detailed configuration.
 
 > [!TIP]
-> Each deployment option may offer different capabilities in terms of networking, security, and additional features like content safety. Review the documentation for each of them to understand their limitations.
-
+> Each deployment option may offer different capabilities in terms of networking, security, and additional features like content safety. Review the documentation for each of them to understand their limitations.
 
 ## Related content
 
-- [Add and configure models to the Azure AI model inference service](../ai-services/how-to/create-model-deployments.md)
-- [Deploy Azure OpenAI models with Azure AI Studio](../how-to/deploy-models-openai.md)
-- [Deploy open models with Azure AI Studio](../how-to/deploy-models-open.md)
-- [Model catalog and collections in Azure AI Studio](../how-to/model-catalog-overview.md)
+* [Add and configure models to the Azure AI model inference service](../ai-services/how-to/create-model-deployments.md)
+* [Deploy Azure OpenAI models with Azure AI Studio](../how-to/deploy-models-openai.md)
+* [Deploy open models with Azure AI Studio](../how-to/deploy-models-open.md)
+* [Model catalog and collections in Azure AI Studio](../how-to/model-catalog-overview.md)
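
The "Azure OpenAI Service" column in the table above is consumed through a model deployment on an Azure OpenAI resource, billed by token usage (or PTU, if provisioned). A minimal sketch, assuming the official `openai` Python package (1.x) and a hypothetical deployment named `gpt-4o-mini`; the endpoint, key, and API version values are placeholders, not values from the article:

```python
# Minimal sketch: call a model deployed to an Azure OpenAI resource.
# Assumes the `openai` Python package (>=1.0) and a hypothetical deployment
# named "gpt-4o-mini"; endpoint, key, and API version are placeholders.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
)

# Billing for this option is based on tokens consumed by requests like this one.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # the *deployment* name on the resource, not the base model name
    messages=[{"role": "user", "content": "Summarize the deployment options in one sentence."}],
)
print(response.choices[0].message.content)
```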
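For the "Serverless API" column, each deployment exposes a single model behind its own endpoint URL and key, billed by token usage plus the minimal per-minute endpoint infrastructure noted in footnote 1. A minimal sketch of calling such an endpoint over plain HTTP, assuming it follows the Azure AI model inference chat-completions route; the endpoint URL, environment variable names, and route are placeholders to verify against the endpoint's details page:

```python
# Minimal sketch: call a model deployed as a serverless API endpoint using the
# endpoint URL and key created for that single model. The route and auth header
# below assume the chat-completions API; check the endpoint's details page.
import os

import requests

endpoint = os.environ["SERVERLESS_ENDPOINT"]  # e.g. https://<endpoint-name>.<region>.models.ai.azure.com
key = os.environ["SERVERLESS_KEY"]

response = requests.post(
    f"{endpoint}/chat/completions",
    headers={"Authorization": f"Bearer {key}", "Content-Type": "application/json"},
    json={"messages": [{"role": "user", "content": "Hello!"}], "max_tokens": 128},
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```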
