
Commit 7765d29

Maas terminology updates

1 parent 7c9fee9 commit 7765d29

11 files changed: +48 -49 lines

articles/machine-learning/concept-data-privacy.md

Lines changed: 8 additions & 8 deletions
@@ -26,25 +26,25 @@ When you deploy models in Azure Machine Learning, the following types of data ar
 ## Generate inferencing outputs with real-time endpoints

-Deploying models to managed online endpoints deploys model weights to dedicated Virtual Machines and exposes a REST API for real-time inference. Learn more about deploying models from the [Model Catalog to real-time endpoints](concept-model-catalog.md). You manage the infrastructure for these real-time endpoints, and Azures data, privacy, and security commitments apply. Learn more about [Azure compliance offerings](https://servicetrust.microsoft.com/DocumentPage/7adf2d9e-d7b5-4e71-bad8-713e6a183cf3) applicable to Azure Machine Learning.
+Deploying models to managed compute deploys model weights to dedicated Virtual Machines and exposes a REST API for real-time inference. Learn more about deploying models from the [Model Catalog to real-time endpoints](concept-model-catalog.md). You manage the infrastructure for these real-time endpoints, and Azure's data, privacy, and security commitments apply. Learn more about [Azure compliance offerings](https://servicetrust.microsoft.com/DocumentPage/7adf2d9e-d7b5-4e71-bad8-713e6a183cf3) applicable to Azure Machine Learning.

-Although containers for models Curated by Azure AI have been scanned for vulnerabilities that could exfiltrate data, not all models available through the Model Catalog have been scanned. To reduce the risk of data exfiltration, you can protect your deployment using virtual networks. Follow this link to [learn more](./how-to-network-isolation-model-catalog.md). You can also use [Azure Policy](./how-to-regulate-registry-deployments.md) to regulate the models that can be deployed by your users.
+Although containers for models "Curated by Azure AI" have been scanned for vulnerabilities that could exfiltrate data, not all models available through the model catalog have been scanned. To reduce the risk of data exfiltration, you can protect your deployment using virtual networks. Follow this link to [learn more](./how-to-network-isolation-model-catalog.md). You can also use [Azure Policy](./how-to-regulate-registry-deployments.md) to regulate the models that can be deployed by your users.

 :::image type="content" source="media/concept-data-privacy/platform-service.png" alt-text="A diagram showing the platform service life cycle." lightbox="media/concept-data-privacy/platform-service.png":::

-## Generate inferencing outputs with pay-as-you-go deployments (Models-as-a-Service)
+## Generate inferencing outputs with serverless APIs (Models-as-a-Service)

-When you deploy a model from the Model Catalog (base or finetuned) using pay-as-you-go deployments for inferencing, an API is provisioned giving you access to the model hosted and managed by the Azure Machine Learning Service. Learn more about [Models-as-a-Service](concept-model-catalog.md). The model processes your input prompts and generates outputs based on the functionality of the model, as described in the model details provided for the model. While the model is provided by the model provider, and your use of the model (and the model providers accountability for the model and its outputs) is subject to the license terms provided with the model, Microsoft provides and manages the hosting infrastructure and API endpoint. The models hosted in Models-as-a-Service are subject to Azures data, privacy, and security commitments. Learn more about Azure compliance offerings applicable to Azure Machine Learning [here](https://servicetrust.microsoft.com/DocumentPage/7adf2d9e-d7b5-4e71-bad8-713e6a183cf3).
+When you deploy a model from the model catalog (base or finetuned) as a serverless API for inferencing, an API is provisioned giving you access to the model hosted and managed by the Azure Machine Learning Service. Learn more about [Models-as-a-Service](concept-model-catalog.md). The model processes your input prompts and generates outputs based on the functionality of the model, as described in the model details provided for the model. While the model is provided by the model provider, and your use of the model (and the model provider's accountability for the model and its outputs) is subject to the license terms provided with the model, Microsoft provides and manages the hosting infrastructure and API endpoint. The models hosted in Models-as-a-Service are subject to Azure's data, privacy, and security commitments. Learn more about Azure compliance offerings applicable to Azure Machine Learning [here](https://servicetrust.microsoft.com/DocumentPage/7adf2d9e-d7b5-4e71-bad8-713e6a183cf3).

-Microsoft acts as the data processor for prompts and outputs sent to and generated by a model deployed for pay-as-you-go inferencing (MaaS). Microsoft does not share these prompts and outputs with the model provider, and Microsoft does not use these prompts and outputs to train or improve Microsofts, the model providers, or any third partys models. Models are stateless and no prompts or outputs are stored in the model. If content filtering is enabled, prompts and outputs are screened for certain categories of harmful content by the Azure AI Content Safety service in real time; learn more about how Azure AI Content Safety processes data [here](/legal/cognitive-services/content-safety/data-privacy). Prompts and outputs are processed within the geography specified during deployment but may be processed between regions within the geography for operational purposes (including performance and capacity management).
+Microsoft acts as the data processor for prompts and outputs sent to and generated by a model deployed for pay-as-you-go inferencing (MaaS). Microsoft does not share these prompts and outputs with the model provider, and Microsoft does not use these prompts and outputs to train or improve Microsoft's, the model provider's, or any third party's models. Models are stateless and no prompts or outputs are stored in the model. If content filtering is enabled, prompts and outputs are screened for certain categories of harmful content by the Azure AI Content Safety service in real time; learn more about how Azure AI Content Safety processes data [here](/legal/cognitive-services/content-safety/data-privacy). Prompts and outputs are processed within the geography specified during deployment but may be processed between regions within the geography for operational purposes (including performance and capacity management).

 :::image type="content" source="media/concept-data-privacy/model-publisher-cycle.png" alt-text="A diagram showing model publisher service cycle." lightbox="media/concept-data-privacy/model-publisher-cycle.png":::

 As explained during the deployment process for Models-as-a-Service, Microsoft may share customer contact information and transaction details (including usage volume associated with the offering) with the model publisher so that they can contact customers regarding the model. Learn more about information available to model publishers, [follow this link](/partner-center/analytics).

-## Finetune a model for pay-as-you-go deployment (Models-as-a-Service)
+## Finetune a model with serverless APIs (Models-as-a-Service)

-If a model available for pay-as-you-go deployment (MaaS) supports finetuning, you can upload data to (or designate data already in) an [Azure Machine Learning Datastore](./concept-data.md) to finetune the model. You can then create a pay-as-you-go deployment for the finetuned model. The finetuned model can't be downloaded, but the finetuned model:
+If a model available for serverless API deployment supports finetuning, you can upload data to (or designate data already in) an [Azure Machine Learning Datastore](./concept-data.md) to finetune the model. You can then create a serverless API for the finetuned model. The finetuned model can't be downloaded, but the finetuned model:

 * Is available exclusively for your use;
@@ -56,7 +56,7 @@ Training data uploaded for finetuning isn't used to train, retrain, or improve a
 ## Data processing for downloaded models

-If you download a model from the Model Catalog, you choose where to deploy the model, and you're responsible for how data is processed when you use the model.
+If you download a model from the model catalog, you choose where to deploy the model, and you're responsible for how data is processed when you use the model.

 ## Next steps

articles/machine-learning/how-to-deploy-models-cohere-command.md

Lines changed: 9 additions & 9 deletions
@@ -15,7 +15,7 @@ ms.custom: [references_regions]
 #This functionality is also available in Azure AI Studio: /azure/ai-studio/how-to/deploy-models-cohere.md
 ---
 # How to deploy Cohere Command models with Azure Machine Learning studio
-Cohere offers two Command models in Azure Machine Learning studio. These models are available with pay-as-you-go token based billing with Models as a Service.
+Cohere offers two Command models in Azure Machine Learning studio. These models are available as serverless APIs with pay-as-you-go token-based billing.

 * Cohere Command R
 * Cohere Command R+
@@ -24,7 +24,7 @@ You can browse the Cohere family of models in the model catalog by filtering on
 ## Models

-In this article, you learn how to use Azure Machine Learning studio to deploy the Cohere Command models as a service with pay-as-you-go billing.
+In this article, you learn how to use Azure Machine Learning studio to deploy the Cohere Command models as a serverless API with pay-as-you-go billing.

 ### Cohere Command R
 Command R is a highly performant generative large language model, optimized for a variety of use cases including reasoning, summarization, and question answering.
@@ -61,11 +61,11 @@ Pre-training data additionally included the following 13 languages: Russian, Pol
 [!INCLUDE [machine-learning-preview-generic-disclaimer](includes/machine-learning-preview-generic-disclaimer.md)]

-## Deploy with pay-as-you-go
+## Deploy as a serverless API

-Certain models in the model catalog can be deployed as a service with pay-as-you-go, providing a way to consume them as an API without hosting them on your subscription, while keeping the enterprise security and compliance organizations need. This deployment option doesn't require quota from your subscription.
+Certain models in the model catalog can be deployed as a serverless API with pay-as-you-go billing. This method of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance organizations need. This deployment option doesn't require quota from your subscription.

-The previously mentioned Cohere models can be deployed as a service with pay-as-you-go, and are offered by Cohere through the Microsoft Azure Marketplace. Cohere can change or update the terms of use and pricing of this model.
+The previously mentioned Cohere models can be deployed as a serverless API with pay-as-you-go, and are offered by Cohere through the Microsoft Azure Marketplace. Cohere can change or update the terms of use and pricing of this model.

 ### Prerequisites
@@ -84,12 +84,12 @@ The previously mentioned Cohere models can be deployed as a service with pay-as-
 To create a deployment:

 1. Go to [Azure Machine Learning studio](https://ml.azure.com/home).
-1. Select the workspace in which you want to deploy your models. To use the pay-as-you-go model deployment offering, your workspace must belong to the EastUS2 or Sweden Central region.
+1. Select the workspace in which you want to deploy your models. To use the serverless API deployment offering, your workspace must belong to the EastUS2 or Sweden Central region.
 1. Choose the model you want to deploy from the [model catalog](https://ml.azure.com/model/catalog).

 Alternatively, you can initiate deployment by going to your workspace and selecting **Endpoints** > **Serverless endpoints** > **Create**.

-1. On the model's overview page in the model catalog, select **Deploy** and then **Pay-as-you-go**.
+1. On the model's overview page in the model catalog, select **Deploy**.

 :::image type="content" source="media/how-to-deploy-models-cohere-command/command-r-deploy-pay-as-you-go.png" alt-text="A screenshot showing how to deploy a model with the pay-as-you-go option." lightbox="media/how-to-deploy-models-cohere-command/command-r-deploy-pay-as-you-go.png":::
@@ -115,7 +115,7 @@ To create a deployment:
 To learn about billing for models deployed with pay-as-you-go, see [Cost and quota considerations for Cohere models deployed as a service](#cost-and-quota-considerations-for-models-deployed-as-a-service).

-### Consume the models as a service
+### Consume the Cohere models as a service

 The previously mentioned Cohere models can be consumed using the chat API.
@@ -126,7 +126,7 @@ The previously mentioned Cohere models can be consumed using the chat API.
 For more information on using the APIs, see the [reference](#reference-for-cohere-models-deployed-as-a-service) section.

-## Reference for Cohere models deployed as a service
+## Reference for Cohere models deployed as a serverless API

 Cohere Command R and Command R+ models accept both the [Azure AI Model Inference API](reference-model-inference-api.md) on the route `/chat/completions` and the native [Cohere Chat API](#cohere-chat-api) on `/v1/chat`.
articles/machine-learning/how-to-deploy-models-cohere-embed.md

Lines changed: 7 additions & 8 deletions
@@ -16,7 +16,7 @@ ms.custom: [references_regions]
 ---

 # How to deploy Cohere Embed models with Azure Machine Learning studio
-Cohere offers two Embed models in Azure Machine Learning studio. These models are available with pay-as-you-go token based billing with Models as a Service.
+Cohere offers two Embed models in Azure Machine Learning studio. These models are available as serverless APIs with pay-as-you-go, token-based billing.

 * Cohere Embed v3 - English
 * Cohere Embed v3 - Multilingual
@@ -25,7 +25,7 @@ You can browse the Cohere family of models in the model catalog by filtering on
 ## Models

-In this article, you learn how to use Azure Machine Learning studio to deploy the Cohere models as a service with pay-as-you-go billing.
+In this article, you learn how to use Azure Machine Learning studio to deploy the Cohere models as a serverless API with pay-as-you-go billing.

 ### Cohere Embed v3 - English
 Cohere Embed English is the market's leading text representation model used for semantic search, retrieval-augmented generation (RAG), classification, and clustering. Embed English has top performance on the HuggingFace MTEB benchmark and performs well on various industries such as Finance, Legal, and General-Purpose Corpora.
@@ -41,9 +41,8 @@ Cohere Embed Multilingual is the market's leading text representation model used
 [!INCLUDE [machine-learning-preview-generic-disclaimer](includes/machine-learning-preview-generic-disclaimer.md)]

-## Deploy with pay-as-you-go
-
-Certain models in the model catalog can be deployed as a service with pay-as-you-go, providing a way to consume them as an API without hosting them on your subscription, while keeping the enterprise security and compliance organizations need. This deployment option doesn't require quota from your subscription.
+## Deploy as a serverless API
+Certain models in the model catalog can be deployed as a serverless API with pay-as-you-go billing, providing a way to consume them as an API without hosting them on your subscription, while keeping the enterprise security and compliance organizations need. This deployment option doesn't require quota from your subscription.

 The previously mentioned Cohere models can be deployed as a service with pay-as-you-go, and are offered by Cohere through the Microsoft Azure Marketplace. Cohere can change or update the terms of use and pricing of this model.
@@ -69,7 +68,7 @@ To create a deployment:
 Alternatively, you can initiate deployment by going to your workspace and selecting **Endpoints** > **Serverless endpoints** > **Create**.

-1. On the model's overview page in the model catalog, select **Deploy** and then **Pay-as-you-go**.
+1. On the model's overview page in the model catalog, select **Deploy**.

 :::image type="content" source="media/how-to-deploy-models-cohere-embed/embed-english-deploy-pay-as-you-go.png" alt-text="A screenshot showing how to deploy a model with the pay-as-you-go option." lightbox="media/how-to-deploy-models-cohere-embed/embed-english-deploy-pay-as-you-go.png":::
@@ -95,7 +94,7 @@ To create a deployment:
 To learn about billing for models deployed with pay-as-you-go, see [Cost and quota considerations for Cohere models deployed as a service](#cost-and-quota-considerations-for-models-deployed-as-a-service).

-### Consume the models as a service
+### Consume the models deployed as a serverless API

 The previously mentioned Cohere models can be consumed using the chat API.
@@ -106,7 +105,7 @@ The previously mentioned Cohere models can be consumed using the chat API.
 For more information on using the APIs, see the [reference](#embed-api-reference-for-cohere-embed-models-deployed-as-a-service) section.

-## Embed API reference for Cohere Embed models deployed as a service
+## Embed API reference for Cohere Embed models deployed as a serverless API

 Cohere Embed v3 - English and Embed v3 - Multilingual accept both the [Azure AI Model Inference API](reference-model-inference-api.md) on the route `/embeddings` (for text) and `/images/embeddings` (for images), and the native [Cohere Embed v3 API](#cohere-embed-v3) on `/embed`.