Commit c5a4e70

update with azureml links and terminology

1 parent 8825da8


articles/machine-learning/how-to-deploy-models-cohere-command.md

Lines changed: 24 additions & 25 deletions
@@ -17,11 +17,11 @@ zone_pivot_groups: azure-ai-model-catalog-samples-chat

# How to use Cohere Command chat models with Azure Machine Learning studio

- [!INCLUDE [Feature preview](~/reusable-content/ce-skilling/azure/includes/ai-studio/includes/feature-preview.md)]

In this article, you learn about Cohere Command chat models and how to use them.
The Cohere family of models includes various models optimized for different use cases, including chat completions, embeddings, and rerank. Cohere models are optimized for various use cases that include reasoning, summarization, and question answering.

+ [!INCLUDE [machine-learning-preview-generic-disclaimer](includes/machine-learning-preview-generic-disclaimer.md)]


::: zone pivot="programming-language-python"
@@ -109,18 +109,18 @@ The following models are available:

## Prerequisites

- To use Cohere Command chat models with Azure AI Studio, you need the following prerequisites:
+ To use Cohere Command chat models with Azure Machine Learning, you need the following prerequisites:

### A model deployment

**Deployment to serverless APIs**

Cohere Command chat models can be deployed to serverless API endpoints with pay-as-you-go billing. This kind of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance that organizations need.

- Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure AI Studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](deploy-models-serverless.md).
+ Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure Machine Learning studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](how-to-deploy-models-serverless.md).

> [!div class="nextstepaction"]
- > [Deploy the model to serverless API endpoints](deploy-models-serverless.md)
+ > [Deploy the model to serverless API endpoints](how-to-deploy-models-serverless.md)

### The inference package installed
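For context on the deployment step referenced in this hunk, a serverless API endpoint for a Cohere Command model can also be created with the Azure Machine Learning SDK for Python. The following is a minimal sketch, not the article's own sample; the workspace values, endpoint name, and registry model ID are illustrative assumptions.

```python
# Minimal sketch: create a serverless API endpoint with the Azure Machine Learning SDK for Python.
# The subscription, resource group, workspace, endpoint name, and model ID are placeholders.
from azure.ai.ml import MLClient
from azure.ai.ml.entities import ServerlessEndpoint
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace-name>",
)

# Model ID format for models in an azureml system registry (illustrative value).
endpoint = ServerlessEndpoint(
    name="cohere-command-endpoint",
    model_id="azureml://registries/azureml-cohere/models/Cohere-command-r-plus/versions/1",
)

created = ml_client.serverless_endpoints.begin_create_or_update(endpoint).result()
keys = ml_client.serverless_endpoints.get_keys(created.name)
print(created.scoring_uri, keys.primary_key)
```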

@@ -143,7 +143,7 @@ Read more about the [Azure AI inference package and reference](https://aka.ms/az
In this section, you use the [Azure AI model inference API](https://aka.ms/azureai/modelinference) with a chat completions model for chat.

> [!TIP]
- > The [Azure AI model inference API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure AI Studio with the same code and structure, including Cohere Command chat models.
+ > The [Azure AI model inference API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure Machine Learning with the same code and structure, including Cohere Command chat models.

### Create a client to consume the model
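For context on the client creation that this section goes on to describe, a minimal sketch with the azure-ai-inference Python package looks roughly like the following; the endpoint URL and key environment variables are assumptions, and the article's own samples may differ.

```python
# Minimal sketch: chat completion against a deployed Cohere Command serverless endpoint
# using the Azure AI model inference API (azure-ai-inference package).
# The endpoint URL and key environment variables are placeholders.
import os
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],  # e.g. https://<endpoint-name>.<region>.inference.ai.azure.com
    credential=AzureKeyCredential(os.environ["AZURE_INFERENCE_CREDENTIAL"]),
)

response = client.complete(
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="How many languages are in the world?"),
    ],
)

print(response.choices[0].message.content)
print("Prompt tokens:", response.usage.prompt_tokens)
```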

@@ -582,18 +582,18 @@ The following models are available:

## Prerequisites

- To use Cohere Command chat models with Azure AI Studio, you need the following prerequisites:
+ To use Cohere Command chat models with Azure Machine Learning, you need the following prerequisites:

### A model deployment

**Deployment to serverless APIs**

Cohere Command chat models can be deployed to serverless API endpoints with pay-as-you-go billing. This kind of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance that organizations need.

- Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure AI Studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](deploy-models-serverless.md).
+ Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure Machine Learning studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](how-to-deploy-models-serverless.md).

> [!div class="nextstepaction"]
- > [Deploy the model to serverless API endpoints](deploy-models-serverless.md)
+ > [Deploy the model to serverless API endpoints](how-to-deploy-models-serverless.md)

### The inference package installed

@@ -614,7 +614,7 @@ npm install @azure-rest/ai-inference
In this section, you use the [Azure AI model inference API](https://aka.ms/azureai/modelinference) with a chat completions model for chat.

> [!TIP]
- > The [Azure AI model inference API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure AI Studio with the same code and structure, including Cohere Command chat models.
+ > The [Azure AI model inference API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure Machine Learning with the same code and structure, including Cohere Command chat models.

### Create a client to consume the model

@@ -1069,18 +1069,18 @@ The following models are available:

## Prerequisites

- To use Cohere Command chat models with Azure AI Studio, you need the following prerequisites:
+ To use Cohere Command chat models with Azure Machine Learning, you need the following prerequisites:

### A model deployment

**Deployment to serverless APIs**

Cohere Command chat models can be deployed to serverless API endpoints with pay-as-you-go billing. This kind of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance that organizations need.

- Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure AI Studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](deploy-models-serverless.md).
+ Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure Machine Learning studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](how-to-deploy-models-serverless.md).

> [!div class="nextstepaction"]
- > [Deploy the model to serverless API endpoints](deploy-models-serverless.md)
+ > [Deploy the model to serverless API endpoints](how-to-deploy-models-serverless.md)

### The inference package installed
@@ -1124,7 +1124,7 @@ using System.Reflection;
In this section, you use the [Azure AI model inference API](https://aka.ms/azureai/modelinference) with a chat completions model for chat.

> [!TIP]
- > The [Azure AI model inference API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure AI Studio with the same code and structure, including Cohere Command chat models.
+ > The [Azure AI model inference API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure Machine Learning with the same code and structure, including Cohere Command chat models.

### Create a client to consume the model

@@ -1581,18 +1581,18 @@ The following models are available:

## Prerequisites

- To use Cohere Command chat models with Azure AI Studio, you need the following prerequisites:
+ To use Cohere Command chat models with Azure Machine Learning, you need the following prerequisites:

### A model deployment

**Deployment to serverless APIs**

Cohere Command chat models can be deployed to serverless API endpoints with pay-as-you-go billing. This kind of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance that organizations need.

- Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure AI Studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](deploy-models-serverless.md).
+ Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure Machine Learning studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](how-to-deploy-models-serverless.md).

> [!div class="nextstepaction"]
- > [Deploy the model to serverless API endpoints](deploy-models-serverless.md)
+ > [Deploy the model to serverless API endpoints](how-to-deploy-models-serverless.md)

### A REST client
@@ -1606,7 +1606,7 @@ Models deployed with the [Azure AI model inference API](https://aka.ms/azureai/m
In this section, you use the [Azure AI model inference API](https://aka.ms/azureai/modelinference) with a chat completions model for chat.

> [!TIP]
- > The [Azure AI model inference API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure AI Studio with the same code and structure, including Cohere Command chat models.
+ > The [Azure AI model inference API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure Machine Learning with the same code and structure, including Cohere Command chat models.

### Create a client to consume the model
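For context on the REST flow this hunk introduces, the same chat completion can be approximated with any HTTP client. The sketch below uses Python's requests library against the /chat/completions route of the Azure AI model inference API; the endpoint URL is a placeholder, and sending the key as a bearer token in the Authorization header is an assumption that should be verified against the article's REST samples.

```python
# Minimal sketch: call the Azure AI model inference API over REST with the requests library.
# Endpoint URL and key are placeholders; the bearer-token Authorization header is an assumption.
import os
import requests

endpoint = os.environ["AZURE_INFERENCE_ENDPOINT"]  # e.g. https://<endpoint-name>.<region>.inference.ai.azure.com
key = os.environ["AZURE_INFERENCE_CREDENTIAL"]

payload = {
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "How many languages are in the world?"},
    ],
}

resp = requests.post(
    f"{endpoint}/chat/completions",
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {key}",
    },
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```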
@@ -1898,7 +1898,7 @@ extra-parameters: pass-through

Cohere Command chat models support the use of tools, which can be an extraordinary resource when you need to offload specific tasks from the language model and instead rely on a more deterministic system or even a different language model. The Azure AI Model Inference API allows you to define tools in the following way.

- The following code example creates a tool definition that is able to look from flight information from two different cities.
+ The following code example creates a tool definition that is able to look for flight information from two different cities.


```json
@@ -2152,17 +2152,16 @@ For more examples of how to use Cohere models, see the following examples and tu

Quota is managed per deployment. Each deployment has a rate limit of 200,000 tokens per minute and 1,000 API requests per minute. However, we currently limit one deployment per model per project. Contact Microsoft Azure Support if the current rate limits aren't sufficient for your scenarios.

- Cohere models deployed as a serverless API are offered by Cohere through the Azure Marketplace and integrated with Azure AI Studio for use. You can find the Azure Marketplace pricing when deploying the model.
+ Cohere models deployed as a serverless API are offered by Cohere through the Azure Marketplace and integrated with Azure Machine Learning studio for use. You can find the Azure Marketplace pricing when deploying the model.

Each time a project subscribes to a given offer from the Azure Marketplace, a new resource is created to track the costs associated with its consumption. The same resource is used to track costs associated with inference; however, multiple meters are available to track each scenario independently.

- For more information on how to track costs, see [Monitor costs for models offered through the Azure Marketplace](costs-plan-manage.md#monitor-costs-for-models-offered-through-the-azure-marketplace).
+ For more information on how to track costs, see [Monitor costs for models offered through the Azure Marketplace](/azure/ai-studio/how-to/costs-plan-manage#monitor-costs-for-models-offered-through-the-azure-marketplace).

## Related content


- * [Azure AI Model Inference API](../reference/reference-model-inference-api.md)
- * [Deploy models as serverless APIs](deploy-models-serverless.md)
- * [Consume serverless API endpoints from a different Azure AI Studio project or hub](deploy-models-serverless-connect.md)
- * [Region availability for models in serverless API endpoints](deploy-models-serverless-availability.md)
- * [Plan and manage costs (marketplace)](costs-plan-manage.md#monitor-costs-for-models-offered-through-the-azure-marketplace)
+ * [Azure AI Model Inference API](reference-model-inference-api.md)
+ * [Deploy models as serverless APIs](how-to-deploy-models-serverless.md)
+ * [Region availability for models in serverless API endpoints](concept-endpoint-serverless-availability.md)
+ * [Plan and manage costs for Azure AI Studio](concept-plan-manage-cost.md)
