In this article, you learn about Cohere Command chat models and how to use them.
The Cohere family of models includes models optimized for various use cases, including chat completions, embeddings, and rerank. They're tuned for scenarios such as reasoning, summarization, and question answering.
The following models are available:
## Prerequisites
To use Cohere Command chat models with Azure Machine Learning, you need the following prerequisites:
### A model deployment
**Deployment to serverless APIs**
Cohere Command chat models can be deployed to serverless API endpoints with pay-as-you-go billing. This kind of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance that organizations need.
Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure Machine Learning studio, the Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](how-to-deploy-models-serverless.md).
> [!div class="nextstepaction"]
> [Deploy the model to serverless API endpoints](how-to-deploy-models-serverless.md)
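
The SDK route mentioned above can be sketched as follows. This is a hedged sketch, not a definitive recipe: it assumes the `azure-ai-ml` package exposes serverless-endpoint operations in your installed version, and the environment-variable names, endpoint name, and registry model path are placeholders, not values from this article.

```python
import os

try:
    # azure-ai-ml is the Azure Machine Learning SDK for Python referenced above;
    # the import is guarded so the sketch degrades gracefully if it isn't installed.
    from azure.ai.ml import MLClient
    from azure.ai.ml.entities import ServerlessEndpoint
    from azure.identity import DefaultAzureCredential
    HAVE_SDK = True
except ImportError:
    HAVE_SDK = False


def deploy_serverless(model_id: str, endpoint_name: str):
    """Create a serverless API endpoint for a registry model (placeholder workspace values)."""
    if not HAVE_SDK:
        raise RuntimeError("azure-ai-ml is not installed")
    ml_client = MLClient(
        credential=DefaultAzureCredential(),
        subscription_id=os.environ["AZURE_SUBSCRIPTION_ID"],       # placeholder env vars
        resource_group_name=os.environ["AZURE_RESOURCE_GROUP"],
        workspace_name=os.environ["AZUREML_WORKSPACE"],
    )
    endpoint = ServerlessEndpoint(name=endpoint_name, model_id=model_id)
    return ml_client.serverless_endpoints.begin_create_or_update(endpoint).result()


if __name__ == "__main__" and HAVE_SDK and "AZURE_SUBSCRIPTION_ID" in os.environ:
    # Hypothetical registry path; check the model card for the real asset ID.
    deploy_serverless(
        "azureml://registries/azureml-cohere/models/Cohere-command-r/labels/latest",
        "my-cohere-endpoint",
    )
```

If your installed `azure-ai-ml` version doesn't expose `serverless_endpoints`, treat this as pseudocode and use the studio or CLI route instead.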
### The inference package installed
Read more about the Azure AI inference package and reference.

In this section, you use the [Azure AI model inference API](https://aka.ms/azureai/modelinference) with a chat completions model for chat.
> [!TIP]
> The [Azure AI model inference API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure Machine Learning with the same code and structure, including Cohere Command chat models.
### Create a client to consume the model
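
As a minimal sketch (the `azure-ai-inference` package is the one the install step above refers to; the environment-variable names below are placeholders, not names defined by this article):

```python
import os

try:
    # The azure-ai-inference package implements the Azure AI model inference API;
    # the import is guarded so the sketch degrades gracefully if it isn't installed.
    from azure.ai.inference import ChatCompletionsClient
    from azure.ai.inference.models import SystemMessage, UserMessage
    from azure.core.credentials import AzureKeyCredential
    HAVE_SDK = True
except ImportError:
    HAVE_SDK = False


def make_client(endpoint: str, key: str) -> "ChatCompletionsClient":
    """Create a client pointed at a serverless API endpoint (key-based auth)."""
    if not HAVE_SDK:
        raise RuntimeError("azure-ai-inference is not installed")
    return ChatCompletionsClient(endpoint=endpoint, credential=AzureKeyCredential(key))


if __name__ == "__main__" and HAVE_SDK and "AZURE_INFERENCE_ENDPOINT" in os.environ:
    client = make_client(os.environ["AZURE_INFERENCE_ENDPOINT"],
                         os.environ["AZURE_INFERENCE_CREDENTIAL"])
    response = client.complete(messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="How many languages are in the world?"),
    ])
    print(response.choices[0].message.content)
```

The same client works across models deployed with the Azure AI model inference API; only the endpoint URL and key change.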
The following models are available:
## Prerequisites
To use Cohere Command chat models with Azure Machine Learning, you need the following prerequisites:
### A model deployment
**Deployment to serverless APIs**
Cohere Command chat models can be deployed to serverless API endpoints with pay-as-you-go billing. This kind of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance that organizations need.
Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure Machine Learning studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](how-to-deploy-models-serverless.md).
> [!div class="nextstepaction"]
> [Deploy the model to serverless API endpoints](how-to-deploy-models-serverless.md)

In this section, you use the [Azure AI model inference API](https://aka.ms/azureai/modelinference) with a chat completions model for chat.
> [!TIP]
> The [Azure AI model inference API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure Machine Learning with the same code and structure, including Cohere Command chat models.
### Create a client to consume the model
The following models are available:
## Prerequisites
To use Cohere Command chat models with Azure Machine Learning, you need the following prerequisites:
### A model deployment
**Deployment to serverless APIs**
Cohere Command chat models can be deployed to serverless API endpoints with pay-as-you-go billing. This kind of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance that organizations need.
Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure Machine Learning studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](how-to-deploy-models-serverless.md).
> [!div class="nextstepaction"]
> [Deploy the model to serverless API endpoints](how-to-deploy-models-serverless.md)
### The inference package installed
In this section, you use the [Azure AI model inference API](https://aka.ms/azureai/modelinference) with a chat completions model for chat.
> [!TIP]
> The [Azure AI model inference API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure Machine Learning with the same code and structure, including Cohere Command chat models.
### Create a client to consume the model
The following models are available:
## Prerequisites
To use Cohere Command chat models with Azure Machine Learning, you need the following prerequisites:
### A model deployment
**Deployment to serverless APIs**
Cohere Command chat models can be deployed to serverless API endpoints with pay-as-you-go billing. This kind of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance that organizations need.
Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure Machine Learning studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](how-to-deploy-models-serverless.md).
> [!div class="nextstepaction"]
> [Deploy the model to serverless API endpoints](how-to-deploy-models-serverless.md)
### A REST client
Models deployed with the [Azure AI model inference API](https://aka.ms/azureai/modelinference) can be consumed using any REST client.

In this section, you use the [Azure AI model inference API](https://aka.ms/azureai/modelinference) with a chat completions model for chat.
> [!TIP]
> The [Azure AI model inference API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure Machine Learning with the same code and structure, including Cohere Command chat models.
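
A REST call can be sketched with only the Python standard library. This is a hedged sketch: the endpoint URL, key, environment-variable names, and the `Authorization: Bearer` header choice are placeholders and assumptions, not values confirmed by this article; check your endpoint's details for the exact route and auth header.

```python
import json
import os
import urllib.request


def build_chat_request(endpoint_url: str, api_key: str, messages: list) -> urllib.request.Request:
    """Build a POST request to the endpoint's /chat/completions route."""
    body = json.dumps({"messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url=endpoint_url.rstrip("/") + "/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + api_key,  # key-based auth (assumed header form)
        },
        method="POST",
    )


messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "How many languages are in the world?"},
]

# Only call a real endpoint when one is configured via (placeholder) env vars.
if "AZURE_INFERENCE_ENDPOINT" in os.environ:
    req = build_chat_request(os.environ["AZURE_INFERENCE_ENDPOINT"],
                             os.environ["AZURE_INFERENCE_CREDENTIAL"], messages)
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

The response body follows the chat-completions shape used throughout this article, with the generated text under `choices[0].message.content`.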
Cohere Command chat models support the use of tools, which can be a powerful resource when you need to offload specific tasks from the language model and instead rely on a more deterministic system, or even a different language model. The Azure AI model inference API allows you to define tools in the following way.
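
As an illustrative sketch of how a tool rides along in a chat-completions payload (the function name and parameters here are hypothetical, following the function-style tool schema the Azure AI model inference API uses):

```python
import json

# Hypothetical tool: the name and parameters are illustrative only.
flight_tool = {
    "type": "function",
    "function": {
        "name": "get_flight_info",
        "description": "Returns information about the next flight between two cities.",
        "parameters": {
            "type": "object",
            "properties": {
                "origin_city": {
                    "type": "string",
                    "description": "The city the flight departs from.",
                },
                "destination_city": {
                    "type": "string",
                    "description": "The city the flight arrives at.",
                },
            },
            "required": ["origin_city", "destination_city"],
        },
    },
}

# Tool definitions are passed alongside the messages in the request payload.
payload = {
    "messages": [
        {"role": "user", "content": "When is the next flight from Miami to Seattle?"}
    ],
    "tools": [flight_tool],
    "tool_choice": "auto",
}
print(json.dumps(payload, indent=2))
```

When the model decides a tool is needed, the response carries a tool call (name plus JSON arguments) instead of plain text, and your code executes the tool and sends the result back as a follow-up message.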
Such a tool definition can, for example, look up flight information between two different cities.
Quota is managed per deployment. Each deployment has a rate limit of 200,000 tokens per minute and 1,000 API requests per minute. However, we currently limit one deployment per model per project. Contact Microsoft Azure Support if the current rate limits aren't sufficient for your scenarios.
Cohere models deployed as a serverless API are offered by Cohere through the Azure Marketplace and integrated with Azure Machine Learning studio for use. You can find the Azure Marketplace pricing when deploying the model.
Each time a project subscribes to a given offer from the Azure Marketplace, a new resource is created to track the costs associated with its consumption. The same resource is used to track costs associated with inference; however, multiple meters are available to track each scenario independently.
For more information on how to track costs, see [Monitor costs for models offered through the Azure Marketplace](/azure/ai-studio/how-to/costs-plan-manage#monitor-costs-for-models-offered-through-the-azure-marketplace).
## Related content
* [Azure AI Model Inference API](reference-model-inference-api.md)
* [Deploy models as serverless APIs](how-to-deploy-models-serverless.md)
* [Region availability for models in serverless API endpoints](concept-endpoint-serverless-availability.md)
* [Plan and manage costs for Azure AI Studio](concept-plan-manage-cost.md)