articles/machine-learning/how-to-deploy-models-cohere-embed.md (14 additions & 4 deletions)
@@ -53,7 +53,7 @@ The previously mentioned Cohere models can be deployed as a service with pay-as-
- An Azure Machine Learning workspace. If you don't have these, use the steps in the [Quickstart: Create workspace resources](quickstart-create-resources.md) article to create them.
> [!IMPORTANT]
- > Pay-as-you-go model deployment offering is only available in workspaces created in EastUS, EastUS2 or Sweden Central regions.
+ > Pay-as-you-go model deployment offering is only available in workspaces created in EastUS2 or Sweden Central regions.
- Azure role-based access controls (Azure RBAC) are used to grant access to operations. To perform the steps in this article, your user account must be assigned the __Azure AI Developer role__ on the Resource Group.
@@ -64,7 +64,7 @@ The previously mentioned Cohere models can be deployed as a service with pay-as-
To create a deployment:
1. Go to [Azure Machine Learning studio](https://ml.azure.com/home).
- 1. Select the workspace in which you want to deploy your models. To use the pay-as-you-go model deployment offering, your workspace must belong to the EastUS, EastUS2 or Sweden Central region.
+ 1. Select the workspace in which you want to deploy your models. To use the pay-as-you-go model deployment offering, your workspace must belong to the EastUS2 or Sweden Central region.
1. Choose the model you want to deploy from the [model catalog](https://ml.azure.com/model/catalog).
Alternatively, you can initiate deployment by going to your workspace and selecting **Endpoints** > **Serverless endpoints** > **Create**.
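For illustration only (this sketch is not part of the diff), the same serverless, pay-as-you-go deployment can also be scripted with the `azure-ai-ml` Python SDK. The registry path, model ID, endpoint name, and the `scoring_uri` attribute below are assumptions and would need to match the model chosen from the catalog.

```python
# Hypothetical sketch: create a serverless (pay-as-you-go) endpoint with the azure-ai-ml SDK.
# The model ID, names, and scoring_uri attribute are assumptions, not values from the article.
from azure.ai.ml import MLClient
from azure.ai.ml.entities import MarketplaceSubscription, ServerlessEndpoint
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace-in-EastUS2-or-Sweden-Central>",
)

# Assumed model ID for a Cohere Embed model in the Azure ML model catalog.
model_id = "azureml://registries/azureml-cohere/models/Cohere-embed-v3-english"

# Subscribe to the marketplace offer, then create the serverless endpoint.
ml_client.marketplace_subscriptions.begin_create_or_update(
    MarketplaceSubscription(model_id=model_id, name="cohere-embed-v3-english")
).result()

endpoint = ml_client.serverless_endpoints.begin_create_or_update(
    ServerlessEndpoint(name="cohere-embed-v3-english", model_id=model_id)
).result()
print(endpoint.scoring_uri)  # assumed attribute exposing the endpoint URL
```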
@@ -118,15 +118,15 @@ For more information on using the APIs, see the [reference](#embed-api-reference
Content-type: application/json
```
- #### v1/emebeddings request schema
+ #### v1/embeddings request schema
Cohere Embed v3 - English and Embed v3 - Multilingual accept the following parameters for a `v1/embeddings` API call:
| Property | Type | Default | Description |
| --- | --- | --- | --- |
|`input`|`array of strings`|Required |An array of strings for the model to embed. Maximum number of texts per call is 96. We recommend reducing the length of each text to be under 512 tokens for optimal quality. |
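As a minimal sketch of a `v1/embeddings` call (not part of the diff): the endpoint URL, the bearer-style authorization header, and the key below are placeholders and assumptions; only the documented `input` property is sent.

```python
# Minimal sketch of a v1/embeddings request; URL, key, and auth header format are assumptions.
import requests

url = "https://<your-serverless-endpoint>/v1/embeddings"
headers = {
    "Authorization": "Bearer <your-endpoint-key>",  # assumed header format
    "Content-type": "application/json",
}
payload = {
    # Up to 96 strings per call; keep each under ~512 tokens for best quality.
    "input": ["The capital of France is Paris."],
}

response = requests.post(url, headers=headers, json=payload, timeout=30)
response.raise_for_status()
print(response.json())
```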
- #### v1/emebeddings response schema
+ #### v1/embeddings response schema
The response payload is a dictionary with the following fields:
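The field list itself is outside this excerpt. Purely as an illustration, and assuming the payload mirrors Cohere's embed API, a response might look like the following; every field name here is an assumption.

```python
# Illustrative response shape only; field names assume the payload mirrors Cohere's embed API.
example_response = {
    "id": "3f1c5a2e-...",                       # request identifier (assumed)
    "texts": ["the capital of france is paris"],
    "embeddings": [[0.012, -0.034, 0.221]],     # one (truncated) vector per input string
    "meta": {"api_version": {"version": "1"}},  # assumed metadata block
}
```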
Create a local (FAISS) vector index using Cohere embeddings - Langchain|`langchain`, `langchain_cohere`|[cohere_faiss_langchain_embed.ipynb](https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/cohere/cohere_faiss_langchain_embed.ipynb)
+ Use Cohere Command R/R+ to answer questions from data in local (FAISS) vector index - Langchain|`langchain`, `langchain_cohere`|[command_faiss_langchain.ipynb](https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/cohere/command_faiss_langchain.ipynb)
+ Use Cohere Command R/R+ to answer questions from data in AI search vector index - Langchain|`langchain`, `langchain_cohere`|[cohere-aisearch-langchain-rag.ipynb](https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/cohere/cohere-aisearch-langchain-rag.ipynb)
+ Use Cohere Command R/R+ to answer questions from data in AI search vector index - Cohere SDK| `cohere`, `azure_search_documents`|[cohere-aisearch-rag.ipynb](https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/cohere/cohere-aisearch-rag.ipynb)
+ Command R+ tool/function calling using LangChain|`cohere`, `langchain`, `langchain_cohere`|[command_tools-langchain.ipynb](https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/cohere/command_tools-langchain.ipynb)
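As a rough sketch of what the FAISS notebook listed above does (package names taken from the table; the endpoint URL, key, model name, and the `base_url` parameter are assumptions):

```python
# Hedged sketch: build a local FAISS index over Cohere embeddings via LangChain.
# Endpoint URL/key and base_url support are assumptions; requires faiss-cpu installed.
from langchain_cohere import CohereEmbeddings
from langchain_community.vectorstores import FAISS

embeddings = CohereEmbeddings(
    base_url="https://<your-embed-endpoint>",  # serverless endpoint URL (assumption)
    cohere_api_key="<your-endpoint-key>",
    model="embed-english-v3.0",                # assumed model name
)

# Embed a few documents into a local FAISS index, then query it.
index = FAISS.from_texts(
    ["Cohere Embed v3 supports English.", "Embed v3 Multilingual covers many languages."],
    embeddings,
)
print(index.similarity_search("Which model handles multiple languages?", k=1))
```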