The Azure AI model catalog offers a large selection of models, from a wide range of providers. This article lists featured models available in Azure AI Foundry.
You have various options for deploying models from the model catalog. For some models, you need to host them on your infrastructure, as in the case of deployment via managed compute. For others, you can host them on Microsoft's servers, as in the case of deployment via serverless APIs. See [Available models for supported deployment options](../how-to/model-catalog-overview.md#available-models-for-supported-deployment-options) for a list of models in the catalog that are available for deployment via managed compute or serverless API.
For inferencing, some models, such as [Nixtla's TimeGEN-1](#nixtla) and [Cohere rerank](#cohere-rerank), require you to use custom APIs from the model providers. Other models, belonging to the following model types, support inferencing through [Azure AI model inference](../model-inference/overview.md):
| Description | Language | Sample |
|-------------|----------|--------|
| Web requests | Bash | [Command-R](https://aka.ms/samples/cohere-command-r/webrequests) <br> [Command-R+](https://aka.ms/samples/cohere-command-r-plus/webrequests) <br> [cohere-embed.ipynb](https://aka.ms/samples/embed-v3/webrequests) |
| Azure AI Inference package for C# | C# | [Link](https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/ai/Azure.AI.Inference/samples) |
| Azure AI Inference package for JavaScript | JavaScript | [Link](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/ai/ai-inference-rest/samples) |
| Azure AI Inference package for Python | Python | [Link](https://aka.ms/azsdk/azure-ai-inference/python/samples) |
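As a rough illustration of the web-request approach above, the following sketch builds a chat-completions request against a serverless API endpoint using only the Python standard library. The endpoint URL, API key, and request route shown here are placeholders, not values from this article; check your deployment's details and the Azure AI model inference reference for the exact route and headers before use.

```python
import json
import urllib.request

# Placeholders -- substitute your own serverless endpoint and key.
ENDPOINT = "https://<your-deployment>.<region>.models.ai.azure.com"
API_KEY = "<your-api-key>"

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build a chat-completions HTTP request for a serverless API endpoint."""
    payload = {
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
        "max_tokens": 256,
    }
    return urllib.request.Request(
        url=f"{ENDPOINT}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

# To actually call the endpoint (requires a live deployment and valid key):
#   with urllib.request.urlopen(build_chat_request("Hello")) as resp:
#       print(json.load(resp)["choices"][0]["message"]["content"])
```

The Azure AI Inference SDK packages listed in the table wrap this same request/response shape in typed clients, so the raw-HTTP form is mainly useful for quick smoke tests or languages without an SDK.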
#### Retrieval Augmented Generation (RAG) and tool use samples
| Description | Packages | Sample |
|-------------|----------|--------|
| Create a local Facebook AI similarity search (FAISS) vector index, using Cohere embeddings - Langchain | `langchain`, `langchain_cohere` | [cohere_faiss_langchain_embed.ipynb](https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/cohere/cohere_faiss_langchain_embed.ipynb) |
| Use Cohere Command R/R+ to answer questions from data in local FAISS vector index - Langchain | `langchain`, `langchain_cohere` | [command_faiss_langchain.ipynb](https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/cohere/command_faiss_langchain.ipynb) |
| Use Cohere Command R/R+ to answer questions from data in AI search vector index - Langchain | `langchain`, `langchain_cohere` | [cohere-aisearch-langchain-rag.ipynb](https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/cohere/cohere-aisearch-langchain-rag.ipynb) |
| Use Cohere Command R/R+ to answer questions from data in AI search vector index - Cohere SDK | `cohere`, `azure_search_documents` | [cohere-aisearch-rag.ipynb](https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/cohere/cohere-aisearch-rag.ipynb) |
| Command R+ tool/function calling, using LangChain | `cohere`, `langchain`, `langchain_cohere` | [command_tools-langchain.ipynb](https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/cohere/command_tools-langchain.ipynb) |
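The retrieval step that the FAISS samples above implement at scale can be sketched in a few lines. This toy version uses a bag-of-characters "embedding" as a stand-in for a real embedding model such as Cohere embed, and a brute-force cosine-similarity search as a stand-in for a FAISS index; the function names and the embedding are illustrative, not part of any of the linked notebooks.

```python
import numpy as np

def toy_embed(texts):
    """Toy stand-in for a real embedding model (e.g. Cohere embed):
    L2-normalized bag-of-characters vectors over a-z."""
    vecs = np.zeros((len(texts), 26))
    for i, text in enumerate(texts):
        for ch in text.lower():
            if "a" <= ch <= "z":
                vecs[i, ord(ch) - ord("a")] += 1
    norms = np.linalg.norm(vecs, axis=1, keepdims=True)
    return vecs / np.clip(norms, 1e-9, None)

def retrieve(query, docs, k=1):
    """Return the k documents most similar to the query by cosine
    similarity -- the same operation a FAISS index performs at scale."""
    doc_vecs = toy_embed(docs)
    query_vec = toy_embed([query])[0]
    scores = doc_vecs @ query_vec  # cosine similarity (vectors are unit-norm)
    top = np.argsort(-scores)[:k]
    return [docs[i] for i in top]
```

In the linked samples, the retrieved documents are then passed to Command R/R+ as grounding context for the answer; swapping in real embeddings and a FAISS index changes only the two stand-in pieces here.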