The Azure AI Agent Service also supports the following models from the Azure AI Foundry model catalog.
* Meta-Llama-405B-Instruct
To use these models, you can use the [Azure AI Foundry portal](https://ai.azure.com/) to create a deployment, and then reference the deployment name in your agent. For example:
```python
agent = project_client.agents.create_agent(
    model="llama-3",
    name="my-agent",
    instructions="You are a helpful agent",
)
```
## Azure AI Foundry Models
### Models with Tool-Calling
To best support agentic scenarios, we recommend using models that support tool-calling. The Azure AI Agent Service currently supports all agent-compatible models from the Azure AI Foundry model catalog.
To use these models, use the [Azure AI Foundry portal](https://ai.azure.com/) to make a model deployment, then reference the deployment name in your agent. For example:
```python
agent = project_client.agents.create_agent(
    model="llama-3",
    name="my-agent",
    instructions="You are a helpful agent",
)
```
> [!NOTE]
> This option should only be used for open-source models (for example, Cepstral, Mistral, Llama) and not for OpenAI models, which are natively supported in the service. It should also only be used for models that support tool-calling.
### Models without Tool-Calling
Though tool-calling support is a core capability for agentic scenarios, the API and SDK now also support models that don't have tool-calling. This option can be helpful when you have specific use cases that don't require tool-calling.
The following steps let you use any chat completion model that is available through a [serverless API](/ai-foundry/how-to/model-catalog-overview):
1. Deploy your desired model through the serverless API. The model appears on your **Models + Endpoints** page.
1. Select the model name to see the model details, where you'll find your model's target URI and key.
1. Create a new serverless connection on the **Connected Resources** page, using the target URI and key.
The model can now be referenced in your code (`Target URI` + `@` + `Model Name`).
### Models from External Model Providers

The Azure AI Agent Service also allows you to connect to any chat completion model endpoint that you provide from an external model provider (for example, Perplexity). This option can be used when the desired model isn't in the AI Foundry model catalog.
The following steps let you use any chat completion model that you have access to:
1. Create a connection to your model provider's chat completion API. On the **Connected Resources** page, select **Serverless Connection**, then enter your model's target URI and key (from your model provider's site).
The model can now be referenced in your code (`Target URI` + `@` + `Model Name`), for example:
`Model=https://api.perplexity.ai@sonar`
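The reference format is plain string concatenation of the connection's target URI and the model name, joined by `@`. A minimal sketch (the `model_reference` helper is illustrative, not part of any Azure SDK):

```python
def model_reference(target_uri: str, model_name: str) -> str:
    """Compose the model string expected by the agent: '<target URI>@<model name>'.

    Illustrative helper only; not an Azure SDK function.
    """
    return f"{target_uri}@{model_name}"


# Reproduces the Perplexity example from this article:
print(model_reference("https://api.perplexity.ai", "sonar"))
# → https://api.perplexity.ai@sonar
```

The resulting string is what you pass as the `model` value when creating your agent.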
> [!NOTE]
> Using a model that isn't natively integrated with the Agent Service may result in limited functionality, depending on the model you bring. For example, some models (such as the Llama Instruct models) provide all agentic features, while others provide only basic chat interaction. For the full agentic experience, including tool calling and the use of knowledge sources, we recommend using an agent-supporting model from the Foundry model catalog.