articles/ai-services/agents/concepts/model-region-support.md
Azure OpenAI provides customers with choices on the hosting structure that fits their business and usage patterns.

- **Standard** is offered with a global deployment option, routing traffic globally to provide higher throughput.
- **Provisioned** is also offered with a global deployment option, allowing customers to purchase and deploy provisioned throughput units across Azure global infrastructure.

All deployments can perform the exact same inference operations; however, the billing, scale, and performance are substantially different. To learn more about Azure OpenAI deployment types, see the [deployment types guide](../../openai/how-to/deployment-types.md).

Azure AI Agent Service supports the following Azure OpenAI models in the listed regions.
To use these models, you can use the [Azure AI Foundry portal](https://ai.azure.com/). For example:

```python
agent = project_client.agents.create_agent(
    model="llama-3",
    name="my-agent",
    instructions="You are a helpful agent",
)
```

## Azure AI Foundry Models

### Models with tool-calling

To best support agentic scenarios, we recommend using models that support tool-calling. The Azure AI Agent Service currently supports all agent-compatible models from the Azure AI Foundry model catalog.
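To illustrate why tool-calling matters for agentic scenarios, here is a minimal, framework-free sketch of the loop an agent runtime performs on your behalf. Every name below (`fake_model`, `run_agent`, the message shapes) is a hypothetical stand-in for illustration, not part of the Azure AI Agent Service API:

```python
# Sketch of an agent tool-calling loop. All names here are hypothetical
# stand-ins; the real service runs this loop for you.

def get_weather(city: str) -> str:
    """A tool the model may ask the agent runtime to call."""
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

def fake_model(messages):
    """Stand-in for a tool-calling model: requests a tool once, then answers."""
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_call": {"name": "get_weather", "args": {"city": "Seattle"}}}
    tool_result = [m for m in messages if m["role"] == "tool"][-1]["content"]
    return {"content": f"Weather report: {tool_result}"}

def run_agent(user_input):
    messages = [{"role": "user", "content": user_input}]
    while True:
        reply = fake_model(messages)
        if "tool_call" in reply:
            call = reply["tool_call"]
            # Execute the requested tool and feed the result back to the model.
            result = TOOLS[call["name"]](**call["args"])
            messages.append({"role": "tool", "content": result})
        else:
            return reply["content"]

print(run_agent("What's the weather in Seattle?"))
# Prints: Weather report: Sunny in Seattle
```

A model without tool-calling support never emits the `tool_call` branch, which is why tool-capable models are recommended for agents.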
To use these models, use the [Azure AI Foundry portal](https://ai.azure.com/).

> [!NOTE]
> This option should only be used for open-source models (for example, Cohere, Mistral, Llama) and not for OpenAI models, which are natively supported in the service. This option should also only be used for models that support tool-calling.

### Models without tool-calling

Though tool-calling support is a core capability for agentic scenarios, we now provide the ability to use models that don't support tool-calling in our API and SDK. This option can be helpful when you have specific use cases that don't require tool-calling.

The following steps will allow you to utilize any chat-completion model:

1. Create a new Serverless connection on the **Connected Resources** page, using the target URI and key.
The model can now be referenced in your code (`Target URI` + `@` + `Model Name`), for example:
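As a rough illustration of the reference format above, the string can be assembled like this. The URI and model name below are hypothetical placeholders, not real endpoints:

```python
# Sketch: building the model reference string (`Target URI` + `@` + `Model Name`).
# Both values are hypothetical placeholders for illustration only.
target_uri = "https://my-serverless-endpoint.eastus2.inference.ai.azure.com"
model_name = "mistral-large"

model_reference = f"{target_uri}@{model_name}"
print(model_reference)
# Prints: https://my-serverless-endpoint.eastus2.inference.ai.azure.com@mistral-large
```

This combined string is what you would pass as the `model` value when creating the agent.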