Commit c31c8fb

Merge pull request #6017 from aahill/july-fixes
removing sections
2 parents 90b364e + 8d70bf1

File tree

1 file changed (+1, −42 lines)


articles/ai-foundry/agents/concepts/model-region-support.md

Lines changed: 1 addition & 42 deletions
@@ -7,7 +7,7 @@ author: aahill
 ms.author: aahi
 ms.service: azure-ai-agent-service
 ms.topic: conceptual
-ms.date: 07/10/2025
+ms.date: 07/14/2025
 ms.custom: azure-ai-agents, references_regions
 ---

@@ -46,47 +46,6 @@ Azure AI Foundry Agent Service supports the following Azure OpenAI models in the
 | westus | X | X | X | X | X | | X | | X | |
 | westus3 | | X | X | X | X | | X | | | |
 
-## Non-Microsoft models
-
-The Azure AI Foundry Agent Service also supports the following models from the Azure AI Foundry model catalog.
-
-* Meta-Llama-405B-Instruct
-
-To use these models, you can use [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) to make a deployment, and then reference the deployment name in your agent. For example:
-
-```python
-agent = project_client.agents.create_agent( model="llama-3", name="my-agent", instructions="You are a helpful agent" )
-```
-## Azure AI Foundry models
-
-### Models with tool-calling
-
-To best support agentic scenarios, we recommend using models that support tool-calling. The Azure AI Foundry Agent Service currently supports all agent-compatible models from the Azure AI Foundry model catalog.
-
-To use these models, use the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) to make a model deployment, then reference the deployment name in your agent. For example:
-
-`agent = project_client.agents.create_agent( model="llama-3", name="my-agent", instructions="You are a helpful agent")`
-
-> [!NOTE]
-> This option should only be used for open-source models (for example, Cepstral, Mistral, Llama) and not for OpenAI models, which are natively supported in the service. This option should also only be used for models that support tool-calling.
-
-### Models without tool-calling
-
-Though tool-calling support is a core capability for agentic scenarios, we now provide the ability to use models that don’t support tool-calling in our API and SDK. This option can be helpful when you have specific use-cases that don’t require tool-calling.
-
-The following steps will allow you to utilize any chat-completion model that is available through a [serverless API](/azure/ai-foundry/how-to/model-catalog-overview):
-
-
-
-1. Deploy your desired model through serverless API. Model will show up on your **Models + Endpoints** page.
-
-1. Click on model name to see model details, where you'll find your model's target URI and key.
-
-1. Create a new Serverless connection on **Connected Resources** page, using the target URI and key.
-
-The model can now be referenced in your code (`Target URI` + `@` + `Model Name`), for example:
-
-`Model=https://Phi-4-mejco.eastus.models.ai.azure.com/@Phi-4-mejco`
 
 ## Next steps
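For context, the "Models without tool-calling" section removed by this commit described referencing a serverless model as `Target URI` + `@` + `Model Name`. A minimal sketch of that string assembly (the helper name is mine, and the assumption that `@` follows the URI's trailing slash is inferred from the Phi-4-mejco example quoted in the diff, not stated in the docs):

```python
def serverless_model_reference(target_uri: str, model_name: str) -> str:
    """Build the model string described in the removed section:
    the serverless Target URI, an "@" separator, then the model name.

    Assumes the separator is written as "/@" after the host, matching the
    Phi-4-mejco example quoted in the diff (a hypothetical deployment).
    """
    return f"{target_uri.rstrip('/')}/@{model_name}"

# Reproduces the example quoted in the removed section.
ref = serverless_model_reference(
    "https://Phi-4-mejco.eastus.models.ai.azure.com", "Phi-4-mejco"
)
print(ref)  # https://Phi-4-mejco.eastus.models.ai.azure.com/@Phi-4-mejco
```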
