---
title: OpenAI agent
description: Learn how to configure the OpenAI agent.
ms.date: 09/09/2025
ms.topic: how-to
---
# OpenAI agent
Azure OpenAI uses the following hierarchy of credentials for authentication:
- `InteractiveBrowserCredential`

For more information about these credentials, see .NET documentation for
[`DefaultAzureCredential`][09].

## Support for other OpenAI-compatible models

The OpenAI agent supports third-party AI services that implement the OpenAI API specifications. Some
of these services are open source tools for running SLMs and LLMs locally. The OpenAI agent supports
the following third-party models:

- [**Ollama**][08]
- [**LM Studio**][06]
- [**Deepseek**][04]
- [**LocalAI**][07]
- [**Google Gemini**][03]
- [**Grok**][05]
- [**Foundry Local**][02]

Foundry Local is an on-device AI inference solution from Microsoft, currently in public preview. AI
Shell interfaces with it using the OpenAI agent. You must install and configure Foundry Local on
your machine before you can use it with AI Shell. For more information, see
[Get started with Foundry Local][01].

The OpenAI agent supports the following model names:

- `o1`
- `o3`
- `o4-mini`
- `gpt-5`
- `gpt-4.1`
- `gpt-4o`
- `gpt-4`
- `gpt-4-32k`
- `gpt-4-turbo`
- `gpt-3.5-turbo`
- `gpt-35-turbo` - Azure OpenAI name of the model
- Any of the model IDs supported by Foundry Local

For more information about endpoints and model names, see the third-party documentation for the AI
service you want to use.
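
The configuration pattern is the same for any of these services: point `Endpoint` at the service's
OpenAI-compatible URL and set `ModelName` to a model the service exposes. As a hypothetical sketch,
an `openai.agent.json` entry for a local Ollama instance might look like the following. The
endpoint, model name, and key shown here are assumptions based on Ollama's default local endpoint;
check the Ollama documentation and your local setup for the actual values.

```json
{
  "GPTs": [
    {
      "Name": "ollama-llama3",
      "Description": "A GPT instance using a local Ollama model.",
      "Endpoint": "http://localhost:11434/v1",
      "ModelName": "llama3",
      "Key": "ollama"
    }
  ],

  "Active": "ollama-llama3"
}
```

Ollama ignores the API key value, but the OpenAI agent requires a `Key` entry, so any placeholder
string works here.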

### Configure a Foundry Local endpoint

After you have Foundry Local installed, run the following commands to get the information you need
to configure the OpenAI agent:

```powershell
PS> foundry service start
🟢 Service is already running on http://127.0.0.1:56952/.

PS> foundry model load phi-3.5-mini
🕔 Loading model...
🟢 Model phi-3.5-mini loaded successfully

PS> foundry service ps
Models running in service:
    Alias                 Model ID
🟢  phi-3.5-mini          Phi-3.5-mini-instruct-generic-cpu
```

This example starts the Foundry Local service, loads the `phi-3.5-mini` model, and lists the models
running in the service.

Next, add a new GPT entry to your `openai.agent.json` file.

- The `foundry service start` command shows the URI for the service. The `Endpoint` for the OpenAI
  agent is that URI plus `/v1`.
- Use the value from the **Model ID** column of the `foundry service ps` output as the `ModelName`.
  Make sure you use the exact casing shown in **Model ID**. Foundry Local is case-sensitive.
- The API key is hardcoded to `OPENAI_API_KEY`.

```json
{
  "GPTs": [
    {
      "Name": "foundry-local",
      "Description": "A GPT instance using Foundry Local.",
      "Endpoint": "http://127.0.0.1:56952/v1",
      "ModelName": "Phi-3.5-mini-instruct-generic-cpu",
      "Key": "OPENAI_API_KEY"
    }
  ],

  "Active": "foundry-local"
}
```

<!-- link references -->
[01]: /azure/ai-foundry/foundry-local/get-started
[02]: /azure/ai-foundry/foundry-local/what-is-foundry-local
[03]: https://ai.google.dev/gemini-api/docs/openai
[04]: https://api-docs.deepseek.com/
[05]: https://docs.x.ai/docs/overview#migrating-from-another-llm-provider
[06]: https://lmstudio.ai/docs/api/openai-api
[07]: https://localai.io/
[08]: https://ollama.com/blog/openai-compatibility
[09]: xref:Azure.Identity.DefaultAzureCredential