# customer intent: I want to learn how to use the Azure AI Foundry SDK to build AI applications on Azure.
---
# The Azure AI Foundry SDK
The Azure AI Foundry SDK is a comprehensive toolchain designed to simplify the development of AI applications on Azure. It enables developers to:

- Access popular models from various model providers through a single interface
- Easily combine models, data, and AI services to build AI-powered applications
- Evaluate, debug, and improve application quality & safety across development, testing, and production environments

The AI Foundry SDK is a set of packages and services designed to work together. You can use the Azure AI Projects client library to easily use multiple services through a single project client and connection string. You can also use services and SDKs on their own and connect directly to your services.
If you want to jump right in and start building an app, check out:

- [Create a chat app](../../quickstarts/get-started-code.md)
- [Create a custom RAG app](../../tutorials/copilot-sdk-create-resources.md)

## Get started with Projects

First, follow the steps to [create an AI Project](../create-projects.md) if you don't already have one.

Sign in with the Azure CLI using the same account that you use to access your AI Project:
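
```bash
az login
```

Then install the projects client library (a minimal sketch assuming the preview `azure-ai-projects` package and `azure-identity` for authentication):

```bash
pip install azure-ai-projects azure-identity
```

Create a project client in code, pasting in the connection string from your project's **Overview** page:

```python
from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient

# the connection string is shown on the project's Overview page
project = AIProjectClient.from_connection_string(
    conn_str="<your-project-connection-string>",
    credential=DefaultAzureCredential(),
)
```
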
Not yet available in C#.
::: zone-end

Copy the **Project connection string** from the **Overview** page of the project and update the connection string value above.

Once you have created the project client, you can use the client for the capabilities in the following sections.
::: zone pivot="programming-language-python"
Be sure to check out the [reference](https://aka.ms/aifoundrysdk/reference) and [samples](https://aka.ms/azsdk/azure-ai-projects/python/samples).
::: zone-end
::: zone pivot="programming-language-csharp"
Be sure to check out the [reference](https://aka.ms/aifoundrysdk/reference) and [samples](https://aka.ms/azsdk/azure-ai-projects/csharp/samples).
::: zone-end
## Azure OpenAI Service

The [Azure OpenAI Service](../../../ai-services/openai/overview.md) provides access to OpenAI's models, including the GPT-4o, GPT-4o mini, GPT-4, GPT-4 Turbo with Vision, DALL-E 3, Whisper, and Embeddings model series, with the data residency, scalability, safety, security, and enterprise capabilities of Azure.

If you have code that uses the OpenAI SDK, you can easily target your code to use the Azure OpenAI service. First, install the OpenAI SDK:
::: zone pivot="programming-language-python"
```bash
pip install openai
```
If you have existing code that uses the OpenAI SDK, you can use the project client to create an `AzureOpenAI` client that uses your project's Azure OpenAI connection:
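
A minimal sketch (the API version and the `gpt-4o-mini` deployment name below are example values; use ones that match your project):

```python
# get an AzureOpenAI client authenticated with your project's Azure OpenAI connection
openai = project.inference.get_azure_openai_client(api_version="2024-06-01")

response = openai.chat.completions.create(
    model="gpt-4o-mini",  # your Azure OpenAI deployment name
    messages=[{"role": "user", "content": "Write me a poem about flowers"}],
)

print(response.choices[0].message.content)
```
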
If you’re already using the [Azure OpenAI SDK](../../../ai-services/openai/chatgpt-quickstart.md) directly against the Azure OpenAI Service, the project provides a convenient way to use Azure OpenAI Service capabilities alongside the rest of the AI Foundry capabilities.
::: zone-end
::: zone pivot="programming-language-csharp"

```dotnetcli
dotnet add package Azure.AI.OpenAI
```
Add using statements:
```csharp
using OpenAI.Chat;
using Azure.AI.OpenAI;
```
If you have existing code that uses the OpenAI SDK, you can use the project client to create an `AzureOpenAI` client that uses your project's Azure OpenAI connection.
If you’re already using the [Azure OpenAI SDK](../../../ai-services/openai/chatgpt-quickstart.md) directly against the Azure OpenAI Service, the project provides a convenient way to use Azure OpenAI Service capabilities alongside the rest of the AI Foundry capabilities.
## Azure AI model inference service
The [Azure AI model inference service](/azure/ai-studio/ai-services/model-inference) offers access to powerful models from leading providers like OpenAI, Microsoft, Meta, and more. These models support tasks such as content generation, summarization, and code generation.
To use the model inference service, first ensure that your project has an AI Services connection (in the management center).

Install the `azure-ai-inference` client library:

::: zone pivot="programming-language-python"
```bash
pip install azure-ai-inference
```
You can use the project client to get a configured and authenticated `ChatCompletionsClient` or `EmbeddingsClient`:
```python
# get a chat inferencing client using the project's default model inferencing endpoint
chat = project.inference.get_chat_completions_client()

# the model name below is an example deployment; use a model deployed in your project
response = chat.complete(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Write me a poem about flowers"}],
)

print(response.choices[0].message.content)
```
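
The same project client can also hand back an embeddings client (a minimal sketch; the model name is an example deployment):

```python
# get an embeddings client using the project's default model inferencing endpoint
embeddings = project.inference.get_embeddings_client()

response = embeddings.embed(model="text-embedding-ada-002", input=["flowers", "roses"])

for item in response.data:
    print(f"vector length: {len(item.embedding)}")
```
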
::: zone-end
::: zone pivot="programming-language-csharp"

```dotnetcli
dotnet add package Azure.AI.Inference
```
Add using statements:
```csharp
using Azure.AI.Inference;
```
You can use the project client to get a configured and authenticated `ChatCompletionsClient` or `EmbeddingsClient`.
You can change the model name to any model that you deployed to the inference service or Azure OpenAI service.
To learn more about using the Azure AI inferencing client, check out the [Azure AI model inferencing reference](/azure/ai-studio/reference/reference-model-inference-api).
::: zone pivot="programming-language-python"
## Prompt Templates

The inferencing client supports creating prompt messages from templates. The template allows you to dynamically generate prompts using inputs that are available at runtime.

To use prompt templates, install the `azure-ai-inference` package:

```bash
pip install azure-ai-inference
```
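
For example, here's a minimal sketch of rendering chat messages from an inline template (this assumes the `PromptTemplate` helper in `azure.ai.inference.prompts`; the template text and variable names are illustrative):

```python
from azure.ai.inference.prompts import PromptTemplate

# create a prompt template from an inline string (mustache-style {{variables}})
prompt_template = PromptTemplate.from_string(prompt_template="""
    system:
    You are a helpful writing assistant.
    The user's first name is {{first_name}} and their last name is {{last_name}}.

    user:
    Write me a poem about flowers
    """)

# generate messages from the template, passing in values available at runtime
messages = prompt_template.create_messages(first_name="Jane", last_name="Doe")
```
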

```python
# pass the rendered messages to the chat completions client
# (the model name below is an example deployment in your project)
response = chat.complete(
    messages=messages,
    model="gpt-4o-mini",
)
```

::: zone-end
## Azure AI Search
If you have an Azure AI Search resource connected to your project, you can also use the project client to create an Azure AI Search client using the project connection.
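
For example, a minimal sketch, assuming the project has a default Azure AI Search connection and the `azure-search-documents` package is installed (the index name is a placeholder, and the connection property names reflect the preview `azure-ai-projects` models; check the reference if your version differs):

```python
from azure.core.credentials import AzureKeyCredential
from azure.ai.projects.models import ConnectionType
from azure.search.documents import SearchClient

# get the project's default Azure AI Search connection, including its credentials
search_connection = project.connections.get_default(
    connection_type=ConnectionType.AZURE_AI_SEARCH,
    include_credentials=True,
)

# create a client for running queries against an existing index
search_client = SearchClient(
    endpoint=search_connection.endpoint_url,
    index_name="<your-index-name>",
    credential=AzureKeyCredential(search_connection.key),
)
```
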
To learn more about using Azure AI Search, check out [Azure AI Search documentation](/azure/search/).
## Azure AI Agent Service

Azure AI Agent Service is a fully managed service designed to empower developers to securely build, deploy, and scale high-quality, extensible AI agents. Using an extensive ecosystem of models, tools, and capabilities from OpenAI, Microsoft, and third-party providers, Azure AI Agent Service enables building agents for a wide range of generative AI use cases.

To get access to agents, [sign up for the preview](https://nam.dcv.ms/nzy5CEG6Br).

## Evaluation
::: zone pivot="programming-language-python"

You can use the project client to easily connect to the Azure AI evaluation service and the models needed to run your evaluators.

```bash
pip install azure-ai-evaluation
```
Using the `project.scope` parameter, we can instantiate a `ViolenceEvaluator`:

```python
from azure.ai.evaluation import ViolenceEvaluator
from azure.identity import DefaultAzureCredential

# instantiate the evaluator with the project scope and a credential
violence_eval = ViolenceEvaluator(azure_ai_project=project.scope, credential=DefaultAzureCredential())

violence_score = violence_eval(query="what's the capital of france", response="Paris")
print(violence_score)
```

NOTE: To run violence evaluators, your project needs to be in East US 2, Sweden Central, US North Central, or France Central.

To learn more, check out [Evaluation using the SDK](evaluate-sdk.md).
::: zone-end
::: zone pivot="programming-language-csharp"
An Azure AI evaluation package is not yet available for C#. For a sample on how to use Prompty and Semantic Kernel for evaluation, see the [contoso-chat-csharp-prompty](https://github.com/Azure-Samples/contoso-chat-csharp-prompty/blob/main/src/ContosoChatAPI/ContosoChat.Evaluation.Tests/Evalutate.cs) sample.
::: zone-end
## Tracing
::: zone pivot="programming-language-python"
To enable tracing, first ensure your project has an attached Application Insights resource. Go to the **Tracing** page of your project and follow instructions to create or attach Application Insights.
Install the Azure Monitor OpenTelemetry package:

```bash
pip install azure-monitor-opentelemetry
```
Use the following code to enable instrumentation of the Azure AI Inference SDK and logging to your AI project:
```python
from azure.monitor.opentelemetry import configure_azure_monitor

# Enable instrumentation of AI packages (inference, agents, openai, langchain)
project.telemetry.enable()

# Log traces to the project's Application Insights resource, if one is attached
application_insights_connection_string = project.telemetry.get_connection_string()
if application_insights_connection_string:
    configure_azure_monitor(connection_string=application_insights_connection_string)
```

::: zone-end

::: zone pivot="programming-language-csharp"

Tracing is not yet integrated into the projects package. For instructions on how to instrument and log traces from the Azure AI Inferencing package, see [azure-sdk-for-dotnet](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/ai/Azure.AI.Inference/samples/Sample8_ChatCompletionsWithOpenTelemetry.md).
::: zone-end
## Related content
Below are some helpful links to other services and frameworks that you can use with the Azure AI Foundry SDK.