Microsoft offers a variety of packages that you can use for building generative AI applications in the cloud. In most applications, you need to use a combination of packages to manage and use various Azure services that provide AI functionality. We also offer integrations with open-source libraries like LangChain and MLflow for use with Azure. In this article we'll give an overview of the main services and SDKs you can use with Azure AI Studio.
The Azure AI Foundry SDK is a comprehensive toolchain designed to simplify the development of AI applications on Azure. It enables developers to:
- Access popular models from various model providers through a single interface
- Easily combine models, data, and AI services to build AI-powered applications
- Evaluate, debug, and improve application quality & safety across development, testing, and production environments
The AI Foundry SDK is a set of packages and services designed to work together. You can use the Azure AI Projects client library to easily use multiple services through a single project client and connection string. You can also use services and SDKs on their own and connect directly to your services.
If you want to jump right in and start building an app, check out:
- [Create a chat app](../../quickstarts/get-started-code.md)
- [Create a custom RAG app](../../tutorials/copilot-sdk-create-resources.md)
## Get started with Projects
The best way to get started using the Azure AI Foundry SDK is by using a project. AI projects connect together different data, assets, and services you need to build AI applications. The AI project client allows you to easily access these project components from your code by using a single connection string.
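
A minimal sketch of creating the project client follows; the `azure-ai-projects` and `azure-identity` package names and the `AIProjectClient.from_connection_string` factory are assumptions based on the linked samples, so check them against the current SDK.

```
pip install azure-ai-projects azure-identity
```

```Python
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

# your project's connection string (placeholder value shown here)
project_connection_string = "<your-project-connection-string>"

# create the project client used throughout the rest of this article
project = AIProjectClient.from_connection_string(
    conn_str=project_connection_string,
    credential=DefaultAzureCredential())
```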
Copy the **Project connection string** from the **Overview** page of the project and update the `project_connection_string` variable above.
Once you have created the project client, you can use the client for the capabilities in the following sections.
Be sure to check out the [reference]() and [samples](https://aka.ms/azsdk/azure-ai-projects/python/samples).
## Azure OpenAI Service
If you have existing code that uses the OpenAI SDK, you can use the project client to create an `AzureOpenAI` client that uses your project's Azure OpenAI connection.
```
pip install openai
```
Now use the project client to return an `AzureOpenAI` client with the desired API version and make a chat completions call.
{"role": "system", "content": "You are a helpful writing assistant"},
99
+
{"role": "user", "content": "Write me a poem about flowers"},
100
+
]
101
+
)
102
+
103
+
print(response.choices[0].message.content)
104
+
```
For more on using the Azure OpenAI client library, including how to use it directly with the Azure OpenAI Service, check out [Azure OpenAI chat quickstart](../../../ai-services/openai/chatgpt-quickstart.md).
## Azure AI model inference service
The [Azure AI model inference service](/azure/ai-studio/ai-services/model-inference) offers access to powerful models from leading providers like OpenAI, Microsoft, Meta, and more. These models support tasks such as content generation, summarization, and code generation.
To use the model inference service, first ensure that your project has an AI Services connection (in the management center).
Install the `azure-ai-inference` client library:
```
pip install azure-ai-inference
```
You can use the project client to get a configured and authenticated `ChatCompletionsClient` or `EmbeddingsClient`:
```Python
# get a chat inferencing client using the project's default model inferencing endpoint
chat = project.inference.get_chat_completions_client()

# run a chat completion using the inferencing client
response = chat.complete(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You are a helpful writing assistant"},
        {"role": "user", "content": "Write me a poem about flowers"},
    ],
)

print(response.choices[0].message.content)
```
You can change the model name to any model that you deployed to the inference service or Azure OpenAI service.
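
The same project client can also hand back an `EmbeddingsClient`. The following is a rough sketch; the `get_embeddings_client` call and the `text-embedding-3-small` deployment name are assumptions, so substitute the embeddings model you actually deployed.

```Python
# get an embeddings client using the project's default model inferencing endpoint
embeddings = project.inference.get_embeddings_client()

# generate embeddings for a small batch of inputs
embeddings_response = embeddings.embed(
    model="text-embedding-3-small",
    input=["Roses are red", "Violets are blue"],
)

for item in embeddings_response.data:
    print(f"embedding length: {len(item.embedding)}")
```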
To learn more about using the Azure AI inferencing client, check out the [Azure AI model inferencing reference](/azure/ai-studio/reference/reference-model-inference-api).
## Prompt Templates
The inferencing client supports creating prompt messages from templates. Templates allow you to dynamically generate prompts using inputs that are available at runtime.
To use prompt templates, install the `azure-ai-inference` package:
```
pip install azure-ai-inference
```
You can render a prompt template from an inline string:
```Python
from azure.ai.inference.prompts import PromptTemplate

# create a prompt template from an inline string (using mustache syntax)
prompt_template = PromptTemplate.from_string(prompt_template="""
    system:
    You are a helpful writing assistant.
    The user's first name is {{first_name}} and their last name is {{last_name}}.

    user:
    Write me a poem about flowers
    """)

# generate the messages from the template, passing in the context as variables
messages = prompt_template.create_messages(first_name="Jane", last_name="Doe")
```

This code outputs messages that you can then pass to a chat completions call:
```text
[
{'role': 'system', 'content': "You are a helpful writing assistant.\nThe user's first name is Jane and their last name is Doe."}
{'role': 'user', 'content': 'Write me a poem about flowers'}
]
```
NOTE: leading whitespace is automatically trimmed from input strings.
You can also load prompts from a [`Prompty`](https://prompty.ai) file, enabling you to also load the model name and parameters from the `.prompty` file:
```Python
from azure.ai.inference.prompts import PromptTemplate

# load a prompt template (and its model configuration) from a .prompty file
prompt_template = PromptTemplate.from_prompty("myprompt.prompty")
messages = prompt_template.create_messages(first_name="Jane", last_name="Doe")
```

## Azure AI Search

If you have an Azure AI Search resource connected to your project, you can also use the project client to create an Azure AI Search client using the project connection.
Install the Azure AI Search client library:
```
pip install azure-search-documents
```
Instantiate the search and/or search index client as desired:
```Python
from azure.core.credentials import AzureKeyCredential
from azure.ai.projects.models import ConnectionType
from azure.search.documents import SearchClient
from azure.search.documents.indexes import SearchIndexClient

# use the project client to get the default search connection (including its credentials)
search_connection = project.connections.get_default(
    connection_type=ConnectionType.AZURE_AI_SEARCH,
    include_credentials=True)

# create a client to create and manage search indexes
index_client = SearchIndexClient(
    endpoint=search_connection.endpoint_url,
    credential=AzureKeyCredential(key=search_connection.key))

# create a client to run search queries against an existing index
search_client = SearchClient(
    index_name="<your-index-name>",
    endpoint=search_connection.endpoint_url,
    credential=AzureKeyCredential(key=search_connection.key))
```

To learn more about using Azure AI Search, check out [Azure AI Search documentation](/azure/search/).
## Azure AI agents runtime
Azure AI Agent Service is a fully managed service designed to empower developers to securely build, deploy, and scale high-quality, extensible AI agents. Using an extensive ecosystem of models, tools, and capabilities from OpenAI, Microsoft, and third-party providers, Azure AI Agent Service enables building agents for a wide range of generative AI use cases.
To get access to agents, [sign up for the private preview]().
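
Once you have access, agent operations are expected to hang off the project client like the other services shown in this article. The following is purely an illustrative sketch: the `project.agents` methods, parameter names, and the `gpt-4o` deployment are assumptions, not an API surface documented here.

```Python
# illustrative sketch only: the agents operations below are assumed, not documented here
agent = project.agents.create_agent(
    model="gpt-4o",
    name="my-assistant",
    instructions="You are a helpful assistant")

print(f"Created agent, ID: {agent.id}")

# clean up the agent when you no longer need it
project.agents.delete_agent(agent.id)
```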
## Evaluation
You can use the project client to easily connect to the Azure AI evaluation service and the models needed to run your evaluators.
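
The evaluators ship in the Azure AI evaluation client library, so install it first (the package name here is an assumption, inferred from the `azure.ai.evaluation` import below):

```
pip install azure-ai-evaluation
```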
Using the `project.scope` parameter, we can easily instantiate a `ViolenceEvaluator`:
```Python
from azure.ai.evaluation import ViolenceEvaluator
from azure.identity import DefaultAzureCredential

# Initializing Violence Evaluator with project information
violence_eval = ViolenceEvaluator(
    azure_ai_project=project.scope,
    credential=DefaultAzureCredential())

# Running Violence Evaluator on single input row
violence_score = violence_eval(query="what's the capital of france", response="Paris")
print(violence_score)
```
To learn more, check out [Evaluation using the SDK](evaluate-sdk.md).
## Tracing
To enable tracing, first ensure your project has an attached Application Insights resource. Go to the **Tracing** page of your project and follow the instructions to create or attach Application Insights.
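
Sending traces to Application Insights also needs the Azure Monitor OpenTelemetry package; the package name below is an assumption (it isn't stated in this section), so confirm it against the tracing documentation:

```
pip install azure-monitor-opentelemetry
```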
```Python
from azure.monitor.opentelemetry import configure_azure_monitor

# Enable instrumentation of AI packages (inference, agents, openai, langchain)
project.telemetry.enable()

# Log traces to the project's application insights resource
application_insights_connection_string = project.telemetry.get_connection_string()
if application_insights_connection_string:
    configure_azure_monitor(connection_string=application_insights_connection_string)
```