sdk/ai/azure-ai-projects/README.md

Use the AI Projects client library (in preview) to:

* **Enumerate connections** in your Azure AI Studio project and get connection properties.
For example, get the inference endpoint URL and credentials associated with your Azure OpenAI connection.
* **Get an authenticated Inference client** to do chat completions, for the default Azure OpenAI or AI Services connections in your Azure AI Studio project. Supports the AzureOpenAI client from the `openai` package, or clients from the `azure-ai-inference` package.
* **Develop Agents using the Azure AI Agent Service**, leveraging an extensive ecosystem of models, tools, and capabilities from OpenAI, Microsoft, and other LLM providers. The Azure AI Agent Service enables the building of Agents for a wide range of generative AI use cases. The package is currently in private preview.
* **Run Evaluations** to assess the performance of generative AI applications using various evaluators and metrics. It includes built-in evaluators for quality, risk, and safety, and allows custom evaluators for specific needs.

### Prerequisites

- A [project in Azure AI Studio](https://learn.microsoft.com/azure/ai-studio/how-to/create-projects?tabs=ai-studio).
- The project connection string. It can be found in your Azure AI Studio project overview page, under "Project details". Below we will assume the environment variable `PROJECT_CONNECTION_STRING` was defined to hold this value.
- Entra ID is needed to authenticate the client. Your application needs an object that implements the [TokenCredential](https://learn.microsoft.com/python/api/azure-core/azure.core.credentials.tokencredential) interface. Code samples here use [DefaultAzureCredential](https://learn.microsoft.com/python/api/azure-identity/azure.identity.defaultazurecredential). To get that working, you will need:
  * The `Contributor` role. Role assignment can be done via the "Access Control (IAM)" tab of your Azure AI Project resource in the Azure portal.
  * You are logged into your Azure account by running `az login`.
  * Note that if you have multiple Azure subscriptions, the subscription that contains your Azure AI Project resource must be your default subscription. Run `az account list --output table` to list all your subscriptions and see which one is the default. Run `az account set --subscription "Your Subscription ID or Name"` to change your default subscription.
Your Azure AI Studio project has a "Management center". When you enter it, you will see a tab named "Connected resources" under your project. The `.connections` operations on the client allow you to enumerate the connections and get connection properties. Connection properties include the resource URL and authentication credentials, among other things.
Below are code examples of the connection operations. Full samples can be found under the "connections" folder in the [package samples][samples].
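
The snippets below assume a `project_client` has already been created. Here is a minimal sketch of that setup, assuming the client is built from the project connection string via `AIProjectClient.from_connection_string`, and that the `ConnectionType` enum used further below lives in `azure.ai.projects.models` (both assumptions; the package samples are authoritative):

```python
import os

from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient
from azure.ai.projects.models import ConnectionType

# Create the project client from the project connection string shown on the
# project overview page in Azure AI Studio (read here from an environment variable).
project_client = AIProjectClient.from_connection_string(
    credential=DefaultAzureCredential(),
    conn_str=os.environ["PROJECT_CONNECTION_STRING"],
)
```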
#### Get properties of all connections
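
A minimal sketch, assuming the `.connections` operations expose a `list()` method:

```python
# Enumerate all connections in the project and print their properties.
connections = project_client.connections.list()
for connection in connections:
    print(connection)
```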

#### Get properties of the default connection

To get the properties of the default connection of a given type (for example, the default Azure OpenAI connection), together with its authentication credentials:

```python
connection = project_client.connections.get_default(
    connection_type=ConnectionType.AZURE_OPEN_AI,  # The type of default connection to fetch.
    include_credentials=True,  # Optional. Defaults to "False".
)
print(connection)
```

If `include_credentials` was `True`, the connection's API key and token credential will be populated. Otherwise both will be `None`.

#### Get properties of a connection by its connection name

To get the connection properties of a connection named `connection_name`:

```python
connection = project_client.connections.get(
    connection_name=connection_name,
    include_credentials=True,  # Optional. Defaults to "False".
)
print(connection)
```
### Get an authenticated ChatCompletionsClient
Your Azure AI Studio project may have one or more AI models deployed that support chat completions. These could be OpenAI models, Microsoft models, or models from other providers. Use the code below to get an already authenticated [ChatCompletionsClient](https://learn.microsoft.com/python/api/azure-ai-inference/azure.ai.inference.chatcompletionsclient?view=azure-python-preview) from the [azure-ai-inference](https://pypi.org/project/azure-ai-inference/) package, and execute a chat completions call.
First, install the package:
```bash
pip install azure-ai-inference
```

See the "inference" folder in the [package samples][samples] for additional samples.
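
For orientation, here is a minimal sketch of such a chat completions call. It assumes the project client exposes an `inference.get_chat_completions_client()` helper (an assumption here; the samples above are authoritative):

```python
from azure.ai.inference.models import UserMessage

# Assumed helper: returns an already-authenticated ChatCompletionsClient
# for the project's default inference connection.
chat_client = project_client.inference.get_chat_completions_client()

response = chat_client.complete(
    model="gpt-4o",  # Replace with your model deployment name.
    messages=[UserMessage(content="How many feet are in a mile?")],
)
print(response.choices[0].message.content)
```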
### Get an authenticated AzureOpenAI client
Your Azure AI Studio project may have one or more OpenAI models deployed that support chat completions. Use the code below to get an already authenticated [AzureOpenAI](https://github.com/openai/openai-python?tab=readme-ov-file#microsoft-azure-openai) from the [openai](https://pypi.org/project/openai/) package, and execute a chat completions call.
First, install the package:
```bash
pip install openai
```
Then run the code below. Replace `gpt-4o` with your model deployment name, and update the `api_version` value with one found in the "Data plane - inference" row [in this table](https://learn.microsoft.com/azure/ai-services/openai/reference#api-specs).
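
A minimal sketch of that call, assuming the project client exposes an `inference.get_azure_openai_client(api_version=...)` helper (an assumption here; the package samples have the authoritative version):

```python
# Assumed helper: returns an already-authenticated AzureOpenAI client
# for the project's default Azure OpenAI connection.
aoai_client = project_client.inference.get_azure_openai_client(
    api_version="2024-06-01",  # Example value; pick one from the table linked above.
)

response = aoai_client.chat.completions.create(
    model="gpt-4o",  # Replace with your model deployment name.
    messages=[{"role": "user", "content": "How many feet are in a mile?"}],
)
print(response.choices[0].message.content)
```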

NOTE: For running evaluators locally, refer to [Evaluate with the Azure AI Evaluation SDK][evaluators].
### Tracing
You can add an Application Insights Azure resource to your Azure AI Studio project. See the "Tracing" tab in your studio. If one is enabled, you can get the Application Insights connection string, configure your Agents, and observe the full execution path through Azure Monitor. Typically, you might want to start tracing before you create an Agent.
#### Installation
Make sure to install OpenTelemetry and the Azure SDK tracing plugin via
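a command such as the following (the exact package names are an assumption here; check the package documentation for the definitive list):

```bash
# Assumed packages: the OpenTelemetry SDK and the Azure Core tracing plugin.
pip install opentelemetry-sdk azure-core-tracing-opentelemetry
```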

To connect to Aspire Dashboard or another OpenTelemetry compatible backend, install the OTLP exporter:

```bash
pip install opentelemetry-exporter-otlp
```
#### Tracing example
Here is a code sample to be included above `create_agent`:
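
A minimal stand-in sketch, assuming the OTLP exporter route shown above rather than the package's exact sample:

```python
from azure.core.settings import settings
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

# Route Azure SDK spans through OpenTelemetry (requires the
# azure-core-tracing-opentelemetry plugin installed above).
settings.tracing_implementation = "opentelemetry"

# Export spans to a local OTLP endpoint such as Aspire Dashboard (gRPC port 4317 by default).
provider = TracerProvider(resource=Resource.create({"service.name": "my-agents-app"}))
provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4317")))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)

# Wrap the Agent scenario in a span so `create_agent` and everything under it
# shows up as a single trace in your backend.
with tracer.start_as_current_span("agents-sample"):
    ...  # create_agent(...) and the rest of your Agent code goes here
```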

To report issues with the client library, or request additional features, please open a GitHub issue.

Have a look at the [Samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/ai/azure-ai-projects/samples) folder, containing fully runnable Python code for synchronous and asynchronous clients.
Explore the [AI Starter Template](https://aka.ms/azsdk/azure-ai-projects/python/ai-starter-template). This template creates an Azure AI Studio hub, project and connected resources including Azure OpenAI Service, AI Search and more. It also deploys a simple chat application to Azure Container Apps.
## Contributing
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution.