**sdk/ai/azure-ai-projects/CHANGELOG.md** (19 lines changed: 19 additions, 0 deletions)

@@ -1,5 +1,24 @@
# Release History

## 1.0.0b5 (2025-01-17)

### Features Added

* Add method `.inference.get_image_embeddings_client` on `AIProjectClient` to get an authenticated `ImageEmbeddingsClient` (from the package azure-ai-inference). You need to have the azure-ai-inference package, version 1.0.0b7 or later, installed for this method to work.
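As an illustration (not part of the changelog itself), here is a minimal sketch of how the new method could be used; the connection string variable, image file, and deployment name are placeholders, and the sketch assumes the connection-string factory available in this package version:

```python
import os

from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient
from azure.ai.inference.models import ImageEmbeddingInput

# Placeholders: set PROJECT_CONNECTION_STRING and point to your own image and deployment.
project_client = AIProjectClient.from_connection_string(
    credential=DefaultAzureCredential(),
    conn_str=os.environ["PROJECT_CONNECTION_STRING"],
)

# Requires azure-ai-inference 1.0.0b7 or later, as noted above.
with project_client.inference.get_image_embeddings_client() as client:
    response = client.embed(
        model="my-image-embeddings-deployment",  # placeholder deployment name
        input=[ImageEmbeddingInput.load(image_file="sample.png", image_format="png")],
    )
    for item in response.data:
        print(f"Embedding {item.index} has length {len(item.embedding)}")
```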

### Bugs Fixed

* Fix for events dropped in a streamed Agent response (see [GitHub issue 39028](https://github.com/Azure/azure-sdk-for-python/issues/39028)).
* In Agents, a thread run event with `incomplete` status is now deserialized into a `ThreadRun` object during stream iteration and invokes the correct function `on_thread_run` (instead of the wrong function `on_unhandled_event`).
* Fix an error when calling the `to_evaluator_model_config` method of class `ConnectionProperties`. See the new input argument `include_credentials`.

### Breaking Changes

* `submit_tool_outputs_to_run` returns `None` instead of `ThreadRun` (see [GitHub issue 39028](https://github.com/Azure/azure-sdk-for-python/issues/39028)).
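A hedged illustration of adapting to this change; the surrounding tool-calling flow and the `get_run` re-fetch are assumptions based on typical Agents usage, not text from this changelog:

```python
# `thread`, `run`, and `tool_outputs` are placeholders from an existing tool-calling flow.
project_client.agents.submit_tool_outputs_to_run(
    thread_id=thread.id,
    run_id=run.id,
    tool_outputs=tool_outputs,
)

# The call now returns None, so re-fetch the run if you need its updated status.
run = project_client.agents.get_run(thread_id=thread.id, run_id=run.id)
print(run.status)
```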
**sdk/ai/azure-ai-projects/README.md** (17 lines changed: 13 additions, 4 deletions)

@@ -187,7 +187,7 @@ print(connection)
### Get an authenticated ChatCompletionsClient
Your Azure AI Foundry project may have one or more AI models deployed that support chat completions. These could be OpenAI models, Microsoft models, or models from other providers. Use the code below to get an already authenticated [ChatCompletionsClient](https://learn.microsoft.com/python/api/azure-ai-inference/azure.ai.inference.chatcompletionsclient) from the [azure-ai-inference](https://pypi.org/project/azure-ai-inference/) package, and execute a chat completions call.
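The README's own code sample is not reproduced in this diff; a minimal sketch of such a call (the connection string variable and model deployment name are placeholders) could look like this:

```python
import os

from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient
from azure.ai.inference.models import UserMessage

project_client = AIProjectClient.from_connection_string(
    credential=DefaultAzureCredential(),
    conn_str=os.environ["PROJECT_CONNECTION_STRING"],  # placeholder
)

with project_client.inference.get_chat_completions_client() as chat_client:
    response = chat_client.complete(
        model="my-chat-model-deployment",  # placeholder deployment name
        messages=[UserMessage(content="How many feet are in a mile?")],
    )
    print(response.choices[0].message.content)
```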
See the "inference" folder in the [package samples][samples] for additional samples, including getting an authenticated [EmbeddingsClient](https://learn.microsoft.com/python/api/azure-ai-inference/azure.ai.inference.embeddingsclient) and [ImageEmbeddingsClient](https://learn.microsoft.com/python/api/azure-ai-inference/azure.ai.inference.imageembeddingsclient).
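Along the same lines, a hedged sketch for text embeddings, reusing the `project_client` from the previous sketch and a placeholder deployment name:

```python
# Assumes `project_client` from the chat completions sketch above.
with project_client.inference.get_embeddings_client() as embeddings_client:
    response = embeddings_client.embed(
        model="my-text-embeddings-deployment",  # placeholder deployment name
        input=["first phrase", "second phrase"],
    )
    for item in response.data:
        print(f"Embedding {item.index} has length {len(item.embedding)}")
```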
### Get an authenticated AzureOpenAI client
@@ -844,6 +844,15 @@ with project_client.agents.create_stream(
As you can see, this SDK parses the events and produces various event types similar to OpenAI assistants. In your use case, you might not be interested in handling all these types and may decide to parse the events on your own. To achieve this, please refer to [override base event handler](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/agents/sample_agents_stream_with_base_override_eventhandler.py).

Note: Multiple streaming processes may be chained behind the scenes.

When the SDK receives a `ThreadRun` event with the status `requires_action`, the next event will be `Done`, followed by termination. The SDK will submit the tool calls using the same event handler, and the event handler will then chain the main stream with the tool stream.

Consequently, when you iterate over the stream with a for loop similar to the example above, the loop will receive events from the main stream followed by events from the tool stream.
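A hedged sketch of what this looks like from the caller's side when using an event handler; the `AgentEventHandler` base class, the `create_stream` parameter names, and the `thread`/`agent` objects are assumptions based on this package version's samples and may differ in later releases:

```python
from azure.ai.projects.models import AgentEventHandler, ThreadRun


class MyEventHandler(AgentEventHandler):
    def on_thread_run(self, run: ThreadRun) -> None:
        # When the run requires tool calls, the SDK submits them and chains the
        # tool stream, so subsequent events arrive through this same handler.
        print(f"Thread run status: {run.status}")


# `thread` and `agent` are placeholders for objects created earlier in the README.
with project_client.agents.create_stream(
    thread_id=thread.id,
    assistant_id=agent.id,
    event_handler=MyEventHandler(),
) as stream:
    stream.until_done()
```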
#### Retrieve Message
To retrieve messages from agents, use the following example:
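The README's example is not shown in this diff; a hedged sketch, assuming a thread already populated by an agent run, might be:

```python
# `thread` is a placeholder for a thread populated by an earlier agent run.
messages = project_client.agents.list_messages(thread_id=thread.id)

for message in messages.data:
    # Each message carries a list of content items; print the text ones.
    for item in message.content:
        if hasattr(item, "text"):
            print(f"{message.role}: {item.text.value}")
```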
@@ -1070,7 +1079,7 @@ Make sure to install OpenTelemetry and the Azure SDK tracing plugin via
You will also need an exporter to send telemetry to your observability backend. You can print traces to the console or use a local viewer such as [Aspire Dashboard](https://learn.microsoft.com/dotnet/aspire/fundamentals/dashboard/standalone?tabs=bash).
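For example, a console exporter can be set up with the standard OpenTelemetry SDK (a generic sketch, not the README's own snippet; Azure-specific instrumentation setup is omitted):

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Print all spans to stdout; swap the exporter to target another backend,
# such as an OTLP endpoint feeding Aspire Dashboard.
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)
```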