Commit f05032e

Merge pull request #6366 from MicrosoftDocs/main
Auto Publish – main to live - 2025-08-04 17:12 UTC
2 parents: c84ae83 + b41797f

15 files changed: +746 −93 lines

articles/ai-foundry/foundry-models/concepts/models.md

Lines changed: 12 additions & 1 deletion

````diff
@@ -7,7 +7,7 @@ ms.author: mopeakande
 manager: scottpolly
 reviewer: santiagxf
 ms.reviewer: fasantia
-ms.date: 07/11/2025
+ms.date: 08/04/2025
 ms.service: azure-ai-model-inference
 ms.topic: how-to
 ms.custom:
@@ -58,6 +58,17 @@ Azure OpenAI in Azure AI Foundry Models offers a diverse set of models with diff
 
 See [this model collection in Azure AI Foundry portal](https://ai.azure.com/explore/models?&selectedCollection=aoai).
 
+### Black Forest Labs models sold directly by Azure
+
+The Black Forest Labs collection of image generation models include FLUX.1 Kontext [pro] for in-context generation and editing and FLUX1.1 [pro] for text-to-image generation.
+
+| Model | Type | Capabilities | Project type |
+| ------ | ---- | ------------ | ------------ |
+| [FLUX.1-Kontext-pro](https://ai.azure.com/explore/models/FLUX.1-Kontext-pro/version/1/registry/azureml-blackforestlabs) | Image generation | - **Input:** text and image (5000 tokens and 1 image) <br /> - **Output:** One Image <br /> - **Tool calling:** No <br /> - **Response formats**: Image (PNG and JPG) | Foundry, Hub-based |
+| [FLUX-1.1-pro](https://ai.azure.com/explore/models/FLUX-1.1-pro/version/1/registry/azureml-blackforestlabs) | Image generation | - **Input:** text (5000 tokens) <br /> - **Output:** One Image <br /> - **Tool calling:** No <br /> - **Response formats:** Image (PNG and JPG) | Hub-based |
+
+
+See [this model collection in Azure AI Foundry portal](https://ai.azure.com/explore/models?&selectedCollection=Black+Forest+Labs).
 
 ### DeepSeek models sold directly by Azure
 
````
articles/ai-foundry/how-to/continuous-evaluation-agents.md

Lines changed: 4 additions & 4 deletions

````diff
@@ -45,8 +45,8 @@ import os, json
 from azure.ai.projects import AIProjectClient
 from azure.identity import DefaultAzureCredential
 
-project_client = AIProjectClient.from_connection_string(
-    credential=DefaultAzureCredential(), conn_str=os.environ["PROJECT_CONNECTION_STRING"]
+project_client = AIProjectClient(
+    credential=DefaultAzureCredential(), endpoint=os.environ["PROJECT_ENDPOINT"]
 )
 
 agent = project_client.agents.create_agent(
@@ -95,7 +95,7 @@ project_client.evaluation.create_agent_evaluation(
     thread=thread.id,
     run=run.id,
     evaluators=evaluators,
-    appInsightsConnectionString = project_client.telemetry.get_connection_string(),
+    appInsightsConnectionString = project_client.telemetry.get_application_insights_connection_string(),
 )
 )
 
@@ -189,7 +189,7 @@ project_client.evaluation.create_agent_evaluation(
     run=run.id,
     evaluators=evaluators,
     samplingConfiguration = sampling_config,
-    appInsightsConnectionString = project_client.telemetry.get_connection_string(),
+    appInsightsConnectionString = project_client.telemetry.get_application_insights_connection_string(),
 )
 )
 ```
````
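The same client migration recurs throughout this commit: `AIProjectClient.from_connection_string(conn_str=...)` is replaced by the endpoint-based constructor, and `telemetry.get_connection_string()` by `telemetry.get_application_insights_connection_string()`. A minimal sketch of the updated call pattern, using `unittest.mock` stand-ins instead of the real `azure-ai-projects` client so it runs without Azure credentials (the endpoint URL and return values are illustrative, not real):

```python
import os
from unittest.mock import MagicMock

# Stand-in for azure.ai.projects.AIProjectClient; after this commit the real
# client is constructed the same way: credential= and endpoint= keyword args.
AIProjectClient = MagicMock(name="AIProjectClient")

# Hypothetical project endpoint, only for demonstration.
os.environ.setdefault(
    "PROJECT_ENDPOINT", "https://example.services.ai.azure.com/api/projects/demo"
)

# New construction: endpoint instead of from_connection_string(conn_str=...)
project_client = AIProjectClient(
    credential=MagicMock(),  # stands in for DefaultAzureCredential()
    endpoint=os.environ["PROJECT_ENDPOINT"],
)

# Renamed telemetry accessor (old name: get_connection_string)
project_client.telemetry.get_application_insights_connection_string.return_value = (
    "InstrumentationKey=00000000-0000-0000-0000-000000000000"
)
connection_string = (
    project_client.telemetry.get_application_insights_connection_string()
)
print(connection_string)
```

With the real SDK, the only differences are the imports (`AIProjectClient`, `DefaultAzureCredential`) and that the connection string comes from your project's linked Application Insights resource.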

articles/ai-foundry/how-to/develop/langchain.md

Lines changed: 5 additions & 5 deletions

````diff
@@ -261,24 +261,24 @@ You can configure your application to send telemetry to Azure Application Insigh
     application_insights_connection_string = "instrumentation...."
     ```
 
-2. Using the Azure AI Foundry SDK and the project connection string (**[!INCLUDE [hub-project-name](../../includes/hub-project-name.md)]s only**).
+2. Using the Azure AI Foundry SDK and the Foundry Project endpoint:
 
     1. Ensure you have the package `azure-ai-projects` installed in your environment.
 
     2. Go to [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs).
 
-    3. Copy your project's connection string and set it the following code:
+    3. Copy your Azure AI Foundry project endpoint URL and set it in the following code:
 
         ```python
         from azure.ai.projects import AIProjectClient
         from azure.identity import DefaultAzureCredential
 
-        project_client = AIProjectClient.from_connection_string(
+        project_client = AIProjectClient(
             credential=DefaultAzureCredential(),
-            conn_str="<your-project-connection-string>",
+            endpoint="<your-foundry-project-endpoint-url>",
         )
 
-        application_insights_connection_string = project_client.telemetry.get_connection_string()
+        application_insights_connection_string = project_client.telemetry.get_application_insights_connection_string()
         ```
 
 ### Configure tracing for Azure AI Foundry
````

articles/ai-foundry/how-to/develop/trace-agents-sdk.md

Lines changed: 16 additions & 21 deletions

````diff
@@ -61,7 +61,7 @@ Let's begin instrumenting our agent with OpenTelemetry tracing, by starting off
 ```python
 from azure.ai.projects import AIProjectClient
 from azure.identity import DefaultAzureCredential
-project_client = AIProjectClient.from_connection_string(
+project_client = AIProjectClient(
     credential=DefaultAzureCredential(),
     endpoint=os.environ["PROJECT_ENDPOINT"],
 )
@@ -71,12 +71,8 @@ Next, retrieve the connection string from the Application Insights resource conn
 
 ```python
 from azure.monitor.opentelemetry import configure_azure_monitor
-connection_string = project_client.telemetry.get_connection_string()
-
-if not connection_string:
-    print("Application Insights is not enabled. Enable by going to Tracing in your Azure AI Foundry project.")
-    exit()
 
+connection_string = project_client.telemetry.get_application_insights_connection_string()
 configure_azure_monitor(connection_string=connection_string) #enable telemetry collection
 ```
 
@@ -92,11 +88,11 @@ with tracer.start_as_current_span("example-tracing"):
         name="my-assistant",
         instructions="You are a helpful assistant"
     )
-    thread = project_client.agents.create_thread()
-    message = project_client.agents.create_message(
+    thread = project_client.agents.threads.create()
+    message = project_client.agents.messages.create(
         thread_id=thread.id, role="user", content="Tell me a joke"
     )
-    run = project_client.agents.create_run(thread_id=thread.id, agent_id=agent.id)
+    run = project_client.agents.runs.create_and_process(thread_id=thread.id, agent_id=agent.id)
 ```
 
 After running your agent, you can go begin to [view traces in Azure AI Foundry Portal](#view-traces-in-azure-ai-foundry-portal).
@@ -108,7 +104,8 @@ To connect to [Aspire Dashboard](https://aspiredashboard.com/#start) or another
 ```bash
 pip install azure-core-tracing-opentelemetry opentelemetry-exporter-otlp opentelemetry-sdk
 ```
-Next, you want to configure tracing for your application.
+
+Next, configure tracing for console output:
 
 ```python
 from azure.core.settings import settings
@@ -124,18 +121,16 @@ tracer_provider = TracerProvider()
 tracer_provider.add_span_processor(SimpleSpanProcessor(span_exporter))
 trace.set_tracer_provider(tracer_provider)
 ```
-Use `enable_telemetry` to begin collecting telemetry.
 
-```python
-from azure.ai.projects import enable_telemetry
-enable_telemetry(destination=sys.stdout)
+Or modify the above code, based on [Aspire Dashboard](https://aspiredashboard.com/#start), to trace to a local OTLP viewer.
+
+Now enable Agent instrumentation and run your Agent:
 
-# Logging to an OTLP endpoint, change the destination to
-# enable_telemetry(destination="http://localhost:4317")
-```
 ```python
+from azure.ai.agents.telemetry import AIAgentsInstrumentor
+AIAgentsInstrumentor().instrument()
+
 # Start tracing
-from opentelemetry import trace
 tracer = trace.get_tracer(__name__)
 
 with tracer.start_as_current_span("example-tracing"):
@@ -144,11 +139,11 @@ with tracer.start_as_current_span("example-tracing"):
         name="my-assistant",
         instructions="You are a helpful assistant"
     )
-    thread = project_client.agents.create_thread()
-    message = project_client.agents.create_message(
+    thread = project_client.agents.threads.create()
+    message = project_client.agents.messages.create(
         thread_id=thread.id, role="user", content="Tell me a joke"
    )
-    run = project_client.agents.create_run(thread_id=thread.id, agent_id=agent.id)
+    run = project_client.agents.runs.create_and_process(thread_id=thread.id, agent_id=agent.id)
 ```
 
 ## Trace custom functions
````
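This file also migrates the agent operations from flat methods (`create_thread`, `create_message`, `create_run`) to namespaced operation groups (`threads.create`, `messages.create`, `runs.create_and_process`). A runnable sketch of the new shape, using a `unittest.mock` stand-in for an already-constructed project client so no Azure project is needed (the agent id is hypothetical):

```python
from unittest.mock import MagicMock

# Stand-in for an already-constructed azure.ai.projects.AIProjectClient
project_client = MagicMock(name="project_client")

# New namespaced operation groups replacing the removed flat methods:
#   agents.create_thread()  -> agents.threads.create()
#   agents.create_message() -> agents.messages.create()
#   agents.create_run()     -> agents.runs.create_and_process()
thread = project_client.agents.threads.create()
message = project_client.agents.messages.create(
    thread_id=thread.id, role="user", content="Tell me a joke"
)
run = project_client.agents.runs.create_and_process(
    thread_id=thread.id, agent_id="agent-123"  # hypothetical agent id
)
```

Note that `create_and_process` also replaces `create`'s fire-and-forget behavior: per the diff, the new call both creates the run and processes it to completion.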

articles/ai-foundry/how-to/develop/trace-application.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -95,7 +95,7 @@ When developing with the OpenAI SDK, you can instrument your code so traces are
         endpoint="https://<your-resource>.services.ai.azure.com/api/projects/<your-project>",
     )
 
-    connection_string = project_client.telemetry.get_connection_string()
+    connection_string = project_client.telemetry.get_application_insights_connection_string()
     ```
 
 > [!TIP]
@@ -116,7 +116,7 @@ When developing with the OpenAI SDK, you can instrument your code so traces are
 1. Use the OpenAI SDK as usual:
 
     ```python
-    client = project_client.inference.get_azure_openai_client()
+    client = project_client.get_openai_client()
 
     response = client.chat.completions.create(
         model="deepseek-v3-0324",
````

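The second change in this file renames the OpenAI client accessor: `project_client.inference.get_azure_openai_client()` becomes `project_client.get_openai_client()`, hung directly off the project client. A mock-based sketch of the new call (no real SDK or credentials; the response id and model deployment name are illustrative):

```python
from unittest.mock import MagicMock

# Stand-in for an already-constructed azure.ai.projects.AIProjectClient
project_client = MagicMock(name="project_client")

# get_openai_client() now lives on the project client directly,
# not on the removed `inference` operations group.
client = project_client.get_openai_client()
client.chat.completions.create.return_value = MagicMock(id="chatcmpl-demo")

response = client.chat.completions.create(
    model="deepseek-v3-0324",  # model deployment name used in the article
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.id)
```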