Use the client as usual in your code.

## Tracing

You can use the tracing capabilities in Azure AI Foundry by creating a tracer. Logs are stored in Azure Application Insights and can be queried at any time using Azure Monitor or the Azure AI Foundry portal. Each AI Hub has an Azure Application Insights resource created for you.

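Once telemetry is flowing, you can inspect it in Azure Monitor with a Kusto (KQL) query. The query below is a hypothetical example for browsing recent dependency telemetry in Application Insights; the table choice and time window are illustrative, not prescribed by the article:

```kusto
// Illustrative: list the 20 most recent dependency records from the last hour.
dependencies
| where timestamp > ago(1h)
| order by timestamp desc
| take 20
```
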
### Get your instrumentation connection string
You can configure your application to send telemetry to Azure Application Insights either by:
1. Using the connection string to Azure Application Insights directly:
2. Using the Azure AI Foundry SDK and the project connection string, which you can find on your project's landing page.
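
For the first option, a minimal sketch of supplying the connection string yourself, assuming you exported it beforehand. The environment variable name follows the common Azure Monitor convention, and the helper function is illustrative, not part of any SDK:

```python
import os

# Illustrative helper: read the Application Insights connection string from
# the environment. The variable name mirrors the Azure Monitor convention;
# adapt it to however your deployment stores secrets.
def get_app_insights_connection_string(default: str = "") -> str:
    return os.environ.get("APPLICATIONINSIGHTS_CONNECTION_STRING", default)
```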
```python
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

# Connect to the project using its connection string (placeholder value).
project_client = AIProjectClient.from_connection_string(
    credential=DefaultAzureCredential(),
    conn_str="<your-project-connection-string>",
)

# Retrieve the connection string of the Application Insights resource
# associated with the project.
application_insights_connection_string = project_client.telemetry.get_connection_string()
```

> This snippet of code requires the `azure-ai-projects` package to be installed.
### Configure tracing for Azure AI Foundry

The following code creates a tracer connected to the Azure Application Insights resource behind a project in Azure AI Foundry. Notice that the parameter `enable_content_recording` is set to `True`. This enables the capture of the inputs and outputs of the entire application, as well as the intermediate steps, which is helpful when debugging and building applications; you may want to disable it in production environments. It defaults to the environment variable `AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED`:

```python
from langchain_azure_ai.callbacks.tracers import AzureAIInferenceTracer

tracer = AzureAIInferenceTracer(
    connection_string=application_insights_connection_string,
    enable_content_recording=True,
)
```
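
The environment-variable default mentioned above can also be reproduced explicitly in your own code. The sketch below is hedged: the set of accepted truthy strings is an assumption for illustration, not the library's documented parsing:

```python
import os

# Illustrative: decide whether to record prompt/response content based on
# AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED. The set of accepted
# truthy strings here is an assumption for this sketch.
def content_recording_enabled(default: bool = False) -> bool:
    value = os.environ.get("AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED")
    if value is None:
        return default
    return value.strip().lower() in {"1", "true", "yes"}
```

You could then pass `enable_content_recording=content_recording_enabled()` when constructing the tracer, and hand the tracer to your chain invocations through LangChain's standard `config={"callbacks": [tracer]}` argument.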

To see traces:

:::image type="content" source="../../media/how-to/develop-langchain/langchain-portal-tracing-example.png" alt-text="A screenshot showing the trace of a chain." lightbox="../../media/how-to/develop-langchain/langchain-portal-tracing-example.png":::
Learn more about [how to visualize and manage traces](visualize-traces.md).
## Next steps
* [Develop applications with LlamaIndex](llama-index.md)
* [Visualize and manage traces in Azure AI Foundry](visualize-traces.md)
* [Use the Azure AI model inference service](../../ai-services/model-inference.md)
* [Reference: Azure AI model inference API](../../reference/reference-model-inference-api.md)