Commit a517f30: Update langchain.md
1 parent 617994c

1 file changed: articles/ai-studio/how-to/develop/langchain.md (13 additions, 4 deletions)
Use the client as usual in your code.

## Tracing

You can use the tracing capabilities in Azure AI Foundry by creating a tracer. Logs are stored in Azure Application Insights and can be queried at any time using Azure Monitor or the Azure AI Foundry portal. Each AI Hub has an Azure Application Insights resource created for you.

### Get your instrumentation connection string

You can configure your application to send telemetry to Azure Application Insights either by:

1. Using the connection string to Azure Application Insights directly:

```python
import os

application_insights_connection_string = "instrumentation...."
```

2. Using the Azure AI Foundry SDK and the project connection string. You can find the project's connection string on the landing page of your project.

```python
from azure.ai.projects import AIProjectClient
# ... (lines elided in this diff)
application_insights_connection_string = project_client.telemetry.get_connection_string()
```
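The diff elides the lines that construct `project_client` in option 2, so as a minimal sketch, here is a hypothetical helper (`resolve_connection_string` is not part of the Azure SDK) that wraps the `telemetry.get_connection_string()` call shown above and fails loudly when no Application Insights resource is attached to the project:

```python
from typing import Any


def resolve_connection_string(project_client: Any) -> str:
    # Hypothetical helper, not part of the Azure SDK: fetch the
    # Application Insights connection string from a Foundry project
    # client and raise if tracing is not set up for the project.
    conn = project_client.telemetry.get_connection_string()
    if not conn:
        raise RuntimeError(
            "No Application Insights resource is connected to this project."
        )
    return conn
```

In a real application, `project_client` would be an `AIProjectClient` built from your project's connection string together with a credential such as `DefaultAzureCredential` from `azure-identity`; the exact construction depends on the `azure-ai-projects` version you have installed.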
> [!TIP]
> This code snippet requires the `azure-ai-projects` package to be installed.

### Configure tracing for Azure AI Foundry

The following code creates a tracer connected to the Azure Application Insights resource behind a project in Azure AI Foundry. Notice that the parameter `enable_content_recording` is set to `True`. This enables the capture of the inputs and outputs of the entire application, as well as the intermediate steps, which is helpful when debugging and building applications, but you may want to disable it in production environments. When not set explicitly, the parameter defaults to the environment variable `AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED`:

```python
from langchain_azure_ai.callbacks.tracers import AzureAIInferenceTracer
# ... (remaining lines elided in this diff)
```
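The text above says that content recording falls back to the `AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED` environment variable when not set explicitly. As an illustrative sketch only (a hypothetical helper, not the actual implementation in `langchain-azure-ai`), that kind of default resolution could look like:

```python
import os
from typing import Optional


def content_recording_enabled(explicit: Optional[bool] = None) -> bool:
    # Illustrative sketch, not the package's actual code: an explicit
    # enable_content_recording argument wins; otherwise fall back to the
    # AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED environment variable.
    if explicit is not None:
        return explicit
    raw = os.environ.get("AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED", "false")
    return raw.strip().lower() in {"1", "true", "yes"}
```

Passing `enable_content_recording=True` as in the snippet above therefore overrides whatever the environment variable is set to.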
To see traces:

:::image type="content" source="../../media/how-to/develop-langchain/langchain-portal-tracing-example.png" alt-text="A screenshot showing the trace of a chain." lightbox="../../media/how-to/develop-langchain/langchain-portal-tracing-example.png":::
Learn more about [how to visualize and manage traces](visualize-traces.md).

## Next steps

* [Develop applications with LlamaIndex](llama-index.md)
* [Visualize and manage traces in Azure AI Foundry](visualize-traces.md)
* [Use the Azure AI model inference service](../../ai-services/model-inference.md)
* [Reference: Azure AI model inference API](../../reference/reference-model-inference-api.md)
