Here’s a short guide to setting up tracing with Azure AI when you’re working with Python and Langchain, so your code runs smoothly and you can properly monitor your AI application.
First, when installing the required packages, be sure to include the Azure monitoring packages as well. In addition to the general Langchain and OpenTelemetry modules, add these two to your install command:
```bash
pip install azure-monitor-opentelemetry-exporter azure-monitor-opentelemetry
```
These packages are essential for enabling Azure Monitor to collect telemetry data from your application.
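If you want to see what these packages are doing, here is a minimal sketch of the typical wiring: the azure-monitor-opentelemetry distro exposes a configure_azure_monitor() helper that sets up OpenTelemetry export to Application Insights. The environment variable below is a common convention; adapt it to wherever you keep your connection string.

```python
# Minimal sketch: route OpenTelemetry traces from this process to Application
# Insights. Assumes your connection string is stored in the
# APPLICATIONINSIGHTS_CONNECTION_STRING environment variable.
import os

from azure.monitor.opentelemetry import configure_azure_monitor

configure_azure_monitor(
    connection_string=os.environ["APPLICATIONINSIGHTS_CONNECTION_STRING"],
)
```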
Next, if you’re using Langchain version 1.0, the integration may look more streamlined, but it may not work correctly with your current code. To avoid compatibility issues, consider staying on Langchain 0.3, which lets you explicitly attach callbacks such as the tracer below to your model so everything functions as expected.
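If you’re not sure which versions are installed in your environment, a quick check like the sketch below can save debugging time before you wire up tracing (the package names here are the usual ones; langchain-azure-ai is the package that provides the tracer used in the next example).

```python
# Print the installed versions of the relevant packages. If langchain reports
# a 1.x version, consider pinning back, e.g. pip install "langchain~=0.3.0".
from importlib.metadata import PackageNotFoundError, version

for pkg in ("langchain", "langchain-core", "langchain-azure-ai"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```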
Here’s an example of how to set up Azure OpenTelemetry tracing with Langchain:
```python
from langchain_azure_ai.callbacks.tracers import AzureAIOpenTelemetryTracer

# Create the tracer; `conn` holds your Application Insights connection string.
azure_tracer = AzureAIOpenTelemetryTracer(
    connection_string=conn,
    enable_content_recording=True,  # record prompt/response content in traces
    name="Weather information agent",
)

tracers = [azure_tracer]
```
Make sure you replace conn with your actual connection string.
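Once the tracer exists, attach it through the callbacks config when you invoke your chain or agent. The sketch below uses a hypothetical weather chain built with AzureChatOpenAI (the deployment name is a placeholder, and the endpoint and key are assumed to come from the usual AZURE_OPENAI_* environment variables); the important part is the config={"callbacks": tracers} argument.

```python
# Hypothetical usage: attach the tracer when invoking a chain so each run is
# exported to Azure Monitor. The model and prompt here are placeholders.
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import AzureChatOpenAI

llm = AzureChatOpenAI(azure_deployment="gpt-4o", api_version="2024-06-01")
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You answer questions about the weather."),
        ("human", "{question}"),
    ]
)
chain = prompt | llm

result = chain.invoke(
    {"question": "Will it rain in Seattle tomorrow?"},
    config={"callbacks": tracers},  # the list containing azure_tracer
)
print(result.content)
```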
In summary, install the necessary Azure monitoring packages explicitly, and if you’re facing issues with Langchain 1.0, revert to version 0.3 to maintain compatibility. Setting up the tracer as shown will help you effectively monitor your AI application’s performance and troubleshoot where necessary.