# Tracing in AI Toolkit

AI Toolkit provides tracing capabilities to help you monitor and analyze the performance of your AI applications. You can trace the execution of your AI applications, including their interactions with generative AI models, to get insight into their behavior and performance.

AI Toolkit hosts a local HTTP and gRPC server to collect trace data. The collector server is compatible with OTLP (OpenTelemetry Protocol), and most language model SDKs either support OTLP directly or have non-Microsoft instrumentation libraries that support it. Use AI Toolkit to visualize the collected instrumentation data.

All frameworks or SDKs that support OTLP and follow the semantic conventions for generative AI systems are supported. The following table lists common AI SDKs that have been tested for compatibility.
| | Azure AI Inference | Azure AI Foundry Agent Service | Anthropic | Gemini | LangChain | OpenAI SDK | OpenAI Agents SDK |
|---|---|---|---|---|---|---|---|
| Python | ✅ | ✅ | ✅ (traceloop)1,2 | ✅ | ✅ (LangSmith)1,2 | ✅ (opentelemetry-python-contrib)1 | ✅ (Logfire)1,2 |
| TS/JS | ✅ | ✅ | ✅ (traceloop)1,2 | ❌ | ✅ (traceloop)1,2 | ✅ (traceloop)1,2 | ❌ |

- 1: The SDKs in parentheses are non-Microsoft tools that add OTLP support, because the official SDKs don't support OTLP.
- 2: These tools don't fully follow the OpenTelemetry rules for generative AI systems.
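To illustrate what OTLP compatibility means in practice, the following sketch hand-builds a minimal OTLP/HTTP JSON trace payload and posts it to the local collector using only the Python standard library. The collector address (`http://localhost:4318`) is the collector's standard OTLP/HTTP port; the service and span names are made-up examples, and real applications should use an OTLP exporter as shown in the snippets below rather than constructing payloads by hand.

```python
import json
import os
import time
import urllib.request


def build_otlp_payload(service_name: str, span_name: str) -> dict:
    """Build a minimal OTLP/HTTP JSON payload containing a single span."""
    now_ns = time.time_ns()
    return {
        "resourceSpans": [{
            "resource": {"attributes": [{
                "key": "service.name",
                "value": {"stringValue": service_name},
            }]},
            "scopeSpans": [{
                "scope": {"name": "manual-otlp-example"},
                "spans": [{
                    # 16-byte trace ID and 8-byte span ID, hex-encoded
                    "traceId": os.urandom(16).hex(),
                    "spanId": os.urandom(8).hex(),
                    "name": span_name,
                    "kind": 1,  # SPAN_KIND_INTERNAL
                    "startTimeUnixNano": str(now_ns),
                    "endTimeUnixNano": str(now_ns + 1_000_000),
                }],
            }],
        }]
    }


def send_to_collector(payload: dict,
                      endpoint: str = "http://localhost:4318/v1/traces") -> None:
    """POST the payload to the local OTLP collector, if one is running."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    try:
        urllib.request.urlopen(req, timeout=2).read()
    except OSError:
        print("Collector not reachable; start it from the tracing webview first.")


payload = build_otlp_payload("manual-otlp-demo", "hello-span")
send_to_collector(payload)
```

Any SDK that emits this wire format to port 4318 shows up in the tracing webview; the instrumentation libraries in the table do exactly this, plus attach the generative AI semantic-convention attributes.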
## How to start tracing

1. Open the tracing webview by selecting Tracing in the tree view.

2. Select the Start Collector button to start the local OTLP trace collector server.

3. Enable instrumentation with a code snippet. See the Set up instrumentation section for code snippets for different languages and SDKs.

4. Generate trace data by running your app.

5. In the tracing webview, select the Refresh button to see new trace data.
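Before wiring up instrumentation, it can help to confirm that the collector from step 2 is actually listening. A minimal standard-library check (the helper name is our own; port 4318 is the collector's OTLP/HTTP port):

```python
import socket


def collector_listening(host: str = "localhost", port: int = 4318,
                        timeout: float = 1.0) -> bool:
    """Return True if something accepts TCP connections at host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if collector_listening():
    print("Local OTLP collector is up; traces sent to port 4318 will be recorded.")
else:
    print("No collector on port 4318; select Start Collector in the tracing webview.")
```

If the check fails even though the collector is running, make sure your app exports to the same host and port as the snippets below.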
设置检测
在 AI 应用程序中设置跟踪以收集跟踪数据。以下代码片段展示了如何为不同的 SDK 和语言设置跟踪
所有 SDK 的过程都类似
- 将跟踪添加到你的 LLM 或代理应用程序。
- 设置 OTLP 跟踪导出器以使用 AITK 本地收集器。
### Azure AI Inference SDK - Python

Install:

```bash
pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http azure-ai-inference[opentelemetry]
```
Setup:

```python
import os

# Enable content recording and select OpenTelemetry as the tracing implementation
os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"
os.environ["AZURE_SDK_TRACING_IMPLEMENTATION"] = "opentelemetry"

from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter

resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-azure-ai-inference"
})
provider = TracerProvider(resource=resource)
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",
)
processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

from azure.ai.inference.tracing import AIInferenceInstrumentor
AIInferenceInstrumentor().instrument(True)
```
### Azure AI Inference SDK - TypeScript/JavaScript

Install:

```bash
npm install @azure/opentelemetry-instrumentation-azure-sdk @opentelemetry/api @opentelemetry/exporter-trace-otlp-proto @opentelemetry/instrumentation @opentelemetry/resources @opentelemetry/sdk-trace-node
```
Setup:

```javascript
const { context } = require('@opentelemetry/api');
const { resourceFromAttributes } = require('@opentelemetry/resources');
const {
  NodeTracerProvider,
  SimpleSpanProcessor
} = require('@opentelemetry/sdk-trace-node');
const { OTLPTraceExporter } = require('@opentelemetry/exporter-trace-otlp-proto');

const exporter = new OTLPTraceExporter({
  url: 'http://localhost:4318/v1/traces'
});
const provider = new NodeTracerProvider({
  resource: resourceFromAttributes({
    'service.name': 'opentelemetry-instrumentation-azure-ai-inference'
  }),
  spanProcessors: [new SimpleSpanProcessor(exporter)]
});
provider.register();

const { registerInstrumentations } = require('@opentelemetry/instrumentation');
const {
  createAzureSdkInstrumentation
} = require('@azure/opentelemetry-instrumentation-azure-sdk');

registerInstrumentations({
  instrumentations: [createAzureSdkInstrumentation()]
});
```
### Azure AI Foundry Agent Service - Python

Install:

```bash
pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http azure-ai-inference[opentelemetry]
```
Setup:

```python
import os

# Enable content recording and select OpenTelemetry as the tracing implementation
os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"
os.environ["AZURE_SDK_TRACING_IMPLEMENTATION"] = "opentelemetry"

from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter

resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-azure-ai-agents"
})
provider = TracerProvider(resource=resource)
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",
)
processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

from azure.ai.agents.telemetry import AIAgentsInstrumentor
AIAgentsInstrumentor().instrument(True)
```
### Azure AI Foundry Agent Service - TypeScript/JavaScript

Install:

```bash
npm install @azure/opentelemetry-instrumentation-azure-sdk @opentelemetry/api @opentelemetry/exporter-trace-otlp-proto @opentelemetry/instrumentation @opentelemetry/resources @opentelemetry/sdk-trace-node
```
Setup:

```javascript
const { context } = require('@opentelemetry/api');
const { resourceFromAttributes } = require('@opentelemetry/resources');
const {
  NodeTracerProvider,
  SimpleSpanProcessor
} = require('@opentelemetry/sdk-trace-node');
const { OTLPTraceExporter } = require('@opentelemetry/exporter-trace-otlp-proto');

const exporter = new OTLPTraceExporter({
  url: 'http://localhost:4318/v1/traces'
});
const provider = new NodeTracerProvider({
  resource: resourceFromAttributes({
    'service.name': 'opentelemetry-instrumentation-azure-ai-inference'
  }),
  spanProcessors: [new SimpleSpanProcessor(exporter)]
});
provider.register();

const { registerInstrumentations } = require('@opentelemetry/instrumentation');
const {
  createAzureSdkInstrumentation
} = require('@azure/opentelemetry-instrumentation-azure-sdk');

registerInstrumentations({
  instrumentations: [createAzureSdkInstrumentation()]
});
```
### Anthropic - Python

Install:

```bash
pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-anthropic
```
Setup:

```python
from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter

resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-anthropic-traceloop"
})
provider = TracerProvider(resource=resource)
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",
)
processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
AnthropicInstrumentor().instrument()
```
### Anthropic - TypeScript/JavaScript

Install:

```bash
npm install @traceloop/node-server-sdk
```
Setup:

```javascript
const { initialize } = require('@traceloop/node-server-sdk');
const { trace } = require('@opentelemetry/api');

initialize({
  appName: 'opentelemetry-instrumentation-anthropic-traceloop',
  baseUrl: 'http://localhost:4318',
  disableBatch: true
});
```
### Google Gemini - Python

Install:

```bash
pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-google-genai
```
Setup:

```python
from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter

resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-google-genai"
})
provider = TracerProvider(resource=resource)
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",
)
processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

from opentelemetry.instrumentation.google_genai import GoogleGenAiSdkInstrumentor
GoogleGenAiSdkInstrumentor().instrument(enable_content_recording=True)
```
### LangChain - Python

Install:

```bash
pip install langsmith[otel]
```
Setup:

```python
import os

os.environ["LANGSMITH_OTEL_ENABLED"] = "true"
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://localhost:4318"
```
### LangChain - TypeScript/JavaScript

Install:

```bash
npm install @traceloop/node-server-sdk
```
Setup:

```javascript
const { initialize } = require('@traceloop/node-server-sdk');

initialize({
  appName: 'opentelemetry-instrumentation-langchain-traceloop',
  baseUrl: 'http://localhost:4318',
  disableBatch: true
});
```
### OpenAI - Python

Install:

```bash
pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-openai-v2
```
Setup:

```python
from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter
from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor

# Set up resource
resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-openai"
})

# Set up tracer provider
trace.set_tracer_provider(TracerProvider(resource=resource))

# Configure OTLP exporter
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces"
)

# Add span processor
trace.get_tracer_provider().add_span_processor(
    BatchSpanProcessor(otlp_exporter)
)

# Set up logger provider
logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

# Enable OpenAI instrumentation
OpenAIInstrumentor().instrument()
```
### OpenAI - TypeScript/JavaScript

Install:

```bash
npm install @traceloop/instrumentation-openai @traceloop/node-server-sdk
```
Setup:

```javascript
const { initialize } = require('@traceloop/node-server-sdk');

initialize({
  appName: 'opentelemetry-instrumentation-openai-traceloop',
  baseUrl: 'http://localhost:4318',
  disableBatch: true
});
```
### OpenAI Agents SDK - Python

Install:

```bash
pip install logfire
```
Setup:

```python
import logfire
import os

os.environ["OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"] = "http://localhost:4318/v1/traces"

logfire.configure(
    service_name="opentelemetry-instrumentation-openai-agents-logfire",
    send_to_logfire=False,
)
logfire.instrument_openai_agents()
```
## Example: set up tracing with the Azure AI Inference SDK

The following end-to-end example uses the Azure AI Inference SDK in Python and shows how to set up the tracing provider and instrumentation.

### Prerequisites

To run this example, you need the following prerequisites:

### Set up your development environment

Use the following instructions to deploy a preconfigured development environment containing all the required dependencies to run this example.
1. Set up a GitHub personal access token

   Use the free GitHub Models as the example model.

   Open GitHub Developer Settings and select Generate new token.

   Important: the token needs the `models:read` permission, or it will return unauthorized. The token is sent to a Microsoft service.

2. Create an environment variable

   Create an environment variable to set the token as the key for the client code, using one of the following snippets. Replace `<your-github-token-goes-here>` with your actual GitHub token.

   bash:

   ```bash
   export GITHUB_TOKEN="<your-github-token-goes-here>"
   ```

   powershell:

   ```powershell
   $Env:GITHUB_TOKEN="<your-github-token-goes-here>"
   ```

   Windows command prompt:

   ```cmd
   set GITHUB_TOKEN=<your-github-token-goes-here>
   ```
3. Install Python packages

   The following command installs the Python packages required for tracing with the Azure AI Inference SDK:

   ```bash
   pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http azure-ai-inference[opentelemetry]
   ```
4. Set up tracing

   1. Create a new local directory on your computer for the project.

      ```bash
      mkdir my-tracing-app
      ```

   2. Navigate to the directory you created.

      ```bash
      cd my-tracing-app
      ```

   3. Open Visual Studio Code in that directory:

      ```bash
      code .
      ```
5. Create the Python file

   1. In the `my-tracing-app` directory, create a Python file named `main.py`. You'll add the code to set up tracing and interact with the Azure AI Inference SDK.

   2. Add the following code to `main.py` and save the file:

      ```python
      import os

      ### Set up for OpenTelemetry tracing ###
      os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"
      os.environ["AZURE_SDK_TRACING_IMPLEMENTATION"] = "opentelemetry"

      from opentelemetry import trace, _events
      from opentelemetry.sdk.resources import Resource
      from opentelemetry.sdk.trace import TracerProvider
      from opentelemetry.sdk.trace.export import BatchSpanProcessor
      from opentelemetry.sdk._logs import LoggerProvider
      from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
      from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
      from opentelemetry.sdk._events import EventLoggerProvider
      from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter

      github_token = os.environ["GITHUB_TOKEN"]

      resource = Resource(attributes={
          "service.name": "opentelemetry-instrumentation-azure-ai-inference"
      })
      provider = TracerProvider(resource=resource)
      otlp_exporter = OTLPSpanExporter(
          endpoint="http://localhost:4318/v1/traces",
      )
      processor = BatchSpanProcessor(otlp_exporter)
      provider.add_span_processor(processor)
      trace.set_tracer_provider(provider)

      logger_provider = LoggerProvider(resource=resource)
      logger_provider.add_log_record_processor(
          BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
      )
      _events.set_event_logger_provider(EventLoggerProvider(logger_provider))

      from azure.ai.inference.tracing import AIInferenceInstrumentor
      AIInferenceInstrumentor().instrument()
      ### Set up for OpenTelemetry tracing ###

      from azure.ai.inference import ChatCompletionsClient
      from azure.ai.inference.models import UserMessage
      from azure.ai.inference.models import TextContentItem
      from azure.core.credentials import AzureKeyCredential

      client = ChatCompletionsClient(
          endpoint="https://models.inference.ai.azure.com",
          credential=AzureKeyCredential(github_token),
          api_version="2024-08-01-preview",
      )

      response = client.complete(
          messages=[
              UserMessage(content=[
                  TextContentItem(text="hi"),
              ]),
          ],
          model="gpt-4.1",
          tools=[],
          response_format="text",
          temperature=1,
          top_p=1,
      )

      print(response.choices[0].message.content)
      ```
6. Run the code

   1. Open a new terminal in Visual Studio Code.

   2. In the terminal, run the code with the command `python main.py`.
7. Check the trace data in AI Toolkit

   After you run the code and refresh the tracing webview, a new trace appears in the list.

   Select the trace to open the trace details webview.

   Check the complete execution flow of your app in the span tree view on the left.

   Select a span in the span details view on the right to see generative AI messages in the Input + Output tab.

   Select the Metadata tab to view the raw metadata.
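A note on step 2 of the walkthrough: the sample reads the token with `os.environ["GITHUB_TOKEN"]`, which raises a bare `KeyError` if the variable was never exported. An optional helper (our own addition, not part of the sample) fails fast with a clearer message:

```python
import os


def get_github_token() -> str:
    """Read GITHUB_TOKEN from the environment, failing with a clear message."""
    token = os.environ.get("GITHUB_TOKEN")
    if not token:
        raise RuntimeError(
            "GITHUB_TOKEN is not set; export it as shown in the walkthrough "
            "(the token needs the models:read permission)."
        )
    return token
```

In `main.py` you could then use `github_token = get_github_token()` instead of the direct dictionary lookup.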
## What you learned

In this article, you learned how to:

- Set up tracing in your AI application by using the Azure AI Inference SDK and OpenTelemetry.
- Configure the OTLP trace exporter to send trace data to the local collector server.
- Run your application to generate trace data and view traces in the AI Toolkit webview.
- Use tracing across multiple SDKs and languages, including Python and TypeScript/JavaScript, and with non-Microsoft tools through OTLP.
- Instrument various AI frameworks (Anthropic, Gemini, LangChain, OpenAI, and more) by using the provided code snippets.
- Use the tracing webview UI, including the Start Collector and Refresh buttons, to manage trace data.
- Set up your development environment, including environment variables and package installation, to enable tracing.
- Analyze your app's execution flow with the span tree and details views, including generative AI message flows and metadata.