
Tracing in AI Toolkit

AI Toolkit provides tracing capabilities to help you monitor and analyze the performance of your AI applications. You can trace the execution of your AI applications, including interactions with generative AI models, to gain insight into their behavior and performance.

AI Toolkit hosts a local HTTP and gRPC server to collect trace data. The collector server is compatible with OTLP (OpenTelemetry Protocol), and most language model SDKs either support OTLP directly or have non-Microsoft instrumentation libraries to support it. Use AI Toolkit to visualize the collected instrumentation data.

All frameworks or SDKs that support OTLP and follow the semantic conventions for generative AI systems are supported. The following table lists common AI SDKs that were tested for compatibility.

|        | Azure AI Inference | Azure AI Foundry Agents Service | Anthropic | Gemini | LangChain | OpenAI SDK³ | OpenAI Agents SDK |
|--------|--------------------|---------------------------------|-----------|--------|-----------|-------------|-------------------|
| Python | ✅ | ✅ | ✅ (traceloop)¹ ² | ✅ | ✅ (LangSmith)¹ ² | ✅ (opentelemetry-python-contrib)¹ | ✅ (Logfire)¹ ² |
| TS/JS  | ✅ | ✅ | ✅ (traceloop)¹ ² | ❌ | ✅ (traceloop)¹ ² | ✅ (traceloop)¹ ² | ❌ |

  1. The SDKs in parentheses are non-Microsoft tools that add OTLP support, because the official SDKs don't support OTLP.
  2. These tools don't fully follow the OpenTelemetry rules for generative AI systems.
  3. For the OpenAI SDK, only the Chat Completions API is supported. The Responses API is not yet supported.
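To illustrate what "following the semantic conventions" means in practice, a span emitted for a chat call typically carries `gen_ai.*` attributes like the following sketch. The attribute names come from the OpenTelemetry generative AI semantic conventions; the values here are made up for the example:

```python
# Illustrative only: gen_ai.* attribute names that instrumented chat calls
# record, per the OpenTelemetry generative AI semantic conventions.
# The values below are hypothetical.
span_attributes = {
    "gen_ai.system": "openai",           # which model provider handled the call
    "gen_ai.operation.name": "chat",     # the kind of operation traced
    "gen_ai.request.model": "gpt-4.1",   # model requested by the application
    "gen_ai.usage.input_tokens": 24,     # prompt tokens consumed
    "gen_ai.usage.output_tokens": 128,   # completion tokens produced
}

# A viewer that understands the conventions can group spans by provider and model.
assert all(key.startswith("gen_ai.") for key in span_attributes)
print(sorted(span_attributes))
```

Because every compliant SDK uses the same attribute names, AI Toolkit can render inputs, outputs, and token usage uniformly regardless of which SDK produced the trace.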

How to start tracing

  1. Open the Tracing webview by selecting Tracing in the tree view.

  2. Select the Start Collector button to start the local OTLP trace collector server.

    Screenshot showing the Start Collector button in the Tracing webview.

  3. Enable instrumentation with a code snippet. See the Set up instrumentation section for code snippets for different languages and SDKs.

  4. Generate trace data by running your application.

  5. In the Tracing webview, select the Refresh button to see new trace data.

    Screenshot showing the Trace List in the Tracing webview.

Set up instrumentation

Set up tracing in your AI application to collect trace data. The following code snippets show how to set up tracing for different SDKs and languages.

The process is similar for all SDKs:

  • Add tracing to your LLM or agent application.
  • Set up an OTLP trace exporter to use the AITK local collector.
Azure AI Inference SDK - Python

Installation

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http azure-ai-inference[opentelemetry]

Setup

import os
os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"
os.environ["AZURE_SDK_TRACING_IMPLEMENTATION"] = "opentelemetry"

from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter

# Tag all spans with a service name so they're easy to find in the trace list
resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-azure-ai-inference"
})
provider = TracerProvider(resource=resource)
# Export spans to the AI Toolkit local collector over OTLP/HTTP
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",
)
processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

# Export log events (for example, message content) to the collector as well
logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

from azure.ai.inference.tracing import AIInferenceInstrumentor
AIInferenceInstrumentor().instrument(True)
Azure AI Inference SDK - TypeScript/JavaScript

Installation

npm install @azure/opentelemetry-instrumentation-azure-sdk @opentelemetry/api @opentelemetry/exporter-trace-otlp-proto @opentelemetry/instrumentation @opentelemetry/resources @opentelemetry/sdk-trace-node

Setup

const { context } = require('@opentelemetry/api');
const { resourceFromAttributes } = require('@opentelemetry/resources');
const {
  NodeTracerProvider,
  SimpleSpanProcessor
} = require('@opentelemetry/sdk-trace-node');
const { OTLPTraceExporter } = require('@opentelemetry/exporter-trace-otlp-proto');

const exporter = new OTLPTraceExporter({
  url: 'http://localhost:4318/v1/traces'
});
const provider = new NodeTracerProvider({
  resource: resourceFromAttributes({
    'service.name': 'opentelemetry-instrumentation-azure-ai-inference'
  }),
  spanProcessors: [new SimpleSpanProcessor(exporter)]
});
provider.register();

const { registerInstrumentations } = require('@opentelemetry/instrumentation');
const {
  createAzureSdkInstrumentation
} = require('@azure/opentelemetry-instrumentation-azure-sdk');

registerInstrumentations({
  instrumentations: [createAzureSdkInstrumentation()]
});
Azure AI Foundry Agents Service - Python

Installation

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http azure-ai-inference[opentelemetry]

Setup

import os
os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"
os.environ["AZURE_SDK_TRACING_IMPLEMENTATION"] = "opentelemetry"

from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter

resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-azure-ai-agents"
})
provider = TracerProvider(resource=resource)
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",
)
processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

from azure.ai.agents.telemetry import AIAgentsInstrumentor
AIAgentsInstrumentor().instrument(True)
Azure AI Foundry Agents Service - TypeScript/JavaScript

Installation

npm install @azure/opentelemetry-instrumentation-azure-sdk @opentelemetry/api @opentelemetry/exporter-trace-otlp-proto @opentelemetry/instrumentation @opentelemetry/resources @opentelemetry/sdk-trace-node

Setup

const { context } = require('@opentelemetry/api');
const { resourceFromAttributes } = require('@opentelemetry/resources');
const {
  NodeTracerProvider,
  SimpleSpanProcessor
} = require('@opentelemetry/sdk-trace-node');
const { OTLPTraceExporter } = require('@opentelemetry/exporter-trace-otlp-proto');

const exporter = new OTLPTraceExporter({
  url: 'http://localhost:4318/v1/traces'
});
const provider = new NodeTracerProvider({
  resource: resourceFromAttributes({
    'service.name': 'opentelemetry-instrumentation-azure-ai-agents'
  }),
  spanProcessors: [new SimpleSpanProcessor(exporter)]
});
provider.register();

const { registerInstrumentations } = require('@opentelemetry/instrumentation');
const {
  createAzureSdkInstrumentation
} = require('@azure/opentelemetry-instrumentation-azure-sdk');

registerInstrumentations({
  instrumentations: [createAzureSdkInstrumentation()]
});
Anthropic - Python

Installation

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-anthropic

Setup

from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter

resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-anthropic-traceloop"
})
provider = TracerProvider(resource=resource)
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",
)
processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
AnthropicInstrumentor().instrument()
Anthropic - TypeScript/JavaScript

Installation

npm install @traceloop/node-server-sdk

Setup

const { initialize } = require('@traceloop/node-server-sdk');
const { trace } = require('@opentelemetry/api');

initialize({
  appName: 'opentelemetry-instrumentation-anthropic-traceloop',
  baseUrl: 'http://localhost:4318',
  disableBatch: true
});
Google Gemini - Python

Installation

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-google-genai

Setup

from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter

resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-google-genai"
})
provider = TracerProvider(resource=resource)
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",
)
processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

from opentelemetry.instrumentation.google_genai import GoogleGenAiSdkInstrumentor
GoogleGenAiSdkInstrumentor().instrument(enable_content_recording=True)
LangChain - Python

Installation

pip install langsmith[otel]

Setup

import os
os.environ["LANGSMITH_OTEL_ENABLED"] = "true"
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://localhost:4318"
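The same LangSmith settings can also be supplied as shell environment variables before launching your application, instead of setting them in Python (values as above; this is a sketch of an equivalent configuration, not a required step):

```shell
# Route LangSmith tracing to the AI Toolkit local OTLP collector
export LANGSMITH_OTEL_ENABLED="true"
export LANGSMITH_TRACING="true"
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4318"
```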
LangChain - TypeScript/JavaScript

Installation

npm install @traceloop/node-server-sdk

Setup

const { initialize } = require('@traceloop/node-server-sdk');
initialize({
  appName: 'opentelemetry-instrumentation-langchain-traceloop',
  baseUrl: 'http://localhost:4318',
  disableBatch: true
});
OpenAI - Python

Installation

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-openai-v2

Setup

from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter
from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor
import os

os.environ["OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT"] = "true"

# Set up resource
resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-openai"
})

# Set up tracer provider
trace.set_tracer_provider(TracerProvider(resource=resource))

# Configure OTLP exporter
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces"
)

# Add span processor
trace.get_tracer_provider().add_span_processor(
    BatchSpanProcessor(otlp_exporter)
)

# Set up logger provider
logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

# Enable OpenAI instrumentation
OpenAIInstrumentor().instrument()
OpenAI - TypeScript/JavaScript

Installation

npm install @traceloop/instrumentation-openai @traceloop/node-server-sdk

Setup

const { initialize } = require('@traceloop/node-server-sdk');
initialize({
  appName: 'opentelemetry-instrumentation-openai-traceloop',
  baseUrl: 'http://localhost:4318',
  disableBatch: true
});
OpenAI Agents SDK - Python

Installation

pip install logfire

Setup

import logfire
import os

os.environ["OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"] = "http://localhost:4318/v1/traces"

logfire.configure(
    service_name="opentelemetry-instrumentation-openai-agents-logfire",
    send_to_logfire=False,
)
logfire.instrument_openai_agents()

Example: Set up tracing with the Azure AI Inference SDK

The following end-to-end example uses the Azure AI Inference SDK in Python and shows how to set up the tracing provider and instrumentation.

Prerequisites

To run this example, you need the following prerequisites:

Set up your development environment

Use the following instructions to deploy a preconfigured development environment containing all required dependencies to run this example.

  1. Set up a GitHub personal access token

    Use the free GitHub Models as the example model.

    Open GitHub Developer Settings and select Generate new token.

    Important

    The token needs the models:read permission, or it will return Unauthorized. The token is sent to a Microsoft service.

  2. Create environment variables

    Create an environment variable to set your token as the key for the client code by using one of the following snippets. Replace <your-github-token-goes-here> with your actual GitHub token.

    bash

    export GITHUB_TOKEN="<your-github-token-goes-here>"
    

    powershell

    $Env:GITHUB_TOKEN="<your-github-token-goes-here>"
    

    Windows Command Prompt

    set GITHUB_TOKEN=<your-github-token-goes-here>
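The client code in the later steps reads this variable at run time. As a quick sanity check, a minimal stdlib helper can fail fast when the variable is missing (the `require_env` name is ours for illustration, not part of any SDK):

```python
import os

def require_env(name: str) -> str:
    """Return the value of an environment variable, failing fast if it's unset."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Set the {name} environment variable first.")
    return value

# Example use in client code (commented out so the token stays optional here):
# github_token = require_env("GITHUB_TOKEN")
```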
    
  3. Install Python packages

    The following command installs the Python packages needed for tracing with the Azure AI Inference SDK:

    pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http azure-ai-inference[opentelemetry]
    
  4. Set up tracing

    1. Create a new local directory on your computer for the project.

      mkdir my-tracing-app
      
    2. Navigate to the directory you created.

      cd my-tracing-app
      
    3. Open Visual Studio Code in that directory:

      code .
      
  5. Create the Python file

    1. In the my-tracing-app directory, create a Python file named main.py.

      You'll add the code to set up tracing and to interact with the Azure AI Inference SDK.

    2. Add the following code to main.py and save the file:

      import os
      
      ### Set up for OpenTelemetry tracing ###
      os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"
      os.environ["AZURE_SDK_TRACING_IMPLEMENTATION"] = "opentelemetry"
      
      from opentelemetry import trace, _events
      from opentelemetry.sdk.resources import Resource
      from opentelemetry.sdk.trace import TracerProvider
      from opentelemetry.sdk.trace.export import BatchSpanProcessor
      from opentelemetry.sdk._logs import LoggerProvider
      from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
      from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
      from opentelemetry.sdk._events import EventLoggerProvider
      from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter
      
      github_token = os.environ["GITHUB_TOKEN"]
      
      resource = Resource(attributes={
          "service.name": "opentelemetry-instrumentation-azure-ai-inference"
      })
      provider = TracerProvider(resource=resource)
      otlp_exporter = OTLPSpanExporter(
          endpoint="http://localhost:4318/v1/traces",
      )
      processor = BatchSpanProcessor(otlp_exporter)
      provider.add_span_processor(processor)
      trace.set_tracer_provider(provider)
      
      logger_provider = LoggerProvider(resource=resource)
      logger_provider.add_log_record_processor(
          BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
      )
      _events.set_event_logger_provider(EventLoggerProvider(logger_provider))
      
      from azure.ai.inference.tracing import AIInferenceInstrumentor
      AIInferenceInstrumentor().instrument()
      ### Set up for OpenTelemetry tracing ###
      
      from azure.ai.inference import ChatCompletionsClient
      from azure.ai.inference.models import UserMessage
      from azure.ai.inference.models import TextContentItem
      from azure.core.credentials import AzureKeyCredential
      
      client = ChatCompletionsClient(
          endpoint = "https://models.inference.ai.azure.com",
          credential = AzureKeyCredential(github_token),
          api_version = "2024-08-01-preview",
      )
      
      response = client.complete(
          messages = [
              UserMessage(content = [
                  TextContentItem(text = "hi"),
              ]),
          ],
          model = "gpt-4.1",
          tools = [],
          response_format = "text",
          temperature = 1,
          top_p = 1,
      )
      
      print(response.choices[0].message.content)
      
  6. Run the code

    1. Open a new terminal in Visual Studio Code.

    2. In the terminal, run the code with the command python main.py.

  7. Check the trace data in AI Toolkit

    After you run the code and refresh the Tracing webview, there's a new trace in the list.

    Select the trace to open the trace details webview.

    Screenshot showing selecting a trace from the Trace List in the Tracing webview.

    Check the complete execution flow of your application in the span tree view on the left side.

    Select a span in the span details view on the right side to see generative AI messages in the Input + Output tab.

    Select the Metadata tab to view the raw metadata.

    Screenshot showing the Trace Details view in the Tracing webview.

What you learned

In this article, you learned how to:

  • Set up tracing in your AI application by using the Azure AI Inference SDK and OpenTelemetry.
  • Configure the OTLP trace exporter to send trace data to the local collector server.
  • Run your application to generate trace data and view traces in the AI Toolkit webview.
  • Use tracing support via OTLP across multiple SDKs and languages, including Python and TypeScript/JavaScript, and through non-Microsoft tools.
  • Instrument various AI frameworks (Anthropic, Gemini, LangChain, OpenAI, and more) by using the provided code snippets.
  • Use the Tracing webview UI, including the Start Collector and Refresh buttons, to manage trace data.
  • Set up your development environment, including environment variables and package installation, to enable tracing.
  • Analyze your application's execution flow with the span tree and details views, including generative AI message flow and metadata.