The Phoenix OTEL SDK provides a lightweight wrapper around OpenTelemetry with sensible defaults for Phoenix.
Install
Python:

```shell
pip install arize-phoenix-otel
```

TypeScript:

```shell
npm install @arizeai/phoenix-otel
```
Set environment variables to connect to your Phoenix instance:
```shell
export PHOENIX_API_KEY="your-api-key"

# Local (default, no API key required)
export PHOENIX_COLLECTOR_ENDPOINT="http://localhost:6006"

# Phoenix Cloud
# export PHOENIX_COLLECTOR_ENDPOINT="https://app.phoenix.arize.com/s/your-space-name"

# Self-hosted
# export PHOENIX_COLLECTOR_ENDPOINT="https://your-phoenix-instance.com"
```
You can find your collector endpoint and API key in the Settings page of your Phoenix instance.
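The usual precedence for these settings is: an explicit argument to the SDK wins, then the environment variable, then the local default. The sketch below illustrates that lookup order with a hypothetical `resolve_endpoint` helper using only the standard library; it is not the SDK's actual code.

```python
import os

def resolve_endpoint(override=None):
    """Hypothetical helper: explicit argument wins, then the
    PHOENIX_COLLECTOR_ENDPOINT environment variable, then the
    local default."""
    if override:
        return override
    return os.environ.get("PHOENIX_COLLECTOR_ENDPOINT", "http://localhost:6006")

# No env var set: falls back to the local default.
os.environ.pop("PHOENIX_COLLECTOR_ENDPOINT", None)
print(resolve_endpoint())  # http://localhost:6006

# Env var set: it takes effect.
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "https://your-phoenix-instance.com"
print(resolve_endpoint())  # https://your-phoenix-instance.com

# Explicit argument beats everything.
print(resolve_endpoint("http://other-host:6006"))  # http://other-host:6006
```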
Register
Call register() to initialize tracing. The SDK automatically reads your environment variables.
```python
from phoenix.otel import register

tracer_provider = register(
    project_name="my-llm-app",
    auto_instrument=True,  # automatically instruments OpenAI, LangChain, etc.
)
```
| Parameter | Description |
| --- | --- |
| `project_name` | Project name in Phoenix (or `PHOENIX_PROJECT_NAME` env var) |
| `auto_instrument` | Automatically instrument all supported libraries |
| `batch` | Process spans in batches (default: `True`, recommended for production) |
| `endpoint` | Custom collector endpoint URL |
| `protocol` | Transport protocol: `"grpc"` or `"http/protobuf"` |
| `headers` | Headers to send with each span payload |
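Putting several of these parameters together, a `register()` call that configures the connection explicitly instead of through environment variables might look like the sketch below. The values are placeholders, and the `authorization` header is illustrative only; normally you would just set `PHOENIX_API_KEY` and let the SDK handle authentication.

```python
from phoenix.otel import register

tracer_provider = register(
    project_name="my-llm-app",
    endpoint="https://your-phoenix-instance.com",  # overrides PHOENIX_COLLECTOR_ENDPOINT
    protocol="http/protobuf",
    batch=True,  # batch span processing, recommended for production
    headers={"authorization": "Bearer your-api-key"},  # illustrative header value
)
```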
```typescript
import { register } from "@arizeai/phoenix-otel";

register({ projectName: "my-llm-app" });
```
| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `projectName` | `string` | `"default"` | Project name in Phoenix |
| `url` | `string` | `"http://localhost:6006"` | Phoenix server URL |
| `apiKey` | `string` | — | API key for authentication |
| `batch` | `boolean` | `true` | Use batch span processing |
| `headers` | `Record<string, string>` | `{}` | Custom headers for OTLP requests |
| `instrumentations` | `Instrumentation[]` | — | Instrumentations to register |
| `diagLogLevel` | `DiagLogLevel` | — | Enable diagnostic logging |
Instrument
Add instrumentation to capture traces from your LLM calls:
With `auto_instrument=True`, Phoenix automatically discovers and activates every OpenInference instrumentor package installed in your Python environment; no additional code is required.

```shell
# Install instrumentors for your frameworks
pip install openinference-instrumentation-openai
pip install openinference-instrumentation-langchain
# ... any other OpenInference packages you need
```

Just `pip install` the instrumentation packages you need and set `auto_instrument=True`; Phoenix handles the rest.
See Integrations for all available packages, or use Tracing Helpers for manual instrumentation.

In TypeScript, install and register instrumentations for your framework:

```shell
npm install @arizeai/openinference-instrumentation-openai
```
```typescript
import OpenAI from "openai";
import { register, registerInstrumentations } from "@arizeai/phoenix-otel";
import { OpenAIInstrumentation } from "@arizeai/openinference-instrumentation-openai";

register({ projectName: "my-llm-app" });

const instrumentation = new OpenAIInstrumentation();
instrumentation.manuallyInstrument(OpenAI);

registerInstrumentations({
  instrumentations: [instrumentation],
});
```
ESM projects require calling manuallyInstrument() on the client class. CommonJS projects can skip this step.
See Integrations for all available packages.
Advanced Configuration
For more control over tracing behavior, see the SDK reference documentation.
Next Steps