Traces let you see what’s happening inside your application—LLM calls, tool invocations, retrieval steps, and more. This guide walks you through sending your first traces to Phoenix.
1. Set environment variables to connect to your Phoenix instance:
# Only needed for Phoenix Cloud or authenticated self-hosted instances
export PHOENIX_API_KEY="your-api-key"

# Local (default, no API key required)
export PHOENIX_COLLECTOR_ENDPOINT="http://localhost:6006"

# Phoenix Cloud
# export PHOENIX_COLLECTOR_ENDPOINT="https://app.phoenix.arize.com/s/your-space-name"

# Self-hosted
# export PHOENIX_COLLECTOR_ENDPOINT="https://your-phoenix-instance.com"
You can find your collector endpoint and API key in the Settings page of your Phoenix instance.
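If you prefer to configure these in code (for example, in a notebook), you can set the same variables before registering. A minimal sketch using the variables above:

import os

# Same settings as the shell exports; set these before calling register() in step 3
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "http://localhost:6006"
# os.environ["PHOENIX_API_KEY"] = "your-api-key"  # Phoenix Cloud / authenticated instances only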
2. Install the Phoenix OTEL package and an instrumentation library. This quickstart uses OpenAI, but Phoenix supports many providers and frameworks, including Anthropic, LangChain, LlamaIndex, Vercel AI SDK, and Mastra.
pip install arize-phoenix-otel openinference-instrumentation-openai openai
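To trace a different provider, swap in its OpenInference instrumentor. For example, for Anthropic (assuming the openinference-instrumentation-anthropic package):

pip install arize-phoenix-otel openinference-instrumentation-anthropic anthropic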
See all Python integrations.
3. Register Phoenix as your trace provider. This connects your app to Phoenix and instruments supported libraries:
from phoenix.otel import register

tracer_provider = register(
    project_name="my-llm-app",
    auto_instrument=True,  # Auto-instruments OpenAI, LangChain, etc.
)
See the Python SDK reference for all options.
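If you would rather not rely on environment variables, register() also accepts connection details directly. A minimal sketch, assuming endpoint and batch are available keyword arguments:

from phoenix.otel import register

tracer_provider = register(
    project_name="my-llm-app",
    endpoint="http://localhost:6006/v1/traces",  # full OTLP traces route; overrides PHOENIX_COLLECTOR_ENDPOINT
    batch=True,  # batch span export, better suited to production workloads
    auto_instrument=True,
)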
4. Make an LLM call to generate your first trace. OpenAI calls are automatically captured:
import openai

client = openai.OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response.choices[0].message.content)
Use @tracer.agent, @tracer.tool, and @tracer.chain decorators to create custom spans around your own functions. See the tracing guide for details.
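A minimal sketch of a custom span, reusing tracer_provider from step 3 and client from step 4 (summarize is a hypothetical function):

tracer = tracer_provider.get_tracer(__name__)

@tracer.chain  # records a CHAIN span capturing the function's inputs and outputs
def summarize(text: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": f"Summarize in one sentence: {text}"}],
    )
    return response.choices[0].message.content

summarize("Rayleigh scattering makes the sky appear blue.")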
5. Open Phoenix in your browser to see your traces. If you are running locally, the UI is served at your collector endpoint (http://localhost:6006 by default).
[Screenshot: the Phoenix Traces view]

Next Steps