Install
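A typical install, assuming pip and the package names used for Phoenix's OpenAI instrumentation (`arize-phoenix-otel`, `openinference-instrumentation-openai`, and the `openai` SDK, all of which are assumptions here):

```shell
# Phoenix OTel helper, the OpenAI auto-instrumentor, and the OpenAI SDK
pip install arize-phoenix-otel openinference-instrumentation-openai openai
```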
Setup
Add your OpenRouter API key as an environment variable:
Run OpenRouter
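The setup and invocation steps above can be sketched as follows. This is a minimal sketch, assuming the `arize-phoenix-otel`, `openinference-instrumentation-openai`, and `openai` packages, an `OPENROUTER_API_KEY` environment variable, and an illustrative model slug; check OpenRouter's model list for real names:

```python
import os

from openai import OpenAI
from openinference.instrumentation.openai import OpenAIInstrumentor
from phoenix.otel import register

# Point OpenTelemetry traces at your running Phoenix instance,
# then instrument the OpenAI SDK so every call is captured.
tracer_provider = register()
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

# OpenRouter speaks the OpenAI API, so the stock client works:
# swap in the OpenRouter base URL and your OpenRouter key.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

response = client.chat.completions.create(
    model="openai/gpt-4o",  # assumed model slug; see OpenRouter's docs
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```

Because the instrumentor patches the shared OpenAI client, no OpenRouter-specific tracing code is needed; the base URL swap is the only change from a plain OpenAI setup.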
Observe
Now that you have tracing set up, all invocations of OpenAI (completions, chat completions, embeddings) will be streamed to your running Phoenix instance for observability and evaluation.
What Gets Traced
All OpenRouter model calls are automatically traced and include:
- Request/response data and timing
- Model name and provider information
- Token usage and cost data (when supported)
- Error handling and debugging information
Common Issues
- API Key: Use your OpenRouter API key, not OpenAI’s
- Model Names: Use exact model names from OpenRouter’s documentation
- Rate Limits: Check your OpenRouter dashboard for usage limits
- Base URL: Ensure you’re using https://openrouter.ai/api/v1 as the base URL

