Phoenix integrates with the leading AI frameworks, LLM providers, and tools to provide seamless observability, evaluation, and debugging for your AI applications. Whether you’re building with Python, TypeScript, or Java, Phoenix has you covered.
Don’t see an integration you need? We’d love to hear from you!

Integration Types

Phoenix offers several types of integrations to support your AI development workflow:

Tracing Integrations

Phoenix captures detailed traces from your AI applications, giving you visibility into every step of your LLM pipeline.
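To make "visibility into every step" concrete, here is a stdlib-only sketch of what a trace is: a tree of timed spans, one per pipeline step, each carrying attributes. The `Tracer`/`Span` names here are illustrative stand-ins, not Phoenix's API (Phoenix's real setup goes through its OpenTelemetry integration):

```python
import time
from contextlib import contextmanager
from dataclasses import dataclass, field

@dataclass
class Span:
    # One step of the pipeline: a name plus timing and attributes.
    name: str
    start: float = 0.0
    end: float = 0.0
    attributes: dict = field(default_factory=dict)

class Tracer:
    # Collects finished spans the way a tracing backend receives them.
    def __init__(self):
        self.spans = []

    @contextmanager
    def span(self, name, **attributes):
        s = Span(name, start=time.monotonic(), attributes=dict(attributes))
        try:
            yield s
        finally:
            s.end = time.monotonic()
            self.spans.append(s)

tracer = Tracer()

# A toy two-step "LLM pipeline": retrieval followed by generation.
with tracer.span("pipeline", input="What is Phoenix?"):
    with tracer.span("retrieve", top_k=2) as s:
        s.attributes["documents"] = ["doc-1", "doc-2"]
    with tracer.span("llm", model="toy-model") as s:
        s.attributes["output"] = "Phoenix is an observability tool."

# Inner spans finish (and are recorded) before the enclosing one.
print([s.name for s in tracer.spans])  # → ['retrieve', 'llm', 'pipeline']
```

A real instrumentation library emits exactly this shape of data automatically for each LLM call, retrieval, and tool invocation, so you rarely write spans by hand.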

By Language

LLM Providers

Phoenix provides native tracing support for all major LLM providers:

Platforms

Integrate Phoenix with AI development platforms and infrastructure:

Eval Model Integrations

Phoenix’s evaluation library (phoenix-evals) can use any LLM provider to power evaluations. These models score, classify, and analyze your traces.
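The pattern behind provider-agnostic evals is simple: render a prompt template over a record, send it to whatever model you have, and snap the completion onto a fixed set of labels. The sketch below shows that pattern with stdlib Python and a toy model callable; the template text, `classify` helper, and `rails` parameter are illustrative, not phoenix-evals' API:

```python
TEMPLATE = (
    "You are evaluating whether a response is relevant to a question.\n"
    "Question: {question}\nResponse: {response}\n"
    "Answer with exactly one word: relevant or irrelevant."
)

def classify(record, model, rails=("relevant", "irrelevant")):
    """Render the eval prompt, call the model, and snap the raw
    completion onto one of the allowed labels (the "rails")."""
    prompt = TEMPLATE.format(**record)
    raw = model(prompt).strip().lower()
    # Unparseable output is reported rather than silently coerced.
    return raw if raw in rails else "NOT_PARSABLE"

# Stand-in for a provider call (OpenAI, Anthropic, ...): any callable
# from prompt text to completion text can power the evaluation.
def toy_model(prompt):
    return "relevant" if "Phoenix" in prompt else "irrelevant"

label = classify(
    {"question": "What is Phoenix?", "response": "Phoenix traces LLM apps."},
    toy_model,
)
print(label)  # → relevant
```

Because the model is just a callable, swapping providers means swapping one argument, which is why any LLM provider can power the same evaluation.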

Eval Library Integrations

Use external evaluation libraries alongside Phoenix to get the best of both worlds:

Vector Database Integrations

Connect Phoenix with your vector database for embedding analysis and retrieval debugging:

Span Processors

Normalize trace data from other instrumentation libraries by adding span processors that convert their spans to the OpenInference format:
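At its core, this normalization is an attribute rename: spans arriving with one library's attribute names are rewritten to OpenInference's conventions before Phoenix reads them. The sketch below shows the idea with a plain function (real processors hook into OpenTelemetry's `SpanProcessor` interface instead); the `gen_ai.*` keys follow OpenTelemetry's GenAI semantic conventions and the target keys follow OpenInference's, but this particular mapping table is an illustrative assumption:

```python
# Illustrative mapping from OpenTelemetry GenAI semantic-convention
# attribute names to their OpenInference equivalents.
GENAI_TO_OPENINFERENCE = {
    "gen_ai.request.model": "llm.model_name",
    "gen_ai.usage.input_tokens": "llm.token_count.prompt",
    "gen_ai.usage.output_tokens": "llm.token_count.completion",
}

def normalize(attributes):
    """Rewrite known gen_ai.* keys to OpenInference names and tag the
    span kind so Phoenix renders it as an LLM span; unknown keys pass
    through unchanged."""
    out = {GENAI_TO_OPENINFERENCE.get(k, k): v for k, v in attributes.items()}
    out.setdefault("openinference.span.kind", "LLM")
    return out

span_attrs = {
    "gen_ai.request.model": "gpt-4o-mini",
    "gen_ai.usage.input_tokens": 12,
    "custom.attribute": "kept as-is",
}
print(normalize(span_attrs))
```

Running the processor over every finished span yields a single, uniform trace format regardless of which library produced the original data.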

Phoenix MCP Server

Phoenix provides a Model Context Protocol (MCP) server that enables AI assistants to interact directly with your Phoenix instance:
