04.18.2025

Tracing For MCP Client Server Applications

We’re excited to announce a powerful capability in the OpenInference OSS library, openinference-instrumentation-mcp: seamless OTEL context propagation for MCP clients and servers.

What’s New?

This release introduces automatic distributed tracing for Anthropic’s Model Context Protocol (MCP). Using OpenTelemetry, you can now:
  • Propagate context across MCP client-server boundaries
  • Generate end-to-end traces of your AI system across services and languages
  • Gain full visibility into how models access and use external context
The openinference-instrumentation-mcp package handles this for you by:
  • Creating spans for MCP client operations
  • Injecting trace context into MCP requests
  • Extracting and continuing the trace context on the server
  • Associating the context with OTEL spans on the server side
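The propagation the package automates can be sketched in plain Python. The `traceparent` header format below is the real W3C Trace Context format that OpenTelemetry propagators emit; the helper functions and the `_meta` carrier field are illustrative assumptions for this sketch, not the package's actual API:

```python
# Sketch of W3C Trace Context propagation across an MCP request.
# The instrumentation package does this automatically via OpenTelemetry
# propagators; these helper names are hypothetical.

def inject_traceparent(request: dict, trace_id: int, span_id: int) -> dict:
    """Client side: embed the current trace context in the outgoing request."""
    traceparent = f"00-{trace_id:032x}-{span_id:016x}-01"
    request.setdefault("params", {}).setdefault("_meta", {})["traceparent"] = traceparent
    return request

def extract_traceparent(request: dict) -> tuple[int, int]:
    """Server side: recover the trace ID and parent span ID so new spans join the trace."""
    header = request["params"]["_meta"]["traceparent"]
    _version, trace_id, span_id, _flags = header.split("-")
    return int(trace_id, 16), int(span_id, 16)

# A tool-call request with context injected by the client, then extracted on the server:
req = inject_traceparent(
    {"method": "tools/call", "params": {"name": "search"}},
    trace_id=0xABC123,
    span_id=0x42,
)
assert extract_traceparent(req) == (0xABC123, 0x42)
```

Because the server continues the same trace ID, spans emitted on both sides are stitched into one distributed trace by the tracing backend.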

Setup

  1. Instrument both MCP client and server with OpenTelemetry.
  2. Add the openinference-instrumentation-mcp package.
  3. Spans will propagate across services, appearing as a single connected trace in Phoenix.
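Concretely, the setup on each side might look like the following sketch. It assumes Phoenix's `phoenix.otel.register` helper to configure the OTEL pipeline; if you export to a different collector, substitute your own tracer-provider setup:

```python
# Run this in both the MCP client and the MCP server process.
# Sketch only: assumes arize-phoenix-otel and
# openinference-instrumentation-mcp are installed.
from phoenix.otel import register
from openinference.instrumentation.mcp import MCPInstrumentor

# Configure an OTEL tracer provider that exports spans to Phoenix.
tracer_provider = register(project_name="mcp-demo")

# Instrument MCP: injects trace context into outgoing requests on the
# client side, and extracts and continues it on the server side.
MCPInstrumentor().instrument(tracer_provider=tracer_provider)
```

With both processes instrumented this way, client and server spans share one trace and appear as a single connected trace in Phoenix.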

Tutorial: phoenix/tutorials/mcp/tracing_between_mcp_client_and_server at main · Arize-ai/phoenix (GitHub)

Walkthrough Video

Acknowledgments

Big thanks to Adrian Cole and Anuraag Agrawal for their contributions to this feature.