https://www.npmjs.com/package/@arizeai/phoenix-client
Installation
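The package can be installed from npm (package name taken from the URL above; yarn or pnpm work equally well):

```shell
npm install @arizeai/phoenix-client
```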
Configuration
The client will automatically read environment variables from your environment, if available. The following environment variables are used:

- PHOENIX_HOST - The base URL of the Phoenix API.
- PHOENIX_API_KEY - The API key to use for authentication.
- PHOENIX_CLIENT_HEADERS - Custom headers to add to all requests. A JSON-stringified object.
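For example, these could be exported in your shell before starting your application (the values below are placeholders; 6006 is the port Phoenix commonly runs on):

```shell
# Base URL of your Phoenix instance
export PHOENIX_HOST="http://localhost:6006"
# API key, if your Phoenix deployment requires authentication
export PHOENIX_API_KEY="your-api-key"
# Extra headers for every request, as a JSON-stringified object
export PHOENIX_CLIENT_HEADERS='{"X-Custom-Header":"value"}'
```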
Prompts
@arizeai/phoenix-client provides a prompts export that exposes utilities for working with prompts for LLMs.
Creating a Prompt and Pushing it to Phoenix
The createPrompt function can be used to create a prompt in Phoenix for version control and reuse.
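A minimal sketch of creating a prompt and pushing it to Phoenix. The prompts subpath export and the createPrompt/promptVersion helpers come from the package; the specific option names and values below are illustrative, so consult the package docs for exact signatures:

```typescript
import { createPrompt, promptVersion } from "@arizeai/phoenix-client/prompts";

// Sketch: push a new prompt (and an initial version) to Phoenix.
// Requires a running Phoenix server reachable via PHOENIX_HOST.
const prompt = await createPrompt({
  name: "question-asker",
  description: "Asks the model a question",
  version: promptVersion({
    modelProvider: "OPENAI", // assumed provider identifier
    modelName: "gpt-4o",
    // Mustache-style template variables are filled in at invocation time
    template: [{ role: "user", content: "{{ question }}" }],
  }),
});
```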
Pulling a Prompt from Phoenix
The getPrompt function can be used to pull a prompt from Phoenix by a prompt identifier; it returns the prompt as the Phoenix SDK Prompt type.
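A sketch of pulling a prompt by name. The exact shape of the identifier argument (name vs. version id vs. tag) is an assumption here; check the package docs for the supported identifier forms:

```typescript
import { getPrompt } from "@arizeai/phoenix-client/prompts";

// Sketch: fetch the latest version of a prompt by name from Phoenix.
// Requires a running Phoenix server reachable via PHOENIX_HOST.
const prompt = await getPrompt({ prompt: { name: "question-asker" } });
```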
Using a Phoenix Prompt with an LLM Provider SDK
The toSDK helper function can be used to convert a Phoenix Prompt to the format expected by an LLM provider SDK. You can then use the LLM provider SDK as normal, with your prompt.
If your Prompt is saved in Phoenix as openai, you can use the toSDK function to convert the prompt to the format expected by OpenAI, or even Anthropic and the Vercel AI SDK. A best-effort conversion to your LLM provider SDK of choice will be performed.
The following LLM provider SDKs are supported:
- Vercel AI SDK: ai
- OpenAI: openai
- Anthropic: @anthropic-ai/sdk
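A sketch of converting a pulled prompt for use with the OpenAI SDK. The sdk and variables option names are based on the conversion interface described above but should be treated as assumptions:

```typescript
import { getPrompt, toSDK } from "@arizeai/phoenix-client/prompts";
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Pull the prompt from Phoenix (requires a running Phoenix server)
const prompt = await getPrompt({ prompt: { name: "question-asker" } });

// Best-effort conversion to OpenAI chat-completion parameters,
// filling in the prompt's template variables
const params = toSDK({
  sdk: "openai",
  prompt,
  variables: { question: "What is Phoenix?" },
});

// Use the OpenAI SDK as normal with the converted prompt
const response = await openai.chat.completions.create(params);
```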
REST Endpoints
The client provides strongly-typed access to all REST endpoints defined in the Phoenix OpenAPI spec. Endpoints are accessible via string literals, with TypeScript auto-completion, inside of the client object.

Examples
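As one illustration of the endpoint interface, a sketch assuming an openapi-fetch-style client where HTTP methods take the spec's path as a string literal (the createClient export and the /v1/datasets path are assumptions):

```typescript
import { createClient } from "@arizeai/phoenix-client";

// Client configuration falls back to the PHOENIX_* environment variables
const phoenix = createClient();

// The path string literal is checked against the OpenAPI spec types,
// so typos and unknown endpoints fail at compile time
const datasets = await phoenix.GET("/v1/datasets");
```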
To run the examples, install dependencies using pnpm and run:
Compatibility
This package uses openapi-ts to generate types from the Phoenix OpenAPI spec. Because of this, it only works with the arize-phoenix server, version 8.0.0 and above.
Compatibility Table:
| Phoenix Client Version | Phoenix Server Version |
|---|---|
| ^1.0.0 | ^8.0.0 |

