Given the wide range of providers and SDKs, `arize-phoenix-evals` provides an LLM abstraction that delegates LLM calls to an appropriate SDK/API already available in your Python environment. The configuration arguments of the SDK client and the LLM call invocation parameters are the same as those of the target SDK, so you won't have to learn another API.

To see the currently supported LLM providers, use the `show_provider_availability` function. The `provider` column lists the supported providers, and the `status` column reads "Available" when the required dependencies are installed in the active Python environment. Note that multiple client SDKs can be used to make LLM requests to a given provider; the desired client SDK can be specified when constructing the LLM wrapper client.
## Client Configuration
LLM wrappers can be configured the same way you'd configure the underlying client SDK. For example, when using the OpenAI Python Client, you pass the same arguments you would pass to that client; the same applies to OpenAI's Azure Python Client.
## Unified Interface
The LLM wrapper provides a unified interface to common LLM operations: generating text and generating structured outputs. For more information, refer to the API Documentation.
