Troubleshoot your LLM application using the LlamaIndexInstrumentor

Phoenix instruments LlamaIndex applications through the LlamaIndexInstrumentor, which creates spans for your application and sends them to the Phoenix collector.
Launch Phoenix
Phoenix supports LlamaIndex's latest instrumentation paradigm, which requires LlamaIndex >= 0.10.43. For legacy support, see Legacy Integrations below.

Phoenix can be launched from Phoenix Cloud, the command line, Docker, or a notebook. The steps below use Phoenix Cloud.
Sign up for Phoenix, then set your Phoenix endpoint and API key from your new Phoenix Space:

- Sign up for an Arize Phoenix account at https://app.phoenix.arize.com/login
- Click Create Space, then follow the prompts to create and launch your space.
- Create your API key from the Settings page.
- Copy your Hostname from the Settings page.
- In your code, set your endpoint and API key:
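A sketch of setting these in Python. The environment variable names are Phoenix's standard `PHOENIX_API_KEY` and `PHOENIX_COLLECTOR_ENDPOINT`; the placeholder values are assumptions to replace with your own key and Hostname:

```python
import os

# Replace these placeholders with the API key and Hostname copied
# from your Phoenix Space's Settings page.
os.environ["PHOENIX_API_KEY"] = "your-phoenix-api-key"
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "https://your-phoenix-hostname"
```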
Having trouble finding your endpoint? Check out Finding your Phoenix Endpoint
Install
Setup
Initialize the LlamaIndexInstrumentor before your application code.
Run LlamaIndex

You can now use LlamaIndex as normal, and tracing will be automatically captured and sent to your Phoenix instance.
Observe

View your traces in Phoenix.

Resources
Legacy Integrations (<0.10.43)
Legacy One-Click (<0.10.43)

Using Phoenix as a callback requires an install of `llama-index-callbacks-arize-phoenix>0.1.3`. llama-index 0.10 introduced modular sub-packages, so to use llama-index's one-click integration you must first install this small integration package.

Legacy (<0.10.0)

If you are using an older version of LlamaIndex (pre-0.10), you can still use Phoenix. You will have to use `arize-phoenix>3.0.0` and downgrade to `openinference-instrumentation-llama-index<1.0.0`.
