Click Create Space, then follow the prompts to create and launch your space.
Install packages:

```shell
pip install arize-phoenix-otel
```
Set your Phoenix endpoint and API key. From your new Phoenix Space:
Create your API key from the Settings page
Copy your Hostname from the Settings page
In your code, set your endpoint and API key:
```python
import os

os.environ["PHOENIX_API_KEY"] = "ADD YOUR PHOENIX API KEY"
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "ADD YOUR PHOENIX HOSTNAME"

# If you created your Phoenix Cloud instance before June 24th, 2025,
# you also need to set the API key as a header:
# os.environ["PHOENIX_CLIENT_HEADERS"] = f"api_key={os.getenv('PHOENIX_API_KEY')}"
```
Use the register function to connect your application to Phoenix:
```python
from phoenix.otel import register

# Configure the Phoenix tracer
tracer_provider = register(
    project_name="my-llm-app",  # Default is 'default'
    auto_instrument=True,       # Auto-instrument your app based on installed OI dependencies
)
```
Phoenix’s auto-instrumentor collects traces from Haystack Pipelines. If you are using Haystack without Pipelines, traces won’t appear in Phoenix automatically. In that case, you can create spans manually with the OpenTelemetry tracer instead of relying on this auto-instrumentor.
From here, you can set up your Haystack app as normal:
```python
from haystack import Pipeline
from haystack.components.generators import OpenAIGenerator
from haystack.components.builders.prompt_builder import PromptBuilder

prompt_template = """
Answer the following question.
Question: {{question}}
Answer:
"""

# Initialize the pipeline
pipeline = Pipeline()

# Initialize the OpenAI generator component
llm = OpenAIGenerator(model="gpt-3.5-turbo")
prompt_builder = PromptBuilder(template=prompt_template)

# Add the components to the pipeline and connect them
pipeline.add_component("prompt_builder", prompt_builder)
pipeline.add_component("llm", llm)
pipeline.connect("prompt_builder", "llm")

# Define the question and run the pipeline
question = "What is the location of the Hanging Gardens of Babylon?"
response = pipeline.run({"prompt_builder": {"question": question}})
print(response["llm"]["replies"][0])
```