Use the register function to connect your application to Phoenix:
```python
from phoenix.otel import register

# configure the Phoenix tracer
tracer_provider = register(
    project_name="my-llm-app",  # Default is 'default'
    auto_instrument=True,  # Auto-instrument your app based on installed OI dependencies
)
```
Phoenix's auto-instrumentor collects traces from Haystack Pipelines. If you are using Haystack without Pipelines, you won't see traces appear in Phoenix automatically. In that case, you can use the instrumentor for your underlying LLM provider instead of this auto-instrumentor.
From here, you can set up your Haystack app as normal:
```python
from haystack import Pipeline
from haystack.components.builders.prompt_builder import PromptBuilder
from haystack.components.generators import OpenAIGenerator

prompt_template = """Answer the following question.
Question: {{question}}
Answer:"""

# Initialize the pipeline
pipeline = Pipeline()

# Initialize the OpenAI generator component
llm = OpenAIGenerator(model="gpt-3.5-turbo")
prompt_builder = PromptBuilder(template=prompt_template)

# Add the components to the pipeline and connect them
pipeline.add_component("prompt_builder", prompt_builder)
pipeline.add_component("llm", llm)
pipeline.connect("prompt_builder", "llm")

# Define the question
question = "What is the location of the Hanging Gardens of Babylon?"
```
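The pipeline above is assembled but never executed; traces only appear in Phoenix once it runs. A minimal sketch of that final step is below — the `pipeline.run` call (which assumes `haystack-ai` is installed and `OPENAI_API_KEY` is set in the environment) is commented out so the snippet also works without network access, and the Jinja substitution the `PromptBuilder` performs is illustrated with a plain string replacement:

```python
question = "What is the location of the Hanging Gardens of Babylon?"

# PromptBuilder renders the Jinja template; a plain string substitution
# is used here to illustrate the prompt the LLM would receive.
prompt_template = """Answer the following question.
Question: {{question}}
Answer:"""
rendered = prompt_template.replace("{{question}}", question)
print(rendered)

# Hypothetical final step, assuming the `pipeline` built above and a
# valid OPENAI_API_KEY — uncomment to execute and emit a trace:
# result = pipeline.run({"prompt_builder": {"question": question}})
# print(result["llm"]["replies"][0])
```

Each `pipeline.run` call produces one trace, which Phoenix's auto-instrumentor exports to the project you configured in `register`.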