Click Create Space, then follow the prompts to create and launch your space.
Install packages:
```shell
pip install arize-phoenix-otel openinference-instrumentation-google-adk google-adk
```
Set your Phoenix endpoint and API key from your new Phoenix Space:

1. Create your API key from the Settings page.
2. Copy your Hostname from the Settings page.
3. In your code, set your endpoint and API key:
```python
import os

os.environ["PHOENIX_API_KEY"] = "ADD YOUR PHOENIX API KEY"
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "ADD YOUR PHOENIX HOSTNAME"

# If you created your Phoenix Cloud instance before June 24th, 2025,
# you also need to set the API key as a header:
# os.environ["PHOENIX_CLIENT_HEADERS"] = f"api_key={os.getenv('PHOENIX_API_KEY')}"
```
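If registration later fails to reach Phoenix, a quick stdlib-only sanity check of the two variables can save debugging time. This is a sketch, not part of Phoenix: the helper name `check_phoenix_env` and the error message are illustrative.

```python
import os


def check_phoenix_env() -> list:
    """Return the names of required Phoenix variables that are missing or empty."""
    required = ["PHOENIX_API_KEY", "PHOENIX_COLLECTOR_ENDPOINT"]
    return [name for name in required if not os.environ.get(name)]


os.environ["PHOENIX_API_KEY"] = "ADD YOUR PHOENIX API KEY"
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "ADD YOUR PHOENIX HOSTNAME"

missing = check_phoenix_env()
if missing:
    # Fail fast before register() is called with a half-configured environment
    raise RuntimeError(f"Missing Phoenix settings: {missing}")
```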
Set the GOOGLE_API_KEY environment variable. Refer to Google’s ADK documentation for more details on authentication and environment variables.
```shell
export GOOGLE_API_KEY=[your_key_here]
```
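If you prefer to configure the key inside Python rather than in the shell, the same variable can be set with `os.environ` before any ADK agents are created. The value below is a placeholder, not a working key:

```python
import os

# Equivalent to the shell `export` above; set this before constructing ADK agents
os.environ["GOOGLE_API_KEY"] = "your_key_here"  # placeholder; substitute your real key
```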
Use the register function to connect your application to Phoenix.
```python
from phoenix.otel import register

# Configure the Phoenix tracer
tracer_provider = register(
    project_name="my-llm-app",  # Default is 'default'
    auto_instrument=True,  # Auto-instrument your app based on installed OI dependencies
)
```
Now that tracing is set up, all Google ADK requests will be streamed to Phoenix for observability and evaluation.
```python
import asyncio

from google.adk.agents import Agent
from google.adk.runners import InMemoryRunner
from google.genai import types


def get_weather(city: str) -> dict:
    """Retrieves the current weather report for a specified city.

    Args:
        city (str): The name of the city for which to retrieve the weather report.

    Returns:
        dict: status and result or error msg.
    """
    if city.lower() == "new york":
        return {
            "status": "success",
            "report": (
                "The weather in New York is sunny with a temperature of 25 degrees"
                " Celsius (77 degrees Fahrenheit)."
            ),
        }
    else:
        return {
            "status": "error",
            "error_message": f"Weather information for '{city}' is not available.",
        }


agent = Agent(
    name="test_agent",
    model="gemini-2.0-flash-exp",
    description="Agent to answer questions using tools.",
    instruction="You must use the available tools to find an answer.",
    tools=[get_weather],
)


async def main():
    app_name = "test_instrumentation"
    user_id = "test_user"
    session_id = "test_session"

    runner = InMemoryRunner(agent=agent, app_name=app_name)
    session_service = runner.session_service
    await session_service.create_session(
        app_name=app_name, user_id=user_id, session_id=session_id
    )

    async for event in runner.run_async(
        user_id=user_id,
        session_id=session_id,
        new_message=types.Content(
            role="user",
            parts=[types.Part(text="What is the weather in New York?")],
        ),
    ):
        if event.is_final_response():
            print(event.content.parts[0].text.strip())


if __name__ == "__main__":
    asyncio.run(main())
```
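The tool above follows a simple contract: it returns a dict with a `"status"` key plus either `"report"` or `"error_message"`. Because the tool is plain Python, that contract can be exercised without the agent, the Gemini API, or any network calls. The sketch below is a condensed stand-in for the `get_weather` above (the report text is shortened), not ADK API:

```python
def get_weather(city: str) -> dict:
    """Condensed mirror of the tool above: status plus report or error_message."""
    if city.lower() == "new york":
        return {"status": "success", "report": "Sunny, 25 degrees Celsius."}
    return {
        "status": "error",
        "error_message": f"Weather information for '{city}' is not available.",
    }


print(get_weather("New York")["status"])  # success
print(get_weather("Paris")["status"])     # error
```

Keeping tools as ordinary functions like this makes them unit-testable before they are handed to `Agent(tools=[...])`.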
When using Vertex AI Agent Engine for remote deployment, instrumentation must be configured within the remote agent module, not in the main application code.
For Agent Engine deployment, include the instrumentation packages in your requirements and set up instrumentation in your agent module. Remote Agent Module:
```python
from phoenix.otel import register
from openinference.instrumentation.google_adk import GoogleADKInstrumentor

# Configure instrumentation within the remote agent
tracer_provider = register(
    project_name="adk-agent",
)
GoogleADKInstrumentor().instrument(tracer_provider=tracer_provider)

# Your agent code here...
```