Demonstrates the handoff tool pattern - where control is transferred from the LLM to a custom function.
This example creates a voice agent that:
- Starts as a normal conversational assistant
- When the user says "I'm ready to talk to myself", hands off to an echo function
- The echo function then repeats everything the user says with a configurable prefix
```bash
cd examples/echo
GEMINI_API_KEY=your-key uv run python main.py
```

The echo tool is decorated with `@handoff_tool`, which means:
- When called, it takes over processing of all future events
- The LLM is no longer involved after the handoff
- The tool receives an `event` parameter for each subsequent user input
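The dispatch mechanics can be sketched framework-agnostically. Everything below - the event classes and the `run_handoff` loop - is a hypothetical stand-in for the real runtime; only the control flow matches the description above: the tool is invoked once with the handoff event, then once per subsequent user event, reusing the arguments captured at handoff time.

```python
import asyncio
from dataclasses import dataclass

# Hypothetical stand-ins for the runtime's event types.
@dataclass
class AgentHandedOff:
    pass

@dataclass
class UserTextSent:
    content: str

# Simplified echo tool: same shape as the example, minus the framework types.
async def echo(prefix, event):
    if isinstance(event, AgentHandedOff):
        yield f"Echo mode activated! I'll prefix everything with '{prefix}'"
    elif isinstance(event, UserTextSent):
        yield f"{prefix}: {event.content}"

async def run_handoff(tool, captured_args, user_events):
    """After handoff, every event goes to the tool - the LLM is out of the loop."""
    outputs = []
    # Called once with the handoff event itself...
    async for text in tool(*captured_args, AgentHandedOff()):
        outputs.append(text)
    # ...then once per subsequent user event, reusing the captured arguments.
    for event in user_events:
        async for text in tool(*captured_args, event):
            outputs.append(text)
    return outputs

outputs = asyncio.run(run_handoff(echo, ("Echo",),
                                  [UserTextSent("hello"), UserTextSent("bye")]))
print(outputs)
```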
```python
@handoff_tool
async def echo(ctx: ToolEnv, prefix: Annotated[str, "A prefix to add..."], event):
    """Echo the user's message back to them with a prefix."""
    if isinstance(event, AgentHandedOff):
        # Called once when handoff occurs
        yield AgentSendText(text=f"Echo mode activated! I'll prefix everything with '{prefix}'")
        return
    if isinstance(event, UserTurnEnded):
        # Called for each user message after handoff
        for item in event.content:
            if isinstance(item, UserTextSent):
                yield AgentSendText(text=f"{prefix}: {item.content}")
```

- `ctx`: Injected by the system - provides access to the turn environment
- `prefix`: Provided by the LLM when calling the tool - captured at handoff time
- `event`: Injected by the system - the current input event being processed
- Handoff tools: Transfer control from the LLM to custom logic
- `AgentHandedOff`: Event received when the handoff first occurs
- `UserTurnEnded`: Event received for each subsequent user message
- Argument capture: LLM-provided arguments (like `prefix`) are captured at handoff time and reused for all subsequent calls
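Argument capture is essentially partial application. A minimal illustration, with `functools.partial` standing in for the runtime's capture step (the function here is illustrative, not the framework's API):

```python
from functools import partial

def echo(prefix, message):
    return f"{prefix}: {message}"

# At handoff time, the runtime binds the LLM-supplied arguments once...
captured = partial(echo, "Echo")

# ...and reuses them for every later user message.
print(captured("hello"))    # Echo: hello
print(captured("goodbye"))  # Echo: goodbye
```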