
Changes to support Gemini Live in Pipecat Flows#3620

Open
kompfner wants to merge 11 commits into main from pk/flows-gemini-live-support

Conversation

Contributor

@kompfner kompfner commented Feb 2, 2026

Please read commit messages for change descriptions.

Pipecat Flows companion PR: pipecat-ai/pipecat-flows#232

@kompfner kompfner force-pushed the pk/flows-gemini-live-support branch 2 times, most recently from 47940fd to d39a670 Compare February 2, 2026 21:28
- Detect when a newly-received context warrants a reconnection to the Gemini Live API; we need to reconnect in order to re-seed new conversation history or swap out the current set of tools. This reconnection occurs when Pipecat Flows transitions between conversational nodes, as context has been edited and/or tools added/removed.
- Strip function call and response messages out of context before sending to Gemini Live when seeding conversation history, to sidestep a seeming Gemini Live limitation (see https://stackoverflow.com/a/79851394)
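The reconnect-vs-reuse decision described above can be sketched in plain Python. This is a minimal illustrative sketch, not the PR's actual implementation: the function name and the dict/list representations of messages and tools are assumptions.

```python
def needs_reconnect(old_messages, new_messages, old_tools, new_tools):
    """Hypothetical sketch: decide whether a newly-received context
    warrants reconnecting to the Gemini Live API.

    A reconnect is needed when the tool set changed, or when history was
    edited (i.e. the new context is not simply the old one with messages
    appended), since re-seeding history or swapping tools requires a
    fresh session.
    """
    if old_tools != new_tools:
        return True
    if new_messages[: len(old_messages)] != old_messages:
        return True
    return False
```

On a Flows node transition, both conditions typically fire at once: context is edited and tools are swapped, so a reconnect happens either way.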

codecov bot commented Feb 3, 2026

Codecov Report

❌ Patch coverage is 68.30601% with 58 lines in your changes missing coverage. Please review.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| src/pipecat/services/google/gemini_live/llm.py | 12.12% | 29 Missing ⚠️ |
| src/pipecat/adapters/services/gemini_adapter.py | 20.83% | 19 Missing ⚠️ |
| ...t/processors/aggregators/llm_response_universal.py | 70.83% | 7 Missing ⚠️ |
| src/pipecat/processors/aggregators/llm_context.py | 97.14% | 2 Missing ⚠️ |
| src/pipecat/frames/frames.py | 83.33% | 1 Missing ⚠️ |
| Files with missing lines | Coverage Δ |
|---|---|
| src/pipecat/adapters/schemas/tools_schema.py | 92.85% <100.00%> (+6.19%) ⬆️ |
| src/pipecat/frames/frames.py | 88.90% <83.33%> (+0.08%) ⬆️ |
| src/pipecat/processors/aggregators/llm_context.py | 70.31% <97.14%> (+15.91%) ⬆️ |
| ...t/processors/aggregators/llm_response_universal.py | 77.17% <70.83%> (-0.40%) ⬇️ |
| src/pipecat/adapters/services/gemini_adapter.py | 46.80% <20.83%> (-2.81%) ⬇️ |
| src/pipecat/services/google/gemini_live/llm.py | 19.32% <12.12%> (-0.41%) ⬇️ |

... and 19 files with indirect coverage changes


…hange to the context is newly-appended messages, send them to the server.

This requires us to distinguish between newly-appended "bookkeeping" messages that merely reflect what Gemini Live already said, and messages that were programmatically inserted, such as from the transition to a new Pipecat Flows node.

This change makes it so that using `LLMMessagesAppendFrame` has the desired effect of updating the Gemini Live conversation.
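The bookkeeping-vs-programmatic distinction above can be illustrated with a crude sketch. This is an assumption about shape, not the PR's actual logic: here, programmatically inserted messages (e.g. those pushed via `LLMMessagesAppendFrame`) are imagined to carry a hypothetical `"programmatic"` flag, while bookkeeping entries that merely mirror what the live session already said or heard are skipped.

```python
def messages_to_send(appended):
    """Illustrative heuristic (hypothetical flag, not Pipecat's API):
    forward only programmatically inserted messages to the server.

    Bookkeeping entries, which just record audio the server already
    produced or heard, carry no flag and are filtered out.
    """
    return [m for m in appended if m.get("programmatic", False)]
```

In practice the aggregator would need some such provenance signal, since the message content alone cannot reliably tell the two kinds apart.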
@kompfner kompfner force-pushed the pk/flows-gemini-live-support branch from be6612e to 44b917c Compare February 3, 2026 20:45
…ing a model response twice when there is a Flows node transition that returns a value from the transition function and loads new context messages in the new node
… triggering a model response twice when there is a Flows node transition that returns a value from the transition function and loads new context messages in the new node"

This reverts commit 90e6f0d.
…he case of programmatically-appended context messages, do a full reconnection. I've tried many approaches to get "live updates" to context working reliably and have been unable to (Gemini code comments also warn against doing "live updates" after the initial context seeding, prior to starting audio input).
…trip function call and response messages out of context before sending to Gemini Live when seeding conversation history (which we were doing to sidestep a seeming Gemini Live limitation; see https://stackoverflow.com/a/79851394), instead convert them to regular text messages with special formatting.
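The conversion described in this commit could look roughly like the sketch below. The bracketed text format is an assumption for illustration; the PR does not show the actual formatting Pipecat uses.

```python
def function_messages_to_text(messages):
    """Illustrative sketch: rewrite function call/response entries as
    specially formatted plain-text messages, so the seeded history
    contains only regular text (sidestepping the Gemini Live limitation
    with function messages in seeded history).
    """
    out = []
    for m in messages:
        if m["role"] == "assistant" and "function_call" in m:
            call = m["function_call"]
            out.append({
                "role": "assistant",
                "content": f"[called function {call['name']} with {call['args']}]",
            })
        elif m["role"] == "tool":
            # Function results re-enter the conversation as user text.
            out.append({
                "role": "user",
                "content": f"[function returned: {m['content']}]",
            })
        else:
            out.append(m)
    return out
```

Compared with stripping these messages outright, this preserves the information the model needs to stay consistent with earlier tool use.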
@@ -0,0 +1 @@
- Added support in Gemini Live for programmatically swapping tools or editing context at runtime; you can now use `LLMMessagesAppendFrame`, `LLMMessagesUpdateFrame`, and `LLMSetToolsFrame` with Gemini Live, just as you would with text-to-text services. Importantly, this change enables using Gemini Live with Pipecat Flows, provided you're on Pipecat Flows 0.0.23 or above.
Contributor Author


TODO: update the Pipecat Flows version mentioned here if it turns out the necessary Flows changes don't make it in that version.

@kompfner kompfner force-pushed the pk/flows-gemini-live-support branch from 2d6fcd2 to c49ee86 Compare February 5, 2026 16:25
@kompfner kompfner force-pushed the pk/flows-gemini-live-support branch from c49ee86 to e039abd Compare February 5, 2026 16:28
@kompfner kompfner marked this pull request as ready for review February 5, 2026 18:32
…ing context in a frame-based way.

The previous approach required the caller to grab a reference to the context object directly, take a "snapshot" of its messages *at that point in time*, transform the messages, and then push an `LLMMessagesUpdateFrame` with the transformed messages. This approach can lead to problems: what if a change to the context was already queued in the pipeline? The transformed messages would simply overwrite it without consideration.
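The lost-update hazard described above can be shown with plain dicts (an illustrative sketch, not Pipecat's actual context type):

```python
# A toy context standing in for the LLM context object.
context = {"messages": [{"role": "user", "content": "hi"}]}

# 1. The caller snapshots the messages at time T.
snapshot = list(context["messages"])

# 2. Meanwhile, a change already queued in the pipeline lands.
context["messages"].append({"role": "assistant", "content": "hello"})

# 3. The caller pushes its transformed snapshot, overwriting the
#    context: the queued "hello" message is silently lost.
context["messages"] = snapshot + [{"role": "system", "content": "new node"}]
```

A frame-based edit avoids this because the transformation is applied where the context lives, after any queued changes have already been folded in.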
