Open
Labels
enhancement (New feature or request), ✨ feature request (New feature request)
Description
🦆 Duck Enhancement Proposal
💡 The Problem
All responses are currently buffered: the client waits until every LLM has finished before seeing anything. For duck_council (all providers), duck_debate (multi-round), and duck_iterate (back-and-forth), this means long waits with zero feedback.
🚀 Proposed Solution
Stream partial results as each duck completes its response, rather than waiting for all ducks to finish.
What to stream:
- `compare_ducks` / `duck_council` → emit each provider's response as it arrives
- `duck_debate` → emit each round as it completes
- `duck_iterate` → emit each refinement step
- Single-duck tools (`ask_duck`, `chat_with_duck`) → stream token-by-token
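The per-duck case above can be sketched as an async generator that races the pending provider calls and yields each answer the moment it settles, instead of awaiting them all. This is a minimal illustration; `streamCouncil`, the provider names, and the call signatures are assumptions, not the project's actual API.

```typescript
// Sketch: emit each duck's full response as soon as it settles,
// rather than blocking on Promise.all until every duck is done.
type DuckResult = { provider: string; text: string };

async function* streamCouncil(
  ducks: Map<string, () => Promise<string>>
): AsyncGenerator<DuckResult> {
  // Tag each in-flight call with its provider so we know who finished.
  const pending = new Map(
    [...ducks].map(([provider, call]) => [
      provider,
      call().then((text) => ({ provider, text })),
    ])
  );
  while (pending.size > 0) {
    // Promise.race resolves with whichever duck answers first.
    const result = await Promise.race(pending.values());
    pending.delete(result.provider);
    yield result; // emit immediately; slower ducks keep running
  }
}

// Usage with simulated providers of different latencies:
const ducks = new Map<string, () => Promise<string>>([
  ["gpt", () => new Promise((r) => setTimeout(() => r("quack A"), 50))],
  ["claude", () => new Promise((r) => setTimeout(() => r("quack B"), 10))],
]);

for await (const { provider, text } of streamCouncil(ducks)) {
  console.log(`${provider}: ${text}`); // faster duck is printed first
}
```

A real implementation would also need to decide how a rejected provider promise is surfaced (here `Promise.race` would simply reject the whole stream).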
Technical notes:
- OpenAI SDK already supports streaming (`stream: true`); it is currently hardcoded to `false` in `DuckProvider`
- MCP spec supports progress notifications for long-running operations
- The upcoming multi-turn SSE transport SEP will further improve streaming support
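For the token-by-token case, un-hardcoding `stream: true` roughly means consuming the SDK's chunk stream and forwarding each delta as it arrives. A minimal sketch follows; the chunk shape matches the OpenAI SDK's chat-completion stream, but `relayTokens`, `onToken`, and the mock stream are illustrative assumptions, not `DuckProvider`'s real interface.

```typescript
// Sketch: forward each streamed token via a callback (e.g. an MCP
// progress notification) while also accumulating the full text.
type StreamChunk = { choices: Array<{ delta?: { content?: string } }> };

async function relayTokens(
  stream: AsyncIterable<StreamChunk>,
  onToken: (token: string) => void
): Promise<string> {
  let full = "";
  for await (const chunk of stream) {
    const token = chunk.choices[0]?.delta?.content ?? "";
    if (token) {
      full += token;
      onToken(token); // emit partial output as soon as it arrives
    }
  }
  return full; // complete text for the final tool result
}

// With the real SDK this would be fed by something like:
//   const stream = await client.chat.completions.create({ model, messages, stream: true });
// Here, a mock stream stands in for the network call:
async function* mockStream(): AsyncGenerator<StreamChunk> {
  for (const t of ["Qu", "ack", "!"]) {
    yield { choices: [{ delta: { content: t } }] };
  }
}
```

Returning the accumulated string keeps the existing non-streaming callers working while streaming clients subscribe to `onToken`.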
🔗 Related
- feat: Long-Running Tasks support (MCP 2025 spec) #15: a complementary feature
- MCP roadmap: multi-turn SSE transport is an active SEP