
Fix gemini/anthropic on_llm_end payload #197

Merged

veithly merged 2 commits into XSpoonAi:main from helloissariel:stream
Dec 3, 2025

Conversation

@helloissariel
Contributor

  • Fix gemini_provider streaming end hook to emit LLMResponse instead of LLMResponseChunk (avoids missing delta validation errors)
  • Align anthropic_provider on_llm_end behavior with the same fix for consistency
  • Preserve token usage data and per-token callbacks; only the final callback payload changes
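A minimal sketch of the behavior described above. The `LLMResponse` and `LLMResponseChunk` names come from the PR; their fields and the `stream_completion`/callback signatures here are illustrative assumptions, not the project's actual API. The point is that a chunk type requires a `delta`, so reusing it for the final `on_llm_end` payload fails validation, while a dedicated response type carries the full content and usage data:

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

# Hypothetical stand-ins for the provider payload types: a streaming
# chunk requires a `delta`, while the final response does not.
@dataclass
class LLMResponseChunk:
    delta: str  # required; an end-of-stream payload with no delta would fail validation

@dataclass
class LLMResponse:
    content: str
    usage: Optional[dict] = None  # token usage preserved from the stream

def stream_completion(
    tokens: List[str],
    on_token: Callable[[LLMResponseChunk], None],
    on_llm_end: Callable[[LLMResponse], None],
) -> None:
    """Fire per-token callbacks, then emit a complete LLMResponse at the end."""
    parts: List[str] = []
    for tok in tokens:
        parts.append(tok)
        on_token(LLMResponseChunk(delta=tok))  # per-token callbacks unchanged
    # The fix: the end hook receives an LLMResponse (no `delta` field),
    # not an LLMResponseChunk with a missing delta.
    on_llm_end(
        LLMResponse(content="".join(parts), usage={"output_tokens": len(parts)})
    )

chunks: List[LLMResponseChunk] = []
finals: List[LLMResponse] = []
stream_completion(["Hel", "lo"], chunks.append, finals.append)
```

Only the final callback payload changes type; the per-token stream and the accumulated usage data pass through as before.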

@veithly veithly merged commit 00951f9 into XSpoonAi:main Dec 3, 2025
1 check passed
