LLM output uniformizer for all providers #25813
mschoenb97IL
announced in
Ideas
Replies: 1 comment 1 reply
I think adding a utility makes the most sense, at least to start. I'll open a PR shortly; would love feedback!
Feature request
Right now, some of the message chunks from streaming output come in different formats depending on the LLM provider. For example, Anthropic yields a list of content blocks while OpenAI yields a string. Furthermore, Llama models served via
ChatBedrockConverse
have AIMessage content fields that are lists rather than strings. Can these outputs either be made uniform, or can a utility be exposed to uniformize them?
Motivation
I want to be able to switch between providers easily without breaking downstream processing.
Proposal (If applicable)
No response
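A minimal sketch of what such a utility could look like, assuming LangChain-style message content where the `content` field is either a plain string or a list of content blocks (the function name `coerce_content_to_string` is hypothetical, not an existing LangChain API):

```python
from typing import Union

def coerce_content_to_string(content: Union[str, list]) -> str:
    """Collapse provider-specific message content into a single string.

    Handles the shapes described above: a plain string (OpenAI-style)
    or a list of blocks (Anthropic / ChatBedrockConverse-style), where
    each block is either a string or a dict like {"type": "text", ...}.
    """
    if isinstance(content, str):
        return content
    parts = []
    for block in content:
        if isinstance(block, str):
            parts.append(block)
        elif isinstance(block, dict) and block.get("type") == "text":
            parts.append(block.get("text", ""))
        # Non-text blocks (tool-use, images, ...) are skipped in this sketch.
    return "".join(parts)

# OpenAI-style content (already a string):
print(coerce_content_to_string("Hello"))  # Hello
# Anthropic/Bedrock-style content (list of text blocks):
print(coerce_content_to_string(
    [{"type": "text", "text": "Hel"}, {"type": "text", "text": "lo"}]
))  # Hello
```

Applied to every chunk (or to the final AIMessage), this would give downstream code a single string shape regardless of provider.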