feat: Support auto-streaming with langchain #1271
Conversation
Force-pushed from b556127 to 3b40abe
    // Todo: Extend BaseChatModelParams?
    this.streaming =
      (langchainOptions as { streaming?: boolean })?.streaming ?? false;
The `streaming` option is supported here, but I felt that extending or replacing `BaseChatModelParams` might be considered a breaking change, which would force users to work around type checks.
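The trade-off described above can be sketched as follows. All names here are hypothetical stand-ins, not the SDK's actual types: the point is reading an optional `streaming` flag from loosely typed constructor options via a narrow type assertion instead of widening the public params interface.

```typescript
// Hypothetical sketch: KnownParams stands in for BaseChatModelParams.
interface KnownParams {
  temperature?: number;
}

class ChatClientSketch {
  streaming: boolean;
  constructor(options: KnownParams = {}) {
    // Narrow cast picks up the flag without changing the public interface,
    // so existing type contracts stay untouched (no breaking change).
    this.streaming =
      (options as { streaming?: boolean })?.streaming ?? false;
  }
}

const withFlag = new ChatClientSketch({ temperature: 0.1, streaming: true } as KnownParams);
const withoutFlag = new ChatClientSketch({ temperature: 0.2 });
console.log(withFlag.streaming, withoutFlag.streaming); // true false
```

The cost of this approach is that the flag is invisible to the type checker, which is the trade-off the comment above accepts to avoid a breaking change.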
    finalOutput =
      finalOutput !== undefined ? finalOutput.concat(chunk) : chunk;
    }
A snapshot test directly on the final output is not possible here, because LangGraph adds UUIDs to the response object; other tests should already verify the general execution with auto-streaming enabled.
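The aggregation pattern quoted above can be shown self-contained. `MessageChunk` here is a simplified stand-in, not LangChain's actual chunk class; the fold over `concat` mirrors how the diff collects streamed chunks into one final output.

```typescript
// Simplified stand-in for a LangChain-style message chunk.
class MessageChunk {
  constructor(public content: string) {}
  concat(other: MessageChunk): MessageChunk {
    return new MessageChunk(this.content + other.content);
  }
}

// Fold streamed chunks into a single final output, as in the diff above.
function collect(chunks: Iterable<MessageChunk>): MessageChunk | undefined {
  let finalOutput: MessageChunk | undefined;
  for (const chunk of chunks) {
    finalOutput =
      finalOutput !== undefined ? finalOutput.concat(chunk) : chunk;
  }
  return finalOutput; // undefined when the stream produced no chunks
}

const result = collect([new MessageChunk('Hello'), new MessageChunk(', world')]);
console.log(result?.content); // "Hello, world"
```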
Force-pushed from df681d3 to 259f0da
Force-pushed from 758eb8f to cf39b95
Force-pushed from cf39b95 to fd45bf3
ZhongpinWang left a comment
I had a first look at the OpenAI client. Please check my comments and, if needed, apply the changes to the orchestration client as well.
@KavithaSiva Please also have a look at this PR, since it contains some behaviour changes around streaming that you might want to know about. 😀
ZhongpinWang left a comment
Thanks for the changes. 👍 I added some more comments.
    // - `streaming`: true enables auto-streaming in `invoke()` calls
    // - `disableStreaming`: true overrides the streaming flag
    // - `streaming`: `false` causes `disableStreaming` to be set to `true` for framework compatibility
    this.disableStreaming = fields.disableStreaming ?? this.disableStreaming;
[req] The difference between your implementation and https://github.com/langchain-ai/langchainjs/blob/cc502e1b67dbadcd123a7ea2964c791c2bbad581/libs/providers/langchain-openai/src/chat_models/base.ts#L479 is that the OpenAI implementation always sets the value to `false` unless it is strictly equal to `true`. Your solution using `??` might be problematic if a user writes plain JS and passes anything other than `undefined`, such as `''` or `0`.
Also, your left-hand side can be `undefined` if it was previously `undefined`. Later, in line 82, if the `if` condition does not apply, `this.disableStreaming` will stay `undefined` forever, which differs from the OpenAI behaviour. (Although https://github.com/langchain-ai/langchainjs/blob/cc502e1b67dbadcd123a7ea2964c791c2bbad581/libs/langchain-core/src/language_models/chat_models.ts#L220 shows LangChain sets it to `false` by default, so this is not a real issue, I hope you see my point about your implementation and OpenAI's technically being different. Be careful with `??` in TS; I personally find `bar = foo === true` a smart way to make sure `bar` is either `true` or `false` while excluding all other truthy values.)
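The point above can be illustrated in isolation (both function names below are made up for this sketch): `??` only falls back on `null`/`undefined`, so falsy non-boolean values from plain-JS callers leak through, while `=== true` normalizes everything to a real boolean.

```typescript
// Fallback via nullish coalescing: only null/undefined trigger the default.
function viaNullish(streaming?: boolean): boolean {
  return streaming ?? false;
}

// Strict equality: every non-`true` value (false, '', 0, 'yes', 1) becomes false.
function viaStrictEquality(streaming?: boolean): boolean {
  return streaming === true;
}

// A plain-JS caller can hand in a non-boolean despite the type annotation:
const fromJs = 0 as unknown as boolean;
console.log(viaNullish(fromJs));           // 0 -- a falsy non-boolean escapes
console.log(viaStrictEquality(fromJs));    // false
console.log(viaStrictEquality(undefined)); // false
```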
I replaced all `??` instances in the relevant code with strict boolean equality checks.
ZhongpinWang left a comment
Minor suggestions
ZhongpinWang left a comment
Thanks for implementing this nice feature. Let's merge it!
ZhongpinWang left a comment
Oh, before we merge, can you add a changeset for the changes?
I've added two changesets. A previous changeset was lost during squashing/rebasing, but I've expanded it to match the extended scope of the PR.
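For context, a changeset is a small markdown file under `.changeset/` with a YAML frontmatter listing the affected packages and their semver bump types, followed by a summary. The package name and summary below are placeholders, not this PR's actual changeset:

```md
---
'@sap-ai-sdk/langchain': minor
---

Support auto-streaming in `invoke()` when `streaming: true` is passed to the constructor.
```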
Force-pushed from 23d4db5 to aa167c5
ZhongpinWang left a comment
Thanks.
Context
Closes SAP/ai-sdk-js-backlog#344.
What this PR does and why it is needed
This PR adds basic support for passing `streaming: true` to the constructors and saving it as an attribute. Results are collected via `.concat()`. In addition, this PR updates the unit tests, sample code, and e2e tests.
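Putting the pieces of the PR together, the behaviour can be sketched end to end. Everything below is a self-contained mock (class and method names are assumed, not the SDK's API): a `streaming: true` constructor flag makes `invoke()` consume an internal stream and return the concatenated result.

```typescript
// Self-contained mock of the auto-streaming behaviour (not the SDK's API).
class FakeChatModel {
  streaming: boolean;

  constructor(options: { streaming?: boolean } = {}) {
    // Strict boolean check, following the review discussion above.
    this.streaming = options.streaming === true;
  }

  // Fake token stream standing in for a real model response.
  private async *stream(prompt: string): AsyncGenerator<string> {
    for (const piece of [prompt, ' ->', ' done']) {
      yield piece;
    }
  }

  async invoke(prompt: string): Promise<string> {
    if (!this.streaming) {
      return `${prompt} -> done`; // non-streaming single response
    }
    // Auto-streaming path: collect chunks via concat, as in the diff.
    let finalOutput: string | undefined;
    for await (const chunk of this.stream(prompt)) {
      finalOutput =
        finalOutput !== undefined ? finalOutput.concat(chunk) : chunk;
    }
    return finalOutput ?? '';
  }
}

const model = new FakeChatModel({ streaming: true });
model.invoke('hi').then(result => console.log(result)); // "hi -> done"
```

Both paths return the same final text, which is what lets callers opt into streaming without changing how they consume the result of `invoke()`.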