improve performance of responses #991
Conversation
🦋 Changeset detected. Latest commit: 2629191. The changes in this PR will be included in the next version bump. This PR includes changesets to release 3 packages.
This is still buffering all chunks and delivering them as one, though, right? We should definitely look at streaming without doing that (you only need to buffer until the headers have been delivered; you can stream after that). E.g., using Next's MockResponse: https://github.com/vercel/next.js/blob/v15.3.0-canary.13/packages/next/src/server/lib/mock-request.ts and then just wiring it up (or using https://github.com/mhart/fetch-to-node or similar).
Yes, technically we shouldn't store them in memory; we should stream them directly without copying/storing.
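To make the "stream instead of buffer" idea concrete, here is a minimal sketch. The subscribe/onChunk/onDone callback shape is hypothetical and not opennextjs's actual internal API; it just shows chunks being forwarded to the client as they are produced rather than accumulated first:

```javascript
// Sketch: deliver chunks to the client as they are produced instead of
// buffering the whole body first. The subscribe/onChunk/onDone shape is
// hypothetical — opennextjs's real internals differ.
function chunksToStream(subscribe) {
  return new ReadableStream({
    start(controller) {
      subscribe({
        // Forward each chunk immediately; no copy, no accumulation.
        onChunk(chunk) {
          controller.enqueue(chunk);
        },
        // Close the stream once the producer is done.
        onDone() {
          controller.close();
        },
      });
    },
  });
}
```

The key property is that `enqueue` happens per chunk, so the first byte can leave as soon as headers are out, which is exactly what a TTFB benchmark measures.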
well, splitting the

controller.close();
  },
})
: ReadableStream.from(res._chunks);
Everything in this repo is targeting Node 18 (including all esbuild configs and TypeScript configs), so I suspect this may require a major bump.
Ugh, yep...
Welcome to Node.js v18.20.8.
Type ".help" for more information.
> ReadableStream.from
undefined
>
I have to assume this is why the current code was used.
We can avoid the semver-major bump with a bit more work, though; it would just require an impl like:
async function* gen() {
for (const chunk of _chunks) {
yield chunk;
}
}
const generator = gen();
new ReadableStream({
async pull(controller) {
const next = await generator.next();
if (next.done) {
controller.close();
} else {
controller.enqueue(next.value);
}
}
});
Or something along those lines. Bit more work. Bumping the major would be preferred, however, especially since 18 is so far behind.
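Packaging the fallback behind feature detection would keep Node 18 working while using the native path on newer Node versions. A sketch along those lines (`streamFromChunks` is a hypothetical helper name, not something from the PR):

```javascript
// Sketch of a Node-18-compatible fallback: use ReadableStream.from()
// where it exists, otherwise drive the stream from an async generator.
// streamFromChunks is a hypothetical helper name.
function streamFromChunks(chunks) {
  if (typeof ReadableStream.from === 'function') {
    return ReadableStream.from(chunks);
  }
  async function* gen() {
    for (const chunk of chunks) {
      yield chunk;
    }
  }
  const generator = gen();
  return new ReadableStream({
    // pull() is called whenever the consumer wants more data, so chunks
    // are handed over one at a time without buffering them all first.
    async pull(controller) {
      const next = await generator.next();
      if (next.done) {
        controller.close();
      } else {
        controller.enqueue(next.value);
      }
    },
  });
}
```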
Question tho... if this is targeting 18.x... is CI not targeting that?
Correct, see #992
LGTM, thanks a lot for this, guys.
Thanks all for this work on this 🎉
Awesome 👍
For benchmarks that check TTFB (time to first byte), opennextjs was performing poorly. This was due to several factors, such as returning the body as one huge chunk with unnecessary copies.
I've also updated CI to use Node.js v20, since ReadableStream.from() requires it and v18 is EOL anyway.
This improves response times by roughly 20%.
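The TTFB effect of buffering can be illustrated with a small self-contained simulation (the 5 chunks and 20 ms per-chunk delay are arbitrary stand-ins for render work, not measured numbers from this PR):

```javascript
// Sketch illustrating why buffering hurts TTFB. Both producers do the
// same simulated work (5 chunks, 20 ms each — arbitrary numbers), but
// the buffered one only emits after every chunk is ready.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

function streamedBody() {
  return new ReadableStream({
    async start(controller) {
      for (let i = 0; i < 5; i++) {
        await delay(20); // simulate per-chunk render work
        controller.enqueue(`chunk-${i}`); // first chunk out after ~20 ms
      }
      controller.close();
    },
  });
}

function bufferedBody() {
  return new ReadableStream({
    async start(controller) {
      const chunks = [];
      for (let i = 0; i < 5; i++) {
        await delay(20);
        chunks.push(`chunk-${i}`);
      }
      controller.enqueue(chunks.join('')); // one huge chunk after ~100 ms
      controller.close();
    },
  });
}

async function firstChunkLatency(makeBody) {
  const start = performance.now();
  const reader = makeBody().getReader();
  await reader.read(); // time to first chunk ≈ TTFB
  return performance.now() - start;
}
```

Under these assumptions the streamed body yields its first chunk about five times sooner, even though total work is identical.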
PS: AI has eroded my trust in pull-request descriptions with lists, but I wrote one regardless...