
Conversation

@mydea (Member) commented Sep 10, 2025

This adds a better way to ensure we flush in Node-based Vercel functions.

Note that this does not work in edge functions, so we need to keep the vercelWaitUntil code around for those. But I believe we can no-op it in Node and instead rely on this better way of handling it.
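Roughly, the approach described above could look like this (a minimal sketch, not the actual SDK code; `registerVercelSigtermFlush` and the `client` shape are illustrative, assuming a `flush(timeout)` method returning a Promise as Sentry clients expose):

```javascript
// Sketch only: register a SIGTERM handler on Vercel's Node runtime that
// flushes pending Sentry events before the process is terminated.
// Assumption: `client.flush(timeout)` resolves once queued events are
// processed or the timeout elapses. The function name is made up here.
function registerVercelSigtermFlush(client) {
  const handler = async () => {
    // Vercel allows roughly 500ms of processing after SIGTERM, so flush
    // with a shorter timeout to leave headroom for the HTTP request that
    // actually delivers the events.
    await client.flush(200);
  };
  // Only hook this up when actually running on Vercel.
  if (process.env.VERCEL) {
    process.on('SIGTERM', handler);
  }
  return handler;
}
```

Edge functions would still need the vercelWaitUntil path, since `process.on` is not available there.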

@chargome when you're back, can you verify if this makes sense? 😅

Closes #17567

@mydea mydea self-assigned this Sep 10, 2025
Contributor

size-limit report 📦

| Path | Size | % Change | Change |
| --- | --- | --- | --- |
| @sentry/browser | 24.17 kB | - | - |
| @sentry/browser - with treeshaking flags | 22.75 kB | - | - |
| @sentry/browser (incl. Tracing) | 40.14 kB | - | - |
| @sentry/browser (incl. Tracing, Replay) | 78.5 kB | - | - |
| @sentry/browser (incl. Tracing, Replay) - with treeshaking flags | 68.26 kB | - | - |
| @sentry/browser (incl. Tracing, Replay with Canvas) | 83.18 kB | +0.01% | +1 B 🔺 |
| @sentry/browser (incl. Tracing, Replay, Feedback) | 95.38 kB | - | - |
| @sentry/browser (incl. Feedback) | 40.91 kB | - | - |
| @sentry/browser (incl. sendFeedback) | 28.82 kB | - | - |
| @sentry/browser (incl. FeedbackAsync) | 33.77 kB | - | - |
| @sentry/react | 25.89 kB | - | - |
| @sentry/react (incl. Tracing) | 42.11 kB | - | - |
| @sentry/vue | 28.66 kB | +0.01% | +1 B 🔺 |
| @sentry/vue (incl. Tracing) | 41.95 kB | - | - |
| @sentry/svelte | 24.2 kB | - | - |
| CDN Bundle | 25.76 kB | - | - |
| CDN Bundle (incl. Tracing) | 40.01 kB | - | - |
| CDN Bundle (incl. Tracing, Replay) | 76.29 kB | - | - |
| CDN Bundle (incl. Tracing, Replay, Feedback) | 81.75 kB | - | - |
| CDN Bundle - uncompressed | 75.2 kB | - | - |
| CDN Bundle (incl. Tracing) - uncompressed | 118.31 kB | - | - |
| CDN Bundle (incl. Tracing, Replay) - uncompressed | 233.4 kB | - | - |
| CDN Bundle (incl. Tracing, Replay, Feedback) - uncompressed | 246.16 kB | - | - |
| @sentry/nextjs (client) | 44.14 kB | - | - |
| @sentry/sveltekit (client) | 40.58 kB | - | - |
| @sentry/node-core | 49.86 kB | +0.07% | +33 B 🔺 |
| @sentry/node | 152.26 kB | +0.03% | +37 B 🔺 |
| @sentry/node - without tracing | 92.26 kB | +0.04% | +33 B 🔺 |
| @sentry/aws-serverless | 105.75 kB | +0.04% | +35 B 🔺 |

View base workflow run

Contributor

node-overhead report 🧳

Note: This is a synthetic benchmark with a minimal express app and does not necessarily reflect the real-world performance impact in an application.

| Scenario | Requests/s | % of Baseline | Prev. Requests/s | Change % |
| --- | --- | --- | --- | --- |
| GET Baseline | 8,977 | - | 8,644 | +4% |
| GET With Sentry | 1,263 | 14% | 1,235 | +2% |
| GET With Sentry (error only) | 5,894 | 66% | 5,576 | +6% |
| POST Baseline | 1,188 | - | 1,170 | +2% |
| POST With Sentry | 465 | 39% | 465 | - |
| POST With Sentry (error only) | 1,052 | 89% | 1,018 | +3% |
| MYSQL Baseline | 3,332 | - | 3,187 | +5% |
| MYSQL With Sentry | 437 | 13% | 377 | +16% |
| MYSQL With Sentry (error only) | 2,640 | 79% | 2,566 | +3% |

View base workflow run

```js
if (process.env.VERCEL) {
  process.on('SIGTERM', async () => {
    // We have 500ms for processing here, so we try to make sure to have enough time to send the events
    await client.flush(200);
  });
}
```
Member

Did you test this on a Vercel function? Generally makes sense, but I'd say we could bump this up a bit more?

Member Author

did not test this so far at all; maybe you can take this PR over when you have some time and test it properly. The reason I went with 200 here is that this is just the timeout for finishing events, and we still need time to actually make the HTTP request, so there needs to be some wiggle room relative to the max. 500ms: if we spend all 500ms processing events, we have no time left to actually send them 😅
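To make that arithmetic explicit (numbers taken from the comment above; the constant names are illustrative, not from the SDK):

```javascript
// Budget split described above: roughly 500ms of total processing time
// after SIGTERM on Vercel, of which 200ms is given to the flush timeout
// for finishing/processing events.
const TOTAL_SIGTERM_BUDGET_MS = 500;
const FLUSH_TIMEOUT_MS = 200;

// What remains as headroom for the HTTP request(s) that actually
// deliver the events:
const httpHeadroomMs = TOTAL_SIGTERM_BUDGET_MS - FLUSH_TIMEOUT_MS;
console.log(httpHeadroomMs); // 300
```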

Member

Ah yeah, that makes sense; I thought this time was already included. I'll take it over.

@chargome chargome assigned chargome and unassigned mydea Sep 18, 2025
@mydea (Member Author) commented Sep 18, 2025

@chargome this PR may also fix this: #17689



Development

Successfully merging this pull request may close these issues.

Use new process.on('SIGTERM') to flush on Vercel

2 participants