fix: batch size accounting in BatchSpanProcessor when queue is full #3089
Codecov Report
❌ Patch coverage is

@@           Coverage Diff           @@
##            main    #3089   +/-   ##
=====================================
  Coverage   80.1%    80.1%
=====================================
  Files        126      126
  Lines      21957    21958      +1
=====================================
+ Hits       17603    17604      +1
  Misses      4354     4354
Pull Request Overview
This PR fixes a bug in the BatchSpanProcessor where the current batch size counter was incorrectly incremented even when spans were dropped due to a full queue, leading to inaccurate batch size accounting.
- Moved the batch size increment logic to only execute when spans are successfully queued
- Restructured the span queuing logic to use a match statement for clearer error handling
- Added test verification for correct batch size tracking when spans are dropped
Comments suppressed due to low confidence (1)
opentelemetry-sdk/src/trace/span_processor.rs:569
- There appears to be an extra closing brace. The function should end after the match statement without this additional closing brace.
}
lalitb left a comment
Good catch, thanks for fixing. I just verified the approach in the log processor, which looks fine.
Please update the Changelog.
updated
Fixes #
`current_batch_size` is incremented even when a span is dropped because the queue is full, but `current_batch_size` is still decremented based on the actual number of spans exported, so the counter drifts out of sync with the queue's real contents.
Changes
- Moved the `current_batch_size.fetch_add` call so it only executes when `try_send` succeeds
Merge requirement checklist
CHANGELOG.md files updated for non-trivial, user-facing changes