Conversation

@dojiong
Contributor

@dojiong dojiong commented Jul 28, 2025

Fixes #

  • When the queue is full, spans are dropped, but current_batch_size is still incremented
  • When spans are later exported, current_batch_size is decremented only by the number of spans actually exported, so the counter drifts upward over time and never recovers
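The drift described above can be reproduced with a minimal model. This is a simplified sketch with hypothetical names (`buggy_counter_after`, a plain `sync_channel` standing in for the span queue); the real code lives in opentelemetry-sdk/src/trace/span_processor.rs:

```rust
use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::mpsc::sync_channel;

// Models the buggy accounting: the counter is bumped unconditionally,
// even when try_send fails because the queue is full.
fn buggy_counter_after(queue_capacity: usize, spans: usize) -> usize {
    let (tx, rx) = sync_channel::<()>(queue_capacity);
    let current_batch_size = AtomicUsize::new(0);
    for _ in 0..spans {
        // Bug: increment happens before we know the span was queued.
        current_batch_size.fetch_add(1, Ordering::Relaxed);
        let _ = tx.try_send(()); // silently fails when the queue is full
    }
    // Export drains the queue and decrements only by the spans
    // that were actually queued.
    while let Ok(()) = rx.try_recv() {
        current_batch_size.fetch_sub(1, Ordering::Relaxed);
    }
    current_batch_size.load(Ordering::Relaxed)
}

fn main() {
    // 4 spans into a 2-slot queue: 2 are dropped, yet the counter
    // ends at 2 instead of 0, and the drift never corrects itself.
    println!("{}", buggy_counter_after(2, 4));
}
```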

Changes

Moved the `current_batch_size.fetch_add` call so it executes only when `try_send` succeeds.
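The change can be sketched as follows. This is a simplified model under assumed names (`Processor`, `demo`; the queue is a plain `sync_channel`), not the actual implementation in span_processor.rs:

```rust
use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::mpsc::{sync_channel, Receiver, SyncSender, TrySendError};

struct Processor {
    sender: SyncSender<()>,
    current_batch_size: AtomicUsize,
    dropped: AtomicUsize,
}

impl Processor {
    fn on_end(&self) {
        match self.sender.try_send(()) {
            Ok(()) => {
                // Fixed: increment only once the span is actually queued.
                self.current_batch_size.fetch_add(1, Ordering::Relaxed);
            }
            Err(TrySendError::Full(_)) => {
                // Queue full: the span is dropped, counter stays untouched.
                self.dropped.fetch_add(1, Ordering::Relaxed);
            }
            Err(TrySendError::Disconnected(_)) => {}
        }
    }
}

fn demo(capacity: usize, spans: usize) -> (usize, usize) {
    // Keep the receiver alive so try_send can succeed.
    let (tx, _rx): (SyncSender<()>, Receiver<()>) = sync_channel(capacity);
    let p = Processor {
        sender: tx,
        current_batch_size: AtomicUsize::new(0),
        dropped: AtomicUsize::new(0),
    };
    for _ in 0..spans {
        p.on_end();
    }
    (
        p.current_batch_size.load(Ordering::Relaxed),
        p.dropped.load(Ordering::Relaxed),
    )
}

fn main() {
    // 4 spans into a 2-slot queue: 2 queued and counted, 2 dropped.
    let (queued, dropped) = demo(2, 4);
    println!("queued={queued} dropped={dropped}");
}
```

The `match` on `try_send` also makes the drop path explicit, which is where the dropped-span metric can be recorded.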

Merge requirement checklist

  • CONTRIBUTING guidelines followed
  • Unit tests added/updated (if applicable)
  • Appropriate CHANGELOG.md files updated for non-trivial, user-facing changes
  • Changes in public API reviewed (if applicable)

@dojiong dojiong requested a review from a team as a code owner July 28, 2025 13:58
@linux-foundation-easycla

linux-foundation-easycla bot commented Jul 28, 2025

CLA Signed

The committers listed above are authorized under a signed CLA.

  • ✅ login: dojiong / name: Lo (f4b856b)
  • ✅ login: cijothomas / name: Cijo Thomas (73aeae9)

@codecov

codecov bot commented Jul 28, 2025

Codecov Report

❌ Patch coverage is 68.18182% with 7 lines in your changes missing coverage. Please review.
✅ Project coverage is 80.1%. Comparing base (42139cb) to head (f4b856b).

File with missing lines | Patch % | Missing lines
opentelemetry-sdk/src/trace/span_processor.rs | 68.1% | 7 ⚠️
Additional details and impacted files
@@          Coverage Diff          @@
##            main   #3089   +/-   ##
=====================================
  Coverage   80.1%   80.1%           
=====================================
  Files        126     126           
  Lines      21957   21958    +1     
=====================================
+ Hits       17603   17604    +1     
  Misses      4354    4354           

☔ View full report in Codecov by Sentry.

@dojiong dojiong changed the title Fix batch size accounting in BatchSpanProcessor when queue is full fix: batch size accounting in BatchSpanProcessor when queue is full Jul 28, 2025
@lalitb lalitb requested a review from Copilot July 28, 2025 16:55
Contributor

Copilot AI left a comment


Pull Request Overview

This PR fixes a bug in the BatchSpanProcessor where the current batch size counter was incorrectly incremented even when spans were dropped due to a full queue, leading to inaccurate batch size accounting.

  • Moved the batch size increment logic to only execute when spans are successfully queued
  • Restructured the span queuing logic to use a match statement for clearer error handling
  • Added test verification for correct batch size tracking when spans are dropped
Comments suppressed due to low confidence (1)

opentelemetry-sdk/src/trace/span_processor.rs:569

  • There appears to be an extra closing brace. The function should end after the match statement without this additional closing brace.
    }

Member

@lalitb lalitb left a comment


Good catch, thanks for fixing. Just verified the approach in the log processor, which looks fine.

Please update the CHANGELOG.

@dojiong
Copy link
Contributor Author

dojiong commented Jul 29, 2025

> Good catch, thanks for fixing. Just verified the approach in the log processor, which looks fine.
>
> Please update the CHANGELOG.

Updated.

@cijothomas cijothomas merged commit 290f4bc into open-telemetry:main Aug 4, 2025
25 checks passed
