
feat: support backoff/retry in OTLP #3126


Draft: scottgerring wants to merge 17 commits into main from feat/retry-logic

Conversation

scottgerring (Contributor) commented Aug 12, 2025

This isn't ready for review! Opening PR early for convenience.

Fixes #3081, building on the work started by @AaronRM 🤝

Changes

A new retry module added to opentelemetry-sdk

This models the kinds of retry an operation may request (retry / can't retry / throttle) and provides a retry_with_backoff helper that wraps a retryable operation and retries it as needed. The helper relies on experimental_async_runtime for its runtime abstraction, which provides the actual pausing between attempts. It also takes a closure to classify the error, so the caller can tell the retry mechanism whether a retry is warranted.
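To make the shape concrete, here is a minimal sketch of such a helper. RetryPolicy and retry_with_backoff are names taken from the PR, but the RetryAction enum, the RetryPolicy fields, and the exact signature are assumptions, and tokio's sleep stands in for the experimental_async_runtime delay:

```rust
use std::future::Future;
use std::time::Duration;

/// What the classifier says should happen after a failed attempt.
pub enum RetryAction {
    /// Transient failure: retry with exponential backoff.
    Retry,
    /// The server asked us to back off for (at least) this long.
    Throttle(Duration),
    /// Permanent failure: give up immediately.
    NoRetry,
}

/// Knobs controlling the backoff loop (hypothetical fields).
pub struct RetryPolicy {
    pub max_retries: u32,
    pub initial_delay: Duration,
    pub max_delay: Duration,
}

/// Runs `operation`, retrying according to `policy`; `classify` tells the
/// loop whether a given error is worth retrying. In the PR the pause comes
/// from the experimental_async_runtime abstraction; tokio is a stand-in here.
pub async fn retry_with_backoff<F, Fut, T, E>(
    policy: RetryPolicy,
    classify: impl Fn(&E) -> RetryAction,
    mut operation: F,
) -> Result<T, E>
where
    F: FnMut() -> Fut,
    Fut: Future<Output = Result<T, E>>,
{
    let mut delay = policy.initial_delay;
    for attempt in 0..=policy.max_retries {
        match operation().await {
            Ok(value) => return Ok(value),
            Err(err) if attempt < policy.max_retries => match classify(&err) {
                RetryAction::Retry => {
                    // Exponential backoff, capped at the policy's maximum.
                    tokio::time::sleep(delay).await;
                    delay = (delay * 2).min(policy.max_delay);
                }
                RetryAction::Throttle(wait) => tokio::time::sleep(wait.max(delay)).await,
                RetryAction::NoRetry => return Err(err),
            },
            Err(err) => return Err(err),
        }
    }
    unreachable!("the final attempt always returns")
}
```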

A new retry_classification module added to opentelemetry-otlp

This takes the error responses we get back over OTLP and maps them onto the retry model. Because this is OTLP-specific, it lives here rather than alongside the generic retry code.
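As an illustration of the mapping involved, a classifier for tonic statuses might look roughly like this. The retryable codes follow the OTLP spec's list; the function name is hypothetical, and RetryAction is repeated from the sketch above so the snippet stands alone:

```rust
use std::time::Duration;
use tonic::{Code, Status};

// Repeated from the sketch above so this snippet stands alone.
pub enum RetryAction {
    Retry,
    Throttle(Duration),
    NoRetry,
}

/// Maps a gRPC status onto the retry model. Per the OTLP spec, a handful of
/// status codes are retryable, and RESOURCE_EXHAUSTED is retryable only when
/// the server attaches RetryInfo with a suggested delay (not decoded here).
pub fn classify_tonic_status(status: &Status) -> RetryAction {
    match status.code() {
        Code::Cancelled
        | Code::DeadlineExceeded
        | Code::Aborted
        | Code::OutOfRange
        | Code::Unavailable
        | Code::DataLoss => RetryAction::Retry,
        // A real implementation would read the RetryInfo detail (e.g. via
        // tonic-types) and use the server's delay instead of this placeholder.
        Code::ResourceExhausted => RetryAction::Throttle(Duration::from_secs(1)),
        _ => RetryAction::NoRetry,
    }
}
```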

Retry binding

... happens in each of the concrete exporters, tying it all together.
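For illustration only, the wiring in an exporter could look roughly like the following. It builds on the hypothetical items sketched above (retry_with_backoff, RetryPolicy, classify_tonic_status), and send_once stands in for a single OTLP request issued by a concrete exporter:

```rust
use std::time::Duration;
use tonic::Status;

// Stand-in for one export attempt against the collector.
async fn send_once() -> Result<(), Status> {
    Err(Status::unavailable("collector not reachable"))
}

// Assumes retry_with_backoff, RetryPolicy and classify_tonic_status from the
// sketches above are in scope.
async fn export_with_retry() -> Result<(), String> {
    let policy = RetryPolicy {
        max_retries: 3,
        initial_delay: Duration::from_millis(100),
        max_delay: Duration::from_secs(5),
    };
    retry_with_backoff(policy, classify_tonic_status, send_once)
        .await
        .map_err(|status| status.message().to_string())
}
```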

Open Questions

  • Do we want user-configurable retry behaviour, e.g. a maximum retry count?
  • Do we want to cap max wait duration, and if so, where?
  • Do we want to make the HTTP exporters depend on experimental_async_runtime always, like the gRPC exporters do?
    • Is this feature really still experimental?

Merge requirement checklist

  • CONTRIBUTING guidelines followed
  • Unit tests added/updated (if applicable)
  • Appropriate CHANGELOG.md files updated for non-trivial, user-facing changes
  • Changes in public API reviewed (if applicable)


codecov bot commented Aug 12, 2025

Codecov Report

❌ Patch coverage is 70.62871% with 939 lines in your changes missing coverage. Please review.
✅ Project coverage is 80.2%. Comparing base (ad88615) to head (1a1ac61).
⚠️ Report is 195 commits behind head on main.

Files with missing lines | Patch % | Missing lines
opentelemetry-sdk/src/metrics/data/mod.rs | 13.4% | 154 ⚠️
opentelemetry-otlp/src/exporter/http/logs.rs | 0.0% | 84 ⚠️
opentelemetry-otlp/src/exporter/http/metrics.rs | 0.0% | 84 ⚠️
opentelemetry-otlp/src/exporter/http/trace.rs | 0.0% | 84 ⚠️
opentelemetry-sdk/src/metrics/mod.rs | 87.0% | 73 ⚠️
opentelemetry-proto/src/transform/metrics.rs | 11.1% | 64 ⚠️
opentelemetry-otlp/src/exporter/http/mod.rs | 80.9% | 63 ⚠️
...-sdk/src/metrics/internal/exponential_histogram.rs | 65.1% | 52 ⚠️
opentelemetry-otlp/src/exporter/tonic/metrics.rs | 0.0% | 49 ⚠️
opentelemetry-otlp/src/exporter/tonic/trace.rs | 0.0% | 46 ⚠️
... and 31 more
Additional details and impacted files
@@           Coverage Diff           @@
##            main   #3126     +/-   ##
=======================================
+ Coverage   79.6%   80.2%   +0.6%     
=======================================
  Files        124     128      +4     
  Lines      23174   22884    -290     
=======================================
- Hits       18456   18368     -88     
+ Misses      4718    4516    -202     


scottgerring force-pushed the feat/retry-logic branch 3 times, most recently from db6ed67 to 3847b26 on August 12, 2025 at 10:20
otel_debug!(name: "TonicLogsClient.ExportFailed", error = &error);
Err(OTelSdkError::InternalFailure(error))
}
let batch = Arc::new(batch);
scottgerring (Contributor, Author) commented:

see if we can avoid this

@@ -35,6 +35,7 @@ tracing = {workspace = true, optional = true}

prost = { workspace = true, optional = true }
tonic = { workspace = true, optional = true }
tonic-types = { workspace = true, optional = true }
scottgerring (Contributor, Author) commented:

Needed for gRPC error type
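If I recall the tonic-types API correctly, its StatusExt extension trait is what exposes the standard gRPC error details here, including the RetryInfo message an OTLP server can use to suggest a throttling delay; roughly:

```rust
use std::time::Duration;
use tonic::Status;
use tonic_types::StatusExt;

// Sketch: pull the server-suggested delay out of the RetryInfo detail, if any.
fn suggested_delay(status: &Status) -> Option<Duration> {
    status
        .get_details_retry_info()
        .and_then(|info| info.retry_delay)
}
```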

impl LogExporter for OtlpHttpClient {
    #[cfg(feature = "http-retry")]
    async fn export(&self, batch: LogBatch<'_>) -> OTelSdkResult {
        let policy = RetryPolicy {
scottgerring (Contributor, Author) commented:

Should we create a default retry policy and share it between all exporters?
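One way to do that would be a shared Default impl every exporter starts from (a sketch, reusing the hypothetical RetryPolicy fields from the earlier sketch):

```rust
use std::time::Duration;

// Assumes the RetryPolicy struct sketched earlier; fields and values are illustrative.
impl Default for RetryPolicy {
    fn default() -> Self {
        RetryPolicy {
            max_retries: 5,
            initial_delay: Duration::from_millis(100),
            max_delay: Duration::from_secs(5),
        }
    }
}
```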

.map_err(|e| OTelSdkError::InternalFailure(e.message))
}

#[cfg(not(feature = "http-retry"))]
scottgerring (Contributor, Author) commented:

This is a massive duplication of code.

If we decide that HTTP export always has retry behaviour, which means always pulling in the unstable runtime feature, then we can remove all of this.

Alternatively, we can provide export just once, with all the extra pomp and fanfare to support retry, and then use a stub "don't actually retry" impl when the feature is off. There would be some slight runtime overhead, but the codebase would be much simpler.
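A sketch of that alternative: when the http-retry feature is off, compile a stub with the same shape that just runs the operation once (this assumes the hypothetical retry_with_backoff signature sketched earlier), so export only needs to be written one way:

```rust
use std::future::Future;

// Assumes the RetryPolicy / RetryAction types sketched earlier.
#[cfg(not(feature = "http-retry"))]
pub async fn retry_with_backoff<F, Fut, T, E>(
    _policy: RetryPolicy,
    _classify: impl Fn(&E) -> RetryAction,
    mut operation: F,
) -> Result<T, E>
where
    F: FnMut() -> Fut,
    Fut: Future<Output = Result<T, E>>,
{
    // Run the operation exactly once; policy and classifier are ignored.
    operation().await
}
```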

scottgerring (Contributor, Author) commented:

Comment above applies to the other HTTP exporters also.

scottgerring changed the title from "[not ready!] feat: support backoff/retry" to "feat: support backoff/retry in OTLP" on Aug 12, 2025
Development

Successfully merging this pull request may close these issues.

OTLP Stabilization: Throttling & Retry
2 participants