Remove batcher and related config in favor of sending queue #42767
```diff
@@ -46,8 +46,8 @@ func createDefaultConfig() component.Config {
 	qs.QueueSize = 10
 	qs.Batch = configoptional.Some(exporterhelper.BatchConfig{
```
Member (author):
I enabled batching by default, but I am a bit conflicted on this. There were two main reasons that I considered for making this decision:

The main issue with having batching by default is that configoptional doesn't support it that well (see open-telemetry/opentelemetry-collector#13894): basically, if default batching is enabled, then it is not possible to disable batching. We could reach a behavior closer to no batching by setting
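The configoptional limitation above can be illustrated with a minimal sketch. The `optionalBatch` type and `applyUserConfig` function below are local stand-ins I invented for illustration, not the collector's real `configoptional.Optional` API; they only model the merge behaviour being discussed, assuming user config is merged over the default.

```go
package main

import "fmt"

// optionalBatch is a minimal stand-in for configoptional.Optional[BatchConfig]
// (the real type lives in opentelemetry-collector); fields and names here are
// illustrative assumptions, not the collector's API.
type optionalBatch struct {
	hasValue bool
	minSize  int64
}

// applyUserConfig sketches why a Some default cannot be turned off: user YAML
// is merged over the default, so individual fields can be overridden, but no
// user-supplied value maps the Optional back to "absent" once the default
// already has a value.
func applyUserConfig(def optionalBatch, userMinSize *int64) optionalBatch {
	if userMinSize != nil {
		def.minSize = *userMinSize
	}
	return def // hasValue stays true: batching remains enabled
}

func main() {
	def := optionalBatch{hasValue: true, minSize: 1e6} // batching on by default
	zero := int64(0)
	merged := applyUserConfig(def, &zero)
	fmt.Println("still batching:", merged.hasValue)
}
```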
Contributor:
Maybe we could enable it by default for backwards compatibility, but add a feature gate to phase out that behaviour? That is, the feature gate would make it disabled by default.
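The phase-out pattern suggested here could look roughly like the following. This is a stdlib-only sketch with a hypothetical gate name; the real mechanism would be the collector's featuregate package, which is not reproduced here.

```go
package main

import "fmt"

// disableDefaultBatching stands in for a collector feature gate; the name and
// wiring are hypothetical, for illustration of the phase-out pattern only.
var disableDefaultBatching = false

// batchingEnabledByDefault preserves today's behaviour (batching on) while the
// gate is off; flipping the gate to true phases out the old default without
// breaking users during the transition.
func batchingEnabledByDefault() bool {
	return !disableDefaultBatching
}

func main() {
	fmt.Println("batching by default:", batchingEnabledByDefault())
}
```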
Contributor:
Not being able to disable batching is unfortunate. What's the desired end state here: to have the sending queue and batching both enabled by default, or not?
Member (author):
This means we will give up on being able to disable batching; currently, that cannot be achieved anyway.

Sounds reasonable if we can accept not being able to disable the batching behaviour in default mode.
Contributor:
But we would be able to, by setting the feature gate, right? So there's still an option, and we preserve the same default behaviour in the transition.
Contributor:
If we keep the min_size as 0 and the flush interval close to 1ns, wouldn't that be close to no batching? Can we guide users toward setting that instead of adding gates? ++ on providing better defaults optimized for performance, as that would be ideal for anyone using the exporter for benchmarking etc.
Member (author):
@axw @carsonip Based on the discussions in open-telemetry/opentelemetry-collector#13894, I think keeping batching as the default behaviour is the better option (as per the reasoning in my original comment). Eventually, upstream is going to have a feature to disable configoptional configs, and our concern should be resolved then. Meanwhile, we have 2 options:

WDYT? Any preferences here?
Contributor:
My vote is for option 2, to reduce friction and to keep batching as the default behaviour.

Contributor:
(2) should be fine, as I don't think it'll impact a lot of use cases.

Contributor:
I'm good with option 2.
```diff
 		FlushTimeout: 10 * time.Second,
-		MinSize:      5e+6,
-		MaxSize:      10e+6,
+		MinSize:      1e+6,
+		MaxSize:      5e+6,
 		Sizer:        exporterhelper.RequestSizerTypeBytes,
 	})
```
```diff
@@ -220,19 +220,11 @@ func exporterhelperOptions(
 ) []exporterhelper.Option {
 	// not setting capabilities as they will default to non-mutating but will be updated
 	// by the base-exporter to mutating if batching is enabled.
-	opts := []exporterhelper.Option{
+	return []exporterhelper.Option{
 		exporterhelper.WithStart(start),
 		exporterhelper.WithShutdown(shutdown),
-		exporterhelper.WithQueueBatch(cfg.QueueBatchConfig, qbs),
+		xexporterhelper.WithQueueBatch(cfg.QueueBatchConfig, qbs),
+		// Effectively disable timeout_sender because timeout is enforced in bulk indexer.
+		exporterhelper.WithTimeout(exporterhelper.TimeoutConfig{Timeout: 0}),
 	}
-
-	// Effectively disable timeout_sender because timeout is enforced in bulk indexer.
-	//
-	// We keep timeout_sender enabled in the async mode (sending_queue not enabled OR sending
-	// queue enabled but batching not enabled OR based on the deprecated batcher setting), to
-	// ensure sending data to the background workers will not block indefinitely.
-	if cfg.QueueBatchConfig.Enabled {
-		opts = append(opts, exporterhelper.WithTimeout(exporterhelper.TimeoutConfig{Timeout: 0}))
-	}
-	return opts
 }
```