Merged
Aggs: Fix CB on reduction phase #133398
Changes from all commits (31 commits)
86f2d88 Fix wrong reduce memory estimate calculation (ivancea)
e798aa0 Add batched merge results size estimations to CB (ivancea)
7b3d88d Merge branch 'main' into aggs-composite-memory (ivancea)
87ccf54 Merge branch 'main' into aggs-composite-memory (ivancea)
6fc8fa8 Merge branch 'main' into aggs-composite-memory (ivancea)
51fe8fc Merge branch 'main' into aggs-composite-memory (ivancea)
41b86a0 Updated javadocs on estimates and replaced 1.5-size with 0.5 (ivancea)
eb46c13 Merge branch 'main' into aggs-composite-memory (ivancea)
8270935 Initial CB tests for batched aggs (ivancea)
ba3748b Merge branch 'main' into aggs-composite-memory (ivancea)
2140240 Merge branch 'main' into aggs-composite-memory (ivancea)
d2fed37 Randomize tests (ivancea)
449203f Add estimate for field merge result (ivancea)
7c7d3eb Merge branch 'main' into aggs-composite-memory (ivancea)
7402133 Merge branch 'main' into aggs-composite-memory (ivancea)
d344228 Added test cluster docs (ivancea)
f9b8379 Merge branch 'main' into aggs-composite-memory (ivancea)
0a89991 Added Integration test for the reduce phase CB (ivancea)
5f8a5cf Add nullable (ivancea)
e8fda32 [CI] Auto commit changes from spotless
e173654 Fixed memory leak and added Repeat to test (ivancea)
3d0f864 Removed comment (ivancea)
c9101bb Fixed comment (ivancea)
ac0929b Decrement ref on search response in test (ivancea)
77b152b Simplify test and add a random number of nodes to make it fail in bot… (ivancea)
e6d7beb Supress repeat forbidden api error (ivancea)
09aff51 Merge branch 'main' into aggs-composite-memory (ivancea)
8e4fd41 Update docs/changelog/133398.yaml (ivancea)
a24eba1 Integrate test repetition in test case and remove unused setting (ivancea)
1581fc4 Merge branch 'main' into aggs-composite-memory (ivancea)
f10f1a5 Merge branch 'main' into aggs-composite-memory (ivancea)
docs/changelog/133398.yaml
@@ -0,0 +1,5 @@
pr: 133398
summary: "Aggs: Fix CB on reduction phase"
area: Aggregations
type: bug
issues: []
...est/java/org/elasticsearch/aggregations/bucket/AggregationReductionCircuitBreakingIT.java
156 changes: 156 additions & 0 deletions
@@ -0,0 +1,156 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the "Elastic License
 * 2.0", the "GNU Affero General Public License v3.0 only", and the "Server Side
 * Public License v 1"; you may not use this file except in compliance with, at
 * your election, the "Elastic License 2.0", the "GNU Affero General Public
 * License v3.0 only", or the "Server Side Public License, v 1".
 */

package org.elasticsearch.aggregations.bucket;

import org.elasticsearch.ExceptionsHelper;
import org.elasticsearch.action.DocWriteRequest;
import org.elasticsearch.action.index.IndexRequestBuilder;
import org.elasticsearch.action.search.SearchPhaseExecutionException;
import org.elasticsearch.action.search.SearchRequestBuilder;
import org.elasticsearch.aggregations.AggregationIntegTestCase;
import org.elasticsearch.common.breaker.CircuitBreakingException;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.search.aggregations.bucket.composite.TermsValuesSourceBuilder;
import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.xcontent.XContentBuilder;
import org.elasticsearch.xcontent.XContentFactory;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Supplier;
import java.util.stream.IntStream;
import java.util.stream.LongStream;

import static org.elasticsearch.indices.breaker.HierarchyCircuitBreakerService.REQUEST_CIRCUIT_BREAKER_LIMIT_SETTING;
import static org.elasticsearch.indices.breaker.HierarchyCircuitBreakerService.REQUEST_CIRCUIT_BREAKER_TYPE_SETTING;
import static org.elasticsearch.indices.breaker.HierarchyCircuitBreakerService.TOTAL_CIRCUIT_BREAKER_LIMIT_SETTING;
import static org.elasticsearch.indices.breaker.HierarchyCircuitBreakerService.USE_REAL_MEMORY_USAGE_SETTING;
import static org.elasticsearch.node.NodeRoleSettings.NODE_ROLES_SETTING;
import static org.elasticsearch.search.aggregations.AggregationBuilders.composite;
import static org.elasticsearch.search.aggregations.AggregationBuilders.topHits;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.containsString;
import static org.hamcrest.Matchers.instanceOf;

@ESIntegTestCase.ClusterScope(minNumDataNodes = 1, maxNumDataNodes = 2, numClientNodes = 1)
public class AggregationReductionCircuitBreakingIT extends AggregationIntegTestCase {
    @Override
    protected Settings nodeSettings(int nodeOrdinal, Settings otherSettings) {
        // Most of the settings here exist to make the search as stable and deterministic as possible
        var settings = Settings.builder()
            .put(super.nodeSettings(nodeOrdinal, otherSettings))
            .put(REQUEST_CIRCUIT_BREAKER_TYPE_SETTING.getKey(), "memory")
            // More threads may lead to more consumption and the test failing on the data nodes
            .put("thread_pool.search.size", 1);
        if (NODE_ROLES_SETTING.get(otherSettings).isEmpty()) {
            // Coordinator
            settings.put(REQUEST_CIRCUIT_BREAKER_LIMIT_SETTING.getKey(), "7MB");
        } else {
            // Data node
            // To avoid OOMs
            settings.put(USE_REAL_MEMORY_USAGE_SETTING.getKey(), true).put(TOTAL_CIRCUIT_BREAKER_LIMIT_SETTING.getKey(), "80%");
        }
        return settings.build();
    }

    /**
     * Expect the breaker to trip in `QueryPhaseResultConsumer#reduce()`, when reducing `MergeResult`s.
     * <p>
     * From testing, the serialized agg size is around 5MB.
     * The CB is set to 7MB, as the reduction is expected to add an extra 50% overhead.
     * </p>
     */
    public void testCBTrippingOnReduction() throws IOException {
        createIndex();
        addDocs(100, 100, 100);

        // Some leaks (check ESTestCase#loggedLeaks) aren't logged unless we run the test twice.
        // So we run it multiple times to ensure everything gets collected before the final test checks.
        for (int i = 0; i < 10; i++) {
            assertCBTrip(
                () -> internalCluster().coordOnlyNodeClient()
                    .prepareSearch("index")
                    .setSize(0)
                    .addAggregation(
                        composite(
                            "composite",
                            List.of(
                                new TermsValuesSourceBuilder("integer").field("integer"),
                                new TermsValuesSourceBuilder("long").field("long")
                            )
                        ).size(5000).subAggregation(topHits("top_hits").size(10))
                    )
                    .setBatchedReduceSize(randomIntBetween(2, 5)),
                e -> {
                    var completeException = ExceptionsHelper.stackTrace(e);
                    // If a shard fails, we can't check reduction
                    assumeTrue(completeException, e.shardFailures().length == 0);
                    assertThat(e.getCause(), instanceOf(CircuitBreakingException.class));
                    assertThat(completeException, containsString("QueryPhaseResultConsumer.reduce"));
                }
            );
        }
    }

    public void assertCBTrip(Supplier<SearchRequestBuilder> requestSupplier, Consumer<SearchPhaseExecutionException> exceptionCallback) {
        try {
            requestSupplier.get().get().decRef();

            fail("Expected the breaker to trip");
        } catch (SearchPhaseExecutionException e) {
            exceptionCallback.accept(e);
        }
    }

    private void createIndex() throws IOException {
        XContentBuilder mappingBuilder = XContentFactory.jsonBuilder();
        mappingBuilder.startObject();
        mappingBuilder.startObject("properties");
        {
            mappingBuilder.startObject("integer");
            mappingBuilder.field("type", "integer");
            mappingBuilder.endObject();
        }
        {
            mappingBuilder.startObject("long");
            mappingBuilder.field("type", "long");
            mappingBuilder.endObject();
        }

        mappingBuilder.endObject(); // properties
        mappingBuilder.endObject();

        assertAcked(
            prepareCreate("index").setSettings(Settings.builder().put("index.number_of_shards", randomIntBetween(1, 10)).build())
                .setMapping(mappingBuilder)
        );
    }

    private void addDocs(int docCount, int integerFieldMvCount, int longFieldMvCount) throws IOException {
        List<IndexRequestBuilder> docs = new ArrayList<>();
        for (int i = 0; i < docCount; i++) {
            XContentBuilder docSource = XContentFactory.jsonBuilder();
            docSource.startObject();
            final int docNumber = i;
            List<Integer> integerValues = IntStream.range(0, integerFieldMvCount).map(x -> docNumber + x * 100).boxed().toList();
            List<Long> longValues = LongStream.range(0, longFieldMvCount).map(x -> docNumber + x * 100).boxed().toList();
            docSource.field("integer", integerValues);
            docSource.field("long", longValues);
            docSource.endObject();

            docs.add(prepareIndex("index").setOpType(DocWriteRequest.OpType.CREATE).setSource(docSource));
        }
        indexRandom(true, false, false, false, docs);
        forceMerge(false);
        flushAndRefresh("index");
    }
}
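
For context on the fix the commits above describe ("Add batched merge results size estimations to CB", "Add estimate for field merge result", "replaced 1.5-size with 0.5"): the idea is that, before reducing batched merge results, the coordinator charges an estimate of the reduction's memory footprint to the request circuit breaker and releases it once the reduction completes. The sketch below only illustrates that accounting pattern; it is not the PR's actual implementation. The ReductionMemoryAccounting class and the 0.5 overhead factor are hypothetical, while CircuitBreaker#addEstimateBytesAndMaybeBreak and CircuitBreaker#addWithoutBreaking are existing Elasticsearch APIs.

import org.elasticsearch.common.breaker.CircuitBreaker;

/**
 * Hypothetical sketch: charge an estimate for each batched merge result to the
 * request circuit breaker before reducing it, and release it afterwards.
 */
class ReductionMemoryAccounting {
    // Assumption: reducing a merge result allocates roughly an extra 50% of its serialized size.
    private static final double REDUCE_OVERHEAD_FACTOR = 0.5;

    private final CircuitBreaker requestBreaker;
    private long reservedBytes = 0;

    ReductionMemoryAccounting(CircuitBreaker requestBreaker) {
        this.requestBreaker = requestBreaker;
    }

    /** Reserve memory for one merge result; throws CircuitBreakingException if the breaker limit would be exceeded. */
    void reserveForMergeResult(long serializedSizeInBytes) {
        long estimate = (long) (serializedSizeInBytes * REDUCE_OVERHEAD_FACTOR);
        requestBreaker.addEstimateBytesAndMaybeBreak(estimate, "<reduce_merge_result>");
        reservedBytes += estimate;
    }

    /** Release everything reserved once the reduction has finished or failed. */
    void releaseAll() {
        requestBreaker.addWithoutBreaking(-reservedBytes);
        reservedBytes = 0;
    }
}

With the numbers from the test's javadoc, roughly 5MB of serialized aggs plus a ~50% reduction overhead comes to about 7.5MB, which is why a 7MB request breaker limit is enough to trip the breaker during the coordinator's reduce rather than on the data nodes.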