
Commit 45b46be

Merge branch 'main' into vector-bit-dot-product
2 parents 646f1b8 + 2ba9e9f


3,112 files changed: +72,472 additions, -38,600 deletions


.buildkite/hooks/pre-command

Lines changed: 8 additions & 0 deletions
```diff
@@ -94,6 +94,14 @@ if [[ "${USE_PROD_DOCKER_CREDENTIALS:-}" == "true" ]]; then
   fi
 fi

+# Authenticate to the Docker Hub public read-only registry
+if which docker > /dev/null 2>&1; then
+  DOCKERHUB_REGISTRY_USERNAME="$(vault read -field=username secret/ci/elastic-elasticsearch/docker_hub_public_ro_credentials)"
+  DOCKERHUB_REGISTRY_PASSWORD="$(vault read -field=password secret/ci/elastic-elasticsearch/docker_hub_public_ro_credentials)"
+
+  echo "$DOCKERHUB_REGISTRY_PASSWORD" | docker login --username "$DOCKERHUB_REGISTRY_USERNAME" --password-stdin docker.io
+fi
+
 if [[ "$BUILDKITE_AGENT_META_DATA_PROVIDER" != *"k8s"* ]]; then
   # Run in the background, while the job continues
   nohup .buildkite/scripts/setup-monitoring.sh </dev/null >/dev/null 2>&1 &
```

.editorconfig

Lines changed: 3 additions & 0 deletions
```diff
@@ -226,5 +226,8 @@ indent_size = 2
 [*.{xsd,xml}]
 indent_size = 4

+[verification-metadata.xml]
+indent_size = 3
+
 [*.{csv,sql}-spec]
 trim_trailing_whitespace = false
```

.gitattributes

Lines changed: 5 additions & 2 deletions
```diff
@@ -13,6 +13,9 @@ x-pack/plugin/esql/src/main/java/org/elasticsearch/xpack/esql/parser/EsqlBasePar
 x-pack/plugin/esql/src/main/generated/** linguist-generated=true
 x-pack/plugin/esql/src/main/generated-src/** linguist-generated=true

-# ESQL functions docs are autogenerated. More information at `docs/reference/esql/functions/README.md`
-docs/reference/esql/functions/*/** linguist-generated=true
+# ESQL functions docs are autogenerated. More information at `docs/reference/query-languages/esql/README.md`
+docs/reference/query-languages/esql/_snippets/functions/*/** linguist-generated=true
+#docs/reference/query-languages/esql/_snippets/operators/*/** linguist-generated=true
+docs/reference/query-languages/esql/images/** linguist-generated=true
+docs/reference/query-languages/esql/kibana/** linguist-generated=true

```

Whitespace-only changes.

REST_API_COMPATIBILITY.md

Lines changed: 34 additions & 2 deletions
```diff
@@ -154,9 +154,11 @@ if (request.getRestApiVersion() == RestApiVersion.V_7 && request.hasParam("limit

 The above code checks the request's compatible version and if the request has the parameter in question. In this case the deprecation warning is not automatic and requires the developer to manually log the warning. `request.param` is also required since it consumes the value as to avoid the error of unconsumed parameters.

-### Testing
+### Testing Backwards Compatibility

-The primary means of testing compatibility is via the prior major version's YAML REST tests. The build system will download the latest prior version of the YAML rest tests and execute them against the current cluster version. Prior to execution the tests will be transformed by injecting the correct headers to enable compatibility as well as other custom changes to the tests to allow the tests to pass. These customizations are configured via the build.gradle and happen just prior to test execution. Since the compatibility tests are manipulated version of the tests stored in Github (via the past major version), it is important to find the local (on disk) version for troubleshooting compatibility tests.
+The primary means of testing compatibility is via the prior major version's YAML REST tests. The build system will download the latest prior version of the YAML rest tests and execute them against the current cluster version. For example if you are testing main versioned as 9.0.0 the build system will download the yaml tests in the 8.x branch and execute those against the current cluster version for 9.0.0.
+
+Prior to execution the tests will be transformed by injecting the correct headers to enable compatibility as well as other custom changes to the tests to allow the tests to pass. These customizations are configured via the build.gradle and happen just prior to test execution. Since the compatibility tests are manipulated version of the tests stored in Github (via the past major version), it is important to find the local (on disk) version for troubleshooting compatibility tests.

 The tests are wired into the `check` task, so that is the easiest way to test locally prior to committing. More specifically the task is called `yamlRestCompatTest`. These behave nearly identical to it's non-compat `yamlRestTest` task. The only variance is that the tests are sourced from the prior version branch and the tests go through a transformation phase before execution. The transformation task is `yamlRestCompatTestTransform`.

@@ -170,6 +172,36 @@ Since these are a variation of backward compatibility testing, the entire suite

 In some cases the prior version of the YAML REST tests are not sufficient to fully test changes. This can happen when the prior version has insufficient test coverage. In those cases, you can simply add more testing to the prior version or you can add custom REST tests that will run along side of the other compatibility tests. These custom tests can be found in the `yamlRestCompatTest` sourceset. Custom REST tests for compatibility will not be modified prior to execution, so the correct headers need to be manually added.

+#### Breaking Changes
+
+It is possible to be in a state where you have intentionally made a breaking change and the compatibility tests will fail irrespective of checks for `skip` or `requires` cluster or test features in the current version such as 9.0.0. In this state, assuming the breaking changes are reasonable and agreed upon by the breaking change committee, the correct behavior is to skip the test in the `build.gradle` in 9.0.0. For example, if you make a breaking change that causes the `range/20_synthetic_source/Date range` to break then this test can be disabled temporarily in this file `rest-api-spec/build.gradle` like within this snippet:
+
+```groovy
+tasks.named("yamlRestCompatTestTransform").configure({task ->
+  task.skipTest("range/20_synthetic_source/Date range", "date range breaking change causes tests to produce incorrect values for compatibility")
+  task.skipTest("indices.sort/10_basic/Index Sort", "warning does not exist for compatibility")
+  task.skipTest("search/330_fetch_fields/Test search rewrite", "warning does not exist for compatibility")
+  task.skipTestsByFilePattern("indices.create/synthetic_source*.yml", "@UpdateForV9 -> tests do not pass after bumping API version to 9 [ES-9597]")
+})
+```
+
+When skipping a test temporarily in 9.0.0, we have to implement the proper `skip` and `requires` conditions to previous branches, such as 8.latest. After these conditions are implemented in 8.latest, you can re-enable the test in 9.0.0 by removing the `skipTest` condition.
+
+The team implementing the changes can decide how to clean up or modify tests based on how breaking changes were backported. e.g.:
+
+In 8.latest:
+
+* Add `skip` / `requires` conditions to existing tests that check the old behavior. This prevents those tests from failing during backward compatibility or upgrade testing from 8.latest to 9.0.0
+
+In 9.0.0:
+
+* Add `requires` conditions for new tests that validate the updated API or output format
+* Add `skip` conditions for older tests that would break in 9.0.0
+
+#### Test Features
+
+Both cluster and test features exist. Cluster features are meant for new capability and test features can specifically be used to gate and manage `skip` and `requires` yaml test operations. For more information, see [Versioning.md](docs/internal/Versioning.md#cluster-features). When backporting and using these features they can not overlap in name and must be consistent when backported so that clusters built with these features are compatible.
+
 ### Developer's workflow

 There should not be much, if any, deviation in a developers normal workflow to introduce and back-port changes. Changes should be applied in main, then back ported as needed.
```
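
The hunk header above shows the `request.getRestApiVersion() == RestApiVersion.V_7 && request.hasParam("limit")` check that the file describes: the version check, manual deprecation logging, and consuming the parameter. As a rough illustration of that pattern, not code from this commit (the handler class, the default value, and the exact `DeprecationLogger` call are assumptions), it might look like this:

```java
import org.elasticsearch.common.logging.DeprecationLogger;
import org.elasticsearch.core.RestApiVersion;
import org.elasticsearch.rest.RestRequest;

// Hypothetical fragment illustrating the compatibility pattern described in REST_API_COMPATIBILITY.md.
class LimitParamCompatHandler {

    private static final DeprecationLogger deprecationLogger = DeprecationLogger.getLogger(LimitParamCompatHandler.class);

    void consumeRemovedLimitParam(RestRequest request) {
        // Only act for clients that sent the compatible-with=7 headers and actually passed the parameter.
        if (request.getRestApiVersion() == RestApiVersion.V_7 && request.hasParam("limit")) {
            // Consume the value so the request does not fail with an "unconsumed parameter" error.
            int limit = request.paramAsInt("limit", 10);
            // The warning is not emitted automatically in this path, so log it explicitly.
            deprecationLogger.compatibleCritical(
                "limit_parameter_compat",
                "the [limit] parameter is no longer supported and the supplied value [{}] will be ignored",
                limit
            );
        }
    }
}
```

Reading the parameter (via `request.paramAsInt` or `request.param`) is what marks it as consumed, which is why the document stresses doing so even when the value is ultimately ignored.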
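The new "Breaking Changes" and "Test Features" sections lean on `skip`/`requires` conditions that reference a feature id shared across branches. A minimal sketch of declaring such an id, assuming the `FeatureSpecification`/`NodeFeature` SPI and a made-up feature name (nothing here is defined by this commit):

```java
import java.util.Set;

import org.elasticsearch.features.FeatureSpecification;
import org.elasticsearch.features.NodeFeature;

// Hypothetical example: a feature id that YAML tests could reference from their
// skip/requires blocks (e.g. requires.cluster_features: ["example.date_range_new_format"]).
public class ExampleFeatures implements FeatureSpecification {

    // Hypothetical feature name; it must stay identical across branches when backported.
    public static final NodeFeature DATE_RANGE_NEW_FORMAT = new NodeFeature("example.date_range_new_format");

    @Override
    public Set<NodeFeature> getFeatures() {
        return Set.of(DATE_RANGE_NEW_FORMAT);
    }
}
```

The string id is what the YAML `skip`/`requires` conditions point at, which is why the paragraph above insists the name cannot overlap or change when the feature is backported.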

benchmarks/build.gradle

Lines changed: 3 additions & 2 deletions
```diff
@@ -25,7 +25,6 @@ base {
   archivesName = 'elasticsearch-benchmarks'
 }

-tasks.named("test").configure { enabled = false }
 tasks.named("javadoc").configure { enabled = false }

 configurations {
@@ -52,8 +51,10 @@ dependencies {
   api "org.openjdk.jmh:jmh-core:$versions.jmh"
   annotationProcessor "org.openjdk.jmh:jmh-generator-annprocess:$versions.jmh"
   // Dependencies of JMH
-  runtimeOnly 'net.sf.jopt-simple:jopt-simple:5.0.4'
+  runtimeOnly 'net.sf.jopt-simple:jopt-simple:5.0.2'
   runtimeOnly 'org.apache.commons:commons-math3:3.6.1'
+
+  testImplementation(project(':test:framework'))
 }

 // enable the JMH's BenchmarkProcessor to generate the final benchmark classes
```

benchmarks/src/main/java/org/elasticsearch/benchmark/compute/operator/EvalBenchmark.java

Lines changed: 40 additions & 6 deletions
```diff
@@ -11,6 +11,7 @@

 import org.apache.lucene.util.BytesRef;
 import org.elasticsearch.common.breaker.NoopCircuitBreaker;
+import org.elasticsearch.common.logging.LogConfigurator;
 import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.common.util.BigArrays;
 import org.elasticsearch.compute.data.Block;
@@ -23,11 +24,14 @@
 import org.elasticsearch.compute.data.DoubleVector;
 import org.elasticsearch.compute.data.LongBlock;
 import org.elasticsearch.compute.data.LongVector;
+import org.elasticsearch.compute.data.OrdinalBytesRefVector;
 import org.elasticsearch.compute.data.Page;
 import org.elasticsearch.compute.operator.DriverContext;
 import org.elasticsearch.compute.operator.EvalOperator;
 import org.elasticsearch.compute.operator.Operator;
 import org.elasticsearch.core.TimeValue;
+import org.elasticsearch.logging.LogManager;
+import org.elasticsearch.logging.Logger;
 import org.elasticsearch.xpack.esql.core.expression.Expression;
 import org.elasticsearch.xpack.esql.core.expression.FieldAttribute;
 import org.elasticsearch.xpack.esql.core.expression.FoldContext;
@@ -89,9 +93,16 @@ public class EvalBenchmark {
     static final DriverContext driverContext = new DriverContext(BigArrays.NON_RECYCLING_INSTANCE, blockFactory);

     static {
+        LogConfigurator.configureESLogging();
         // Smoke test all the expected values and force loading subclasses more like prod
+        selfTest();
+    }
+
+    static void selfTest() {
+        Logger log = LogManager.getLogger(EvalBenchmark.class);
         try {
             for (String operation : EvalBenchmark.class.getField("operation").getAnnotationsByType(Param.class)[0].value()) {
+                log.info("self testing {}", operation);
                 run(operation);
             }
         } catch (NoSuchFieldException e) {
@@ -117,7 +128,9 @@ public class EvalBenchmark {
         "mv_min_ascending",
         "rlike",
         "to_lower",
-        "to_upper" }
+        "to_lower_ords",
+        "to_upper",
+        "to_upper_ords" }
     )
     public String operation;

@@ -225,12 +238,12 @@ private static EvalOperator.ExpressionEvaluator evaluator(String operation) {
                RLike rlike = new RLike(Source.EMPTY, keywordField, new RLikePattern(".ar"));
                yield EvalMapper.toEvaluator(FOLD_CONTEXT, rlike, layout(keywordField)).get(driverContext);
            }
-           case "to_lower" -> {
+           case "to_lower", "to_lower_ords" -> {
                FieldAttribute keywordField = keywordField();
                ToLower toLower = new ToLower(Source.EMPTY, keywordField, configuration());
                yield EvalMapper.toEvaluator(FOLD_CONTEXT, toLower, layout(keywordField)).get(driverContext);
            }
-           case "to_upper" -> {
+           case "to_upper", "to_upper_ords" -> {
                FieldAttribute keywordField = keywordField();
                ToUpper toUpper = new ToUpper(Source.EMPTY, keywordField, configuration());
                yield EvalMapper.toEvaluator(FOLD_CONTEXT, toUpper, layout(keywordField)).get(driverContext);
@@ -404,13 +417,15 @@ private static void checkExpected(String operation, Page actual) {
                    }
                }
            }
-           case "to_lower" -> checkBytes(operation, actual, new BytesRef[] { new BytesRef("foo"), new BytesRef("bar") });
-           case "to_upper" -> checkBytes(operation, actual, new BytesRef[] { new BytesRef("FOO"), new BytesRef("BAR") });
+           case "to_lower" -> checkBytes(operation, actual, false, new BytesRef[] { new BytesRef("foo"), new BytesRef("bar") });
+           case "to_lower_ords" -> checkBytes(operation, actual, true, new BytesRef[] { new BytesRef("foo"), new BytesRef("bar") });
+           case "to_upper" -> checkBytes(operation, actual, false, new BytesRef[] { new BytesRef("FOO"), new BytesRef("BAR") });
+           case "to_upper_ords" -> checkBytes(operation, actual, true, new BytesRef[] { new BytesRef("FOO"), new BytesRef("BAR") });
            default -> throw new UnsupportedOperationException(operation);
        }
    }

-   private static void checkBytes(String operation, Page actual, BytesRef[] expectedVals) {
+   private static void checkBytes(String operation, Page actual, boolean expectOrds, BytesRef[] expectedVals) {
        BytesRef scratch = new BytesRef();
        BytesRefVector v = actual.<BytesRefBlock>getBlock(1).asVector();
        for (int i = 0; i < BLOCK_LENGTH; i++) {
@@ -420,6 +435,15 @@ private static void checkBytes(String operation, Page actual, BytesRef[] expecte
                throw new AssertionError("[" + operation + "] expected [" + expected + "] but was [" + b + "]");
            }
        }
+       if (expectOrds) {
+           if (v.asOrdinals() == null) {
+               throw new IllegalArgumentException("expected ords but got " + v);
+           }
+       } else {
+           if (v.asOrdinals() != null) {
+               throw new IllegalArgumentException("expected non-ords but got " + v);
+           }
+       }
    }

    private static Page page(String operation) {
@@ -500,6 +524,16 @@ private static Page page(String operation) {
            }
            yield new Page(builder.build().asBlock());
        }
+       case "to_lower_ords", "to_upper_ords" -> {
+           var bytes = blockFactory.newBytesRefVectorBuilder(BLOCK_LENGTH);
+           bytes.appendBytesRef(new BytesRef("foo"));
+           bytes.appendBytesRef(new BytesRef("bar"));
+           var ordinals = blockFactory.newIntVectorFixedBuilder(BLOCK_LENGTH);
+           for (int i = 0; i < BLOCK_LENGTH; i++) {
+               ordinals.appendInt(i % 2);
+           }
+           yield new Page(new OrdinalBytesRefVector(ordinals.build(), bytes.build()).asBlock());
+       }
        default -> throw new UnsupportedOperationException();
    };
}
```
Lines changed: 18 additions & 0 deletions (new file)
```diff
@@ -0,0 +1,18 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the "Elastic License
+ * 2.0", the "GNU Affero General Public License v3.0 only", and the "Server Side
+ * Public License v 1"; you may not use this file except in compliance with, at
+ * your election, the "Elastic License 2.0", the "GNU Affero General Public
+ * License v3.0 only", or the "Server Side Public License, v 1".
+ */
+
+package org.elasticsearch.benchmark.compute.operator;
+
+import org.elasticsearch.test.ESTestCase;
+
+public class EvalBenchmarkTests extends ESTestCase {
+    public void testSelfTest() {
+        EvalBenchmark.selfTest();
+    }
+}
```

build-conventions/settings.gradle

Lines changed: 1 addition & 1 deletion
```diff
@@ -8,7 +8,7 @@
  */

 plugins {
-  id "com.gradle.develocity" version "3.18.1"
+  id "com.gradle.develocity" version "3.19.2"
 }

 rootProject.name = 'build-conventions'
```

build-tools-internal/settings.gradle

Lines changed: 1 addition & 1 deletion
```diff
@@ -9,7 +9,7 @@ pluginManagement {
 }

 plugins {
-  id "com.gradle.develocity" version "3.18.1"
+  id "com.gradle.develocity" version "3.19.2"
 }

 dependencyResolutionManagement {
```
