Commit 47a0ea6

chore: Prepare for 0.7.0 development (#1404)
1 parent: de0be4b

15 files changed, +17 −24 lines
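
Most of the diff is a mechanical rename of the development version: 0.6.0-SNAPSHOT (and a stale 0.5.0-SNAPSHOT in the Spark diff files) becomes 0.7.0-SNAPSHOT. A quick way to confirm that no stale snapshot string survives a bump like this is a repository-wide search; a sketch, assuming it is run from the repository root:

```console
# Look for leftover references to the old development versions;
# expect no matches after this commit.
grep -rn -e '0\.5\.0-SNAPSHOT' -e '0\.6\.0-SNAPSHOT' \
  --include='*.xml' --include='*.yml' --include='*.yaml' \
  --include='*.md' --include='*.diff' .
```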

.github/actions/setup-spark-builder/action.yaml

Lines changed: 0 additions & 5 deletions
@@ -26,10 +26,6 @@ inputs:
     description: 'The Apache Spark version (e.g., 3.4.3) to build'
     required: true
     default: '3.4.3'
-  comet-version:
-    description: 'The Comet version to use for Spark'
-    required: true
-    default: '0.5.0-SNAPSHOT'
 runs:
   using: "composite"
   steps:
@@ -46,7 +42,6 @@ runs:
       run: |
         cd apache-spark
        git apply ../dev/diffs/${{inputs.spark-version}}.diff
-        ../mvnw -nsu -q versions:set-property -Dproperty=comet.version -DnewVersion=${{inputs.comet-version}} -DgenerateBackupPoms=false

     - name: Cache Maven dependencies
       uses: actions/cache@v4
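
With the `comet-version` input gone, the composite action no longer rewrites `comet.version` in the patched Spark build; the version is now carried by the `dev/diffs/*.diff` patches themselves. For reference, the deleted step amounted to the following, run from the `apache-spark` checkout (a sketch; only useful if you need to test the patched Spark against a non-default Comet version):

```console
# Override the Comet version pinned by the applied patch
# (this is what the removed CI step used to do).
cd apache-spark
../mvnw -nsu -q versions:set-property -Dproperty=comet.version \
  -DnewVersion=0.7.0-SNAPSHOT -DgenerateBackupPoms=false
```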

.github/workflows/spark_sql_test.yml

Lines changed: 0 additions & 1 deletion
@@ -71,7 +71,6 @@ jobs:
         with:
           spark-version: ${{ matrix.spark-version.full }}
           spark-short-version: ${{ matrix.spark-version.short }}
-          comet-version: '0.6.0-SNAPSHOT' # TODO: get this from pom.xml
       - name: Run Spark tests
         run: |
           cd apache-spark

.github/workflows/spark_sql_test_ansi.yml

Lines changed: 0 additions & 1 deletion
@@ -69,7 +69,6 @@ jobs:
         with:
           spark-version: ${{ matrix.spark-version.full }}
           spark-short-version: ${{ matrix.spark-version.short }}
-          comet-version: '0.6.0-SNAPSHOT' # TODO: get this from pom.xml
       - name: Run Spark tests
         run: |
           cd apache-spark
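
Both Spark SQL test workflows previously hard-coded the snapshot version when invoking the setup-spark-builder action, with a TODO to read it from `pom.xml`; removing the input resolves that TODO by removing the duplication altogether. Had the TODO been implemented instead, one hypothetical way to derive the value would have been Maven's help plugin (shown only as an illustration, not part of this change):

```console
# Hypothetical: read the project version from pom.xml instead of hard-coding it.
COMET_VERSION=$(./mvnw -q -N help:evaluate -Dexpression=project.version -DforceStdout)
echo "comet-version=$COMET_VERSION"
```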

common/pom.xml

Lines changed: 1 addition & 1 deletion
@@ -26,7 +26,7 @@ under the License.
   <parent>
     <groupId>org.apache.datafusion</groupId>
     <artifactId>comet-parent-spark${spark.version.short}_${scala.binary.version}</artifactId>
-    <version>0.6.0-SNAPSHOT</version>
+    <version>0.7.0-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>

dev/diffs/3.4.3.diff

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ index d3544881af1..26ab186c65d 100644
      <ivy.version>2.5.1</ivy.version>
      <oro.version>2.0.8</oro.version>
 +    <spark.version.short>3.4</spark.version.short>
-+    <comet.version>0.5.0-SNAPSHOT</comet.version>
++    <comet.version>0.7.0-SNAPSHOT</comet.version>
      <!--
        If you changes codahale.metrics.version, you also need to change
        the link to metrics.dropwizard.io in docs/monitoring.md.

dev/diffs/3.5.1.diff

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ index 0f504dbee85..430ec217e59 100644
      <ivy.version>2.5.1</ivy.version>
      <oro.version>2.0.8</oro.version>
 +    <spark.version.short>3.5</spark.version.short>
-+    <comet.version>0.5.0-SNAPSHOT</comet.version>
++    <comet.version>0.7.0-SNAPSHOT</comet.version>
      <!--
        If you changes codahale.metrics.version, you also need to change
        the link to metrics.dropwizard.io in docs/monitoring.md.

dev/diffs/4.0.0-preview1.diff

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ index a4b1b2c3c9f..6a532749978 100644
      <ivy.version>2.5.2</ivy.version>
      <oro.version>2.0.8</oro.version>
 +    <spark.version.short>4.0</spark.version.short>
-+    <comet.version>0.5.0-SNAPSHOT</comet.version>
++    <comet.version>0.7.0-SNAPSHOT</comet.version>
      <!--
        If you change codahale.metrics.version, you also need to change
        the link to metrics.dropwizard.io in docs/monitoring.md.
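
These three patches are what the setup-spark-builder action applies to an Apache Spark checkout, so updating them (and fixing the stale 0.5.0-SNAPSHOT) is what makes the removed `versions:set-property` override unnecessary. A minimal local check, assuming a matching Spark source tree under `apache-spark/` as in the CI action:

```console
# Apply the Comet patch and confirm the pinned Comet version in Spark's pom.
cd apache-spark
git apply ../dev/diffs/3.4.3.diff
grep '<comet.version>' pom.xml   # expect 0.7.0-SNAPSHOT
```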

docs/source/contributor-guide/debugging.md

Lines changed: 1 addition & 1 deletion
@@ -130,7 +130,7 @@ Then build the Comet as [described](https://github.com/apache/arrow-datafusion-c
 Start Comet with `RUST_BACKTRACE=1`

 ```console
-RUST_BACKTRACE=1 $SPARK_HOME/spark-shell --jars spark/target/comet-spark-spark3.4_2.12-0.6.0-SNAPSHOT.jar --conf spark.plugins=org.apache.spark.CometPlugin --conf spark.comet.enabled=true --conf spark.comet.exec.enabled=true
+RUST_BACKTRACE=1 $SPARK_HOME/spark-shell --jars spark/target/comet-spark-spark3.4_2.12-0.7.0-SNAPSHOT.jar --conf spark.plugins=org.apache.spark.CometPlugin --conf spark.comet.enabled=true --conf spark.comet.exec.enabled=true
 ```

 Get the expanded exception details

docs/source/user-guide/installation.md

Lines changed: 2 additions & 2 deletions
@@ -74,7 +74,7 @@ See the [Comet Kubernetes Guide](kubernetes.md) guide.
 Make sure `SPARK_HOME` points to the same Spark version as Comet was built for.

 ```console
-export COMET_JAR=spark/target/comet-spark-spark3.4_2.12-0.6.0-SNAPSHOT.jar
+export COMET_JAR=spark/target/comet-spark-spark3.4_2.12-0.7.0-SNAPSHOT.jar

 $SPARK_HOME/bin/spark-shell \
   --jars $COMET_JAR \
@@ -130,7 +130,7 @@ explicitly contain Comet otherwise Spark may use a different class-loader for th
 components which will then fail at runtime. For example:

 ```
---driver-class-path spark/target/comet-spark-spark3.4_2.12-0.6.0-SNAPSHOT.jar
+--driver-class-path spark/target/comet-spark-spark3.4_2.12-0.7.0-SNAPSHOT.jar
 ```

 Some cluster managers may require additional configuration, see <https://spark.apache.org/docs/latest/cluster-overview.html>
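
The two snippets above are typically combined when a cluster manager needs the driver class path set explicitly. Putting them together with the renamed jar gives roughly the following (a sketch built from the documented options; adjust the Spark/Scala suffix to match your build):

```console
export COMET_JAR=spark/target/comet-spark-spark3.4_2.12-0.7.0-SNAPSHOT.jar

$SPARK_HOME/bin/spark-shell \
  --jars $COMET_JAR \
  --driver-class-path $COMET_JAR \
  --conf spark.plugins=org.apache.spark.CometPlugin \
  --conf spark.comet.enabled=true \
  --conf spark.comet.exec.enabled=true
```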

fuzz-testing/README.md

Lines changed: 3 additions & 3 deletions
@@ -61,7 +61,7 @@ Set appropriate values for `SPARK_HOME`, `SPARK_MASTER`, and `COMET_JAR` environ
 $SPARK_HOME/bin/spark-submit \
     --master $SPARK_MASTER \
     --class org.apache.comet.fuzz.Main \
-    target/comet-fuzz-spark3.4_2.12-0.6.0-SNAPSHOT-jar-with-dependencies.jar \
+    target/comet-fuzz-spark3.4_2.12-0.7.0-SNAPSHOT-jar-with-dependencies.jar \
     data --num-files=2 --num-rows=200 --exclude-negative-zero --generate-arrays --generate-structs --generate-maps
 ```

@@ -77,7 +77,7 @@ Generate random queries that are based on the available test files.
 $SPARK_HOME/bin/spark-submit \
     --master $SPARK_MASTER \
     --class org.apache.comet.fuzz.Main \
-    target/comet-fuzz-spark3.4_2.12-0.6.0-SNAPSHOT-jar-with-dependencies.jar \
+    target/comet-fuzz-spark3.4_2.12-0.7.0-SNAPSHOT-jar-with-dependencies.jar \
     queries --num-files=2 --num-queries=500
 ```

@@ -99,7 +99,7 @@ $SPARK_HOME/bin/spark-submit \
     --conf spark.driver.extraClassPath=$COMET_JAR \
     --conf spark.executor.extraClassPath=$COMET_JAR \
     --class org.apache.comet.fuzz.Main \
-    target/comet-fuzz-spark3.4_2.12-0.6.0-SNAPSHOT-jar-with-dependencies.jar \
+    target/comet-fuzz-spark3.4_2.12-0.7.0-SNAPSHOT-jar-with-dependencies.jar \
     run --num-files=2 --filename=queries.sql
 ```
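
All of the updated documentation assumes jars built from the bumped 0.7.0-SNAPSHOT sources. A hedged sketch of producing them with the repository's Maven wrapper (default profiles are assumed here; artifact names vary with the Spark/Scala profile selected):

```console
# Build the Comet Spark plugin and fuzz-testing jars from the bumped sources.
./mvnw clean install -DskipTests
ls spark/target/comet-spark-spark3.4_2.12-0.7.0-SNAPSHOT.jar
ls fuzz-testing/target/comet-fuzz-spark3.4_2.12-0.7.0-SNAPSHOT-jar-with-dependencies.jar
```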
