
Commit f4bde1c

chore: Change default Spark version to 3.5 (#1620)
1 parent e238392 commit f4bde1c

File tree: 9 files changed (+27, −26 lines)


.github/actions/setup-spark-builder/action.yaml

Lines changed: 2 additions & 4 deletions
@@ -19,13 +19,11 @@ name: Setup Spark Builder
 description: 'Setup Apache Spark to run SQL tests'
 inputs:
   spark-short-version:
-    description: 'The Apache Spark short version (e.g., 3.4) to build'
+    description: 'The Apache Spark short version (e.g., 3.5) to build'
     required: true
-    default: '3.4'
   spark-version:
-    description: 'The Apache Spark version (e.g., 3.4.3) to build'
+    description: 'The Apache Spark version (e.g., 3.5.5) to build'
     required: true
-    default: '3.4.3'
 runs:
   using: "composite"
   steps:
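Since the `default` entries were deleted from both inputs, workflows must now pass the Spark versions explicitly. A hypothetical step invoking this composite action after the change might look like this (the `uses` path assumes invocation from within this repository; the step name is illustrative):

```yaml
# Hypothetical usage of the setup-spark-builder composite action after this
# change: both inputs are required and no longer have defaults.
- name: Setup Spark
  uses: ./.github/actions/setup-spark-builder
  with:
    spark-short-version: '3.5'
    spark-version: '3.5.5'
```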

.github/workflows/docker-publish.yml

Lines changed: 1 addition & 1 deletion
@@ -73,6 +73,6 @@ jobs:
         with:
           platforms: linux/amd64,linux/arm64
           push: true
-          tags: ghcr.io/apache/datafusion-comet:spark-3.4-scala-2.12-${{ env.COMET_VERSION }}
+          tags: ghcr.io/apache/datafusion-comet:spark-3.5-scala-2.12-${{ env.COMET_VERSION }}
           file: kube/Dockerfile
           no-cache: true
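The published image tag embeds the Spark short version, the Scala binary version, and the Comet version, which is why it changes here. A small sketch of how the tag string is assembled (the `COMET_VERSION` value below is a placeholder for illustration; in the workflow it comes from `env.COMET_VERSION`):

```shell
# Sketch: assemble the image tag the workflow pushes.
# COMET_VERSION is a placeholder value for illustration only.
SPARK_SHORT=3.5
SCALA_BINARY=2.12
COMET_VERSION=0.8.0
TAG="ghcr.io/apache/datafusion-comet:spark-${SPARK_SHORT}-scala-${SCALA_BINARY}-${COMET_VERSION}"
echo "$TAG"
```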

docs/source/contributor-guide/debugging.md

Lines changed: 1 addition & 1 deletion
@@ -130,7 +130,7 @@ Then build the Comet as [described](https://github.com/apache/arrow-datafusion-c
 Start Comet with `RUST_BACKTRACE=1`
 
 ```console
-RUST_BACKTRACE=1 $SPARK_HOME/spark-shell --jars spark/target/comet-spark-spark3.4_2.12-0.8.0-SNAPSHOT.jar --conf spark.plugins=org.apache.spark.CometPlugin --conf spark.comet.enabled=true --conf spark.comet.exec.enabled=true
+RUST_BACKTRACE=1 $SPARK_HOME/spark-shell --jars spark/target/comet-spark-spark3.5_2.12-0.8.0-SNAPSHOT.jar --conf spark.plugins=org.apache.spark.CometPlugin --conf spark.comet.enabled=true --conf spark.comet.exec.enabled=true
 ```
 
 Get the expanded exception details

docs/source/contributor-guide/development.md

Lines changed: 3 additions & 3 deletions
@@ -109,15 +109,15 @@ The tests can be run with:
 
 ```sh
 export SPARK_HOME=`pwd`
-./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite" -nsu test
+./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite" -Pspark-3.4 -nsu test
 ./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite" -Pspark-3.5 -nsu test
 ./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite" -Pspark-4.0 -nsu test
 ```
 
 and
 ```sh
 export SPARK_HOME=`pwd`
-./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV2_7_PlanStabilitySuite" -nsu test
+./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV2_7_PlanStabilitySuite" -Pspark-3.4 -nsu test
 ./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV2_7_PlanStabilitySuite" -Pspark-3.5 -nsu test
 ./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV2_7_PlanStabilitySuite" -Pspark-4.0 -nsu test
 ```
@@ -127,7 +127,7 @@ To regenerate the golden files, you can run the following commands.
 
 ```sh
 export SPARK_HOME=`pwd`
-SPARK_GENERATE_GOLDEN_FILES=1 ./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite" -nsu test
+SPARK_GENERATE_GOLDEN_FILES=1 ./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite" -Pspark-3.4 -nsu test
 SPARK_GENERATE_GOLDEN_FILES=1 ./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite" -Pspark-3.5 -nsu test
 SPARK_GENERATE_GOLDEN_FILES=1 ./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite" -Pspark-4.0 -nsu test
 ```

docs/source/user-guide/datasources.md

Lines changed: 2 additions & 2 deletions
@@ -51,12 +51,12 @@ Unlike to native Comet reader the Datafusion reader fully supports nested types
 To build Comet with native DataFusion reader and remote HDFS support it is required to have a JDK installed
 
 Example:
-Build a Comet for `spark-3.4` provide a JDK path in `JAVA_HOME`
+Build a Comet for `spark-3.5` provide a JDK path in `JAVA_HOME`
 Provide the JRE linker path in `RUSTFLAGS`, the path can vary depending on the system. Typically JRE linker is a part of installed JDK
 
 ```shell
 export JAVA_HOME="/opt/homebrew/opt/openjdk@11"
-make release PROFILES="-Pspark-3.4" COMET_FEATURES=hdfs RUSTFLAGS="-L $JAVA_HOME/libexec/openjdk.jdk/Contents/Home/lib/server"
+make release PROFILES="-Pspark-3.5" COMET_FEATURES=hdfs RUSTFLAGS="-L $JAVA_HOME/libexec/openjdk.jdk/Contents/Home/lib/server"
 ```
 
 Start Comet with experimental reader and HDFS support as [described](installation.md/#run-spark-shell-with-comet-enabled)

docs/source/user-guide/installation.md

Lines changed: 2 additions & 2 deletions
@@ -85,7 +85,7 @@ See the [Comet Kubernetes Guide](kubernetes.md) guide.
 Make sure `SPARK_HOME` points to the same Spark version as Comet was built for.
 
 ```shell
-export COMET_JAR=spark/target/comet-spark-spark3.4_2.12-0.8.0-SNAPSHOT.jar
+export COMET_JAR=spark/target/comet-spark-spark3.5_2.12-0.8.0-SNAPSHOT.jar
 
 $SPARK_HOME/bin/spark-shell \
   --jars $COMET_JAR \
@@ -141,7 +141,7 @@ explicitly contain Comet otherwise Spark may use a different class-loader for th
 components which will then fail at runtime. For example:
 
 ```
---driver-class-path spark/target/comet-spark-spark3.4_2.12-0.8.0-SNAPSHOT.jar
+--driver-class-path spark/target/comet-spark-spark3.5_2.12-0.8.0-SNAPSHOT.jar
 ```
 
 Some cluster managers may require additional configuration, see <https://spark.apache.org/docs/latest/cluster-overview.html>
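The jar path changes with this commit because the artifact name encodes the Spark short version, the Scala binary version, and the Comet version. A sketch of the naming scheme (the variable names below are illustrative, not part of the build):

```shell
# Sketch: how the Comet jar path is composed from the build coordinates.
SPARK_SHORT=3.5
SCALA_BINARY=2.12
COMET_VERSION=0.8.0-SNAPSHOT
COMET_JAR="spark/target/comet-spark-spark${SPARK_SHORT}_${SCALA_BINARY}-${COMET_VERSION}.jar"
echo "$COMET_JAR"
```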

docs/source/user-guide/source.md

Lines changed: 4 additions & 4 deletions
@@ -38,7 +38,7 @@ cd apache-datafusion-comet-$COMET_VERSION
 Build
 
 ```console
-make release-nogit PROFILES="-Pspark-3.4"
+make release-nogit PROFILES="-Pspark-3.5"
 ```
 
 ## Building from the GitHub repository
@@ -53,17 +53,17 @@ Build Comet for a specific Spark version:
 
 ```console
 cd datafusion-comet
-make release PROFILES="-Pspark-3.4"
+make release PROFILES="-Pspark-3.5"
 ```
 
 Note that the project builds for Scala 2.12 by default but can be built for Scala 2.13 using an additional profile:
 
 ```console
-make release PROFILES="-Pspark-3.4 -Pscala-2.13"
+make release PROFILES="-Pspark-3.5 -Pscala-2.13"
 ```
 
 To build Comet from the source distribution on an isolated environment without an access to `github.com` it is necessary to disable `git-commit-id-maven-plugin`, otherwise you will face errors that there is no access to the git during the build process. In that case you may use:
 
 ```console
-make release-nogit PROFILES="-Pspark-3.4"
+make release-nogit PROFILES="-Pspark-3.5"
 ```
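Only the default changes here; the other Spark versions remain selectable through their Maven profiles (profile names as used in the development docs above). A quick sketch enumerating the equivalent build invocations:

```shell
# Sketch: the default build now targets Spark 3.5, but each supported Spark
# version can still be selected explicitly via its Maven profile.
for v in 3.4 3.5 4.0; do
  echo "make release PROFILES=\"-Pspark-$v\""
done
```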

kube/Dockerfile

Lines changed: 4 additions & 4 deletions
@@ -15,7 +15,7 @@
 # limitations under the License.
 #
 
-FROM apache/spark:3.4.3 AS builder
+FROM apache/spark:3.5.5 AS builder
 
 USER root
 
@@ -28,7 +28,7 @@ RUN apt update \
 RUN curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
 ENV PATH="/root/.cargo/bin:${PATH}"
 ENV RUSTFLAGS="-C debuginfo=line-tables-only -C incremental=false"
-ENV SPARK_VERSION=3.4
+ENV SPARK_VERSION=3.5
 ENV SCALA_VERSION=2.12
 
 # copy source files to Docker image
@@ -61,8 +61,8 @@ RUN mkdir -p /root/.m2 && \
 RUN cd /comet \
 && JAVA_HOME=$(readlink -f $(which javac) | sed "s/\/bin\/javac//") make release-nogit PROFILES="-Pspark-$SPARK_VERSION -Pscala-$SCALA_VERSION"
 
-FROM apache/spark:3.4.3
-ENV SPARK_VERSION=3.4
+FROM apache/spark:3.5.5
+ENV SPARK_VERSION=3.5
 ENV SCALA_VERSION=2.12
 USER root
 
pom.xml

Lines changed: 8 additions & 5 deletions
@@ -47,13 +47,13 @@ under the License.
     <java.version>11</java.version>
     <maven.compiler.source>${java.version}</maven.compiler.source>
     <maven.compiler.target>${java.version}</maven.compiler.target>
-    <scala.version>2.12.17</scala.version>
+    <scala.version>2.12.18</scala.version>
     <scala.binary.version>2.12</scala.binary.version>
     <scala.plugin.version>4.7.2</scala.plugin.version>
     <scalatest.version>3.2.16</scalatest.version>
     <scalatest-maven-plugin.version>2.2.0</scalatest-maven-plugin.version>
-    <spark.version>3.4.3</spark.version>
-    <spark.version.short>3.4</spark.version.short>
+    <spark.version>3.5.5</spark.version>
+    <spark.version.short>3.5</spark.version.short>
     <spark.maven.scope>provided</spark.maven.scope>
     <protobuf.version>3.25.5</protobuf.version>
     <parquet.version>1.13.1</parquet.version>
@@ -64,7 +64,7 @@ under the License.
     <spotless.version>2.43.0</spotless.version>
     <jacoco.version>0.8.11</jacoco.version>
     <semanticdb.version>4.8.8</semanticdb.version>
-    <slf4j.version>2.0.6</slf4j.version>
+    <slf4j.version>2.0.7</slf4j.version>
     <guava.version>33.2.1-jre</guava.version>
     <jni.dir>${project.basedir}/../native/target/debug</jni.dir>
     <platform>darwin</platform>
@@ -97,7 +97,7 @@ under the License.
     </extraJavaTestArgs>
     <argLine>-ea -Xmx4g -Xss4m ${extraJavaTestArgs}</argLine>
     <shims.majorVerSrc>spark-3.x</shims.majorVerSrc>
-    <shims.minorVerSrc>spark-3.4</shims.minorVerSrc>
+    <shims.minorVerSrc>spark-3.5</shims.minorVerSrc>
   </properties>
 
   <dependencyManagement>
@@ -555,8 +555,11 @@ under the License.
       <id>spark-3.4</id>
       <properties>
         <scala.version>2.12.17</scala.version>
+        <spark.version>3.4.3</spark.version>
         <spark.version.short>3.4</spark.version.short>
         <parquet.version>1.13.1</parquet.version>
+        <slf4j.version>2.0.6</slf4j.version>
+        <shims.minorVerSrc>spark-3.4</shims.minorVerSrc>
       </properties>
     </profile>
 
