Commit 2d534bd

[KYUUBI #7284] Upgrade Flink and Spark to latest patched version
### Why are the changes needed?

Test with the latest patched versions:

- Flink 1.19.1 => 1.19.3
- Flink 1.20.0 => 1.20.3
- Spark 3.4.3 => 3.4.4
- Spark 3.5.5 => 3.5.7

### How was this patch tested?

Pass GHA.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #7284 from pan3793/version-bump.

Closes #7284

8a04411 [Cheng Pan] Upgrade Flink and Spark to latest patched version

Authored-by: Cheng Pan <chengpan@apache.org>
Signed-off-by: Cheng Pan <chengpan@apache.org>
1 parent 0ef4201 commit 2d534bd

File tree

9 files changed: +20 -20 lines changed

.github/workflows/master.yml

Lines changed: 5 additions & 5 deletions

@@ -82,7 +82,7 @@ jobs:
       - java: 8
         python: '3.9'
         spark: '3.5'
-        spark-archive: '-Dspark.archive.mirror=https://www.apache.org/dyn/closer.lua/spark/spark-3.4.3 -Dspark.archive.name=spark-3.4.3-bin-hadoop3.tgz -Pzookeeper-3.6'
+        spark-archive: '-Dspark.archive.mirror=https://www.apache.org/dyn/closer.lua/spark/spark-3.4.4 -Dspark.archive.name=spark-3.4.4-bin-hadoop3.tgz -Pzookeeper-3.6'
         exclude-tags: '-Dmaven.plugin.scalatest.exclude.tags=org.scalatest.tags.Slow,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.IcebergTest,org.apache.kyuubi.tags.PaimonTest,org.apache.kyuubi.tags.SparkLocalClusterTest'
         comment: 'verify-on-spark-3.4-binary'
       - java: 17
@@ -277,7 +277,7 @@ jobs:
         comment: 'verify-on-flink-1.18-binary'
       - java: 8
         flink: '1.20'
-        flink-archive: '-Dflink.archive.mirror=https://www.apache.org/dyn/closer.lua/flink/flink-1.19.1 -Dflink.archive.name=flink-1.19.1-bin-scala_2.12.tgz'
+        flink-archive: '-Dflink.archive.mirror=https://www.apache.org/dyn/closer.lua/flink/flink-1.19.3 -Dflink.archive.name=flink-1.19.3-bin-scala_2.12.tgz'
         comment: 'verify-on-flink-1.19-binary'
     steps:
       - uses: actions/checkout@v4
@@ -439,14 +439,14 @@ jobs:
           cache-binary: false
       - name: Pull Spark image
         run: |
-          docker pull apache/spark:3.5.5
+          docker pull apache/spark:3.5.7
       - name: Build Kyuubi Docker Image
         uses: docker/build-push-action@v6
         with:
           # passthrough CI into build container
           build-args: |
             CI=${CI}
-            BASE_IMAGE=apache/spark:3.5.5
+            BASE_IMAGE=apache/spark:3.5.7
             MVN_ARG=--spark-provided --flink-provided --hive-provided
           context: .
           file: build/Dockerfile.CI
@@ -463,7 +463,7 @@ jobs:
           # https://minikube.sigs.k8s.io/docs/handbook/pushing/#7-loading-directly-to-in-cluster-container-runtime
           minikube image load apache/kyuubi:ci
           # pre-install spark into minikube
-          minikube image load apache/spark:3.5.5
+          minikube image load apache/spark:3.5.7
       - name: kubectl pre-check
         run: |
           kubectl get nodes

bin/docker-image-tool.sh

Lines changed: 2 additions & 2 deletions

@@ -181,8 +181,8 @@ Examples:
   $0 -r docker.io/myrepo -t v1.8.1 build
   $0 -r docker.io/myrepo -t v1.8.1 push

-  - Build and push with tag "v1.8.1" and Spark-3.5.5 as base image to docker.io/myrepo
-    $0 -r docker.io/myrepo -t v1.8.1 -b BASE_IMAGE=repo/spark:3.5.5 build
+  - Build and push with tag "v1.8.1" and Spark-3.5.7 as base image to docker.io/myrepo
+    $0 -r docker.io/myrepo -t v1.8.1 -b BASE_IMAGE=repo/spark:3.5.7 build
     $0 -r docker.io/myrepo -t v1.8.1 push

   - Build and push for multiple archs to docker.io/myrepo

docker/playground/.env

Lines changed: 1 addition & 1 deletion

@@ -24,7 +24,7 @@ KYUUBI_HADOOP_VERSION=3.3.6
 POSTGRES_VERSION=12
 POSTGRES_JDBC_VERSION=42.3.4
 SCALA_BINARY_VERSION=2.12
-SPARK_VERSION=3.4.3
+SPARK_VERSION=3.4.4
 SPARK_BINARY_VERSION=3.4
 SPARK_HADOOP_VERSION=3.3.4
 ZOOKEEPER_VERSION=3.6.3

docs/deployment/kyuubi_on_kubernetes.md

Lines changed: 2 additions & 2 deletions

@@ -42,8 +42,8 @@ Examples:
   $0 -r docker.io/myrepo -t v1.8.1 build
   $0 -r docker.io/myrepo -t v1.8.1 push

-  - Build and push with tag "v1.8.1" and Spark-3.5.5 as base image to docker.io/myrepo
-    $0 -r docker.io/myrepo -t v1.8.1 -b BASE_IMAGE=repo/spark:3.5.5 build
+  - Build and push with tag "v1.8.1" and Spark-3.5.7 as base image to docker.io/myrepo
+    $0 -r docker.io/myrepo -t v1.8.1 -b BASE_IMAGE=repo/spark:3.5.7 build
     $0 -r docker.io/myrepo -t v1.8.1 push

   - Build and push for multiple archs to docker.io/myrepo

docs/extensions/engines/spark/lineage.md

Lines changed: 1 addition & 1 deletion

@@ -117,7 +117,7 @@ Sometimes, it may be incompatible with other Spark distributions, then you may n
 For example,

 ```shell
-build/mvn clean package -pl :kyuubi-spark-lineage_2.12 -am -DskipTests -Dspark.version=3.5.5
+build/mvn clean package -pl :kyuubi-spark-lineage_2.12 -am -DskipTests -Dspark.version=3.5.7
 ```

 The available `spark.version`s are shown in the following table.

extensions/spark/kyuubi-spark-connector-hive/src/main/scala/org/apache/kyuubi/spark/connector/hive/HiveConnectorUtils.scala

Lines changed: 1 addition & 1 deletion

@@ -122,7 +122,7 @@ object HiveConnectorUtils extends Logging {
         isSplitable,
         maxSplitBytes,
         partitionValues)
-    }.recover { case _: Exception => // SPARK-51185: Spark 3.5.5
+    }.recover { case _: Exception => // SPARK-51185: Spark 3.5.7
       val fileStatusWithMetadataClz = DynClasses.builder()
         .impl("org.apache.spark.sql.execution.datasources.FileStatusWithMetadata")
         .buildChecked()
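The `recover` branch above is a reflection-based fallback: Kyuubi first tries the newer Spark API shape and, on failure, falls back for builds that predate the SPARK-51185 change, so the version bump here only updates a comment. A minimal sketch of the same try-then-fall-back idea, using plain `Class.forName` instead of the project's `DynClasses` builder (the choice of fallback class below is illustrative, not Kyuubi's actual logic):

```scala
import scala.util.Try

object VersionedLookup {
  // Prefer a class that only newer Spark versions ship; fall back when it
  // is absent from the classpath (e.g. an older Spark distribution).
  def fileStatusClass(): Class[_] =
    Try(Class.forName(
      "org.apache.spark.sql.execution.datasources.FileStatusWithMetadata"))
      .recover { case _: ClassNotFoundException =>
        Class.forName("org.apache.hadoop.fs.FileStatus") // illustrative fallback
      }
      .get
}
```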

integration-tests/kyuubi-kubernetes-it/src/test/scala/org/apache/kyuubi/kubernetes/test/deployment/KyuubiOnKubernetesTestsSuite.scala

Lines changed: 1 addition & 1 deletion

@@ -56,7 +56,7 @@ class KyuubiOnKubernetesWithSparkTestsBase extends WithKyuubiServerOnKubernetes
     Map(
       "spark.master" -> s"k8s://$miniKubeApiMaster",
       // We should update spark docker image in ./github/workflows/master.yml at the same time
-      "spark.kubernetes.container.image" -> "apache/spark:3.5.5",
+      "spark.kubernetes.container.image" -> "apache/spark:3.5.7",
       "spark.kubernetes.container.image.pullPolicy" -> "IfNotPresent",
       "spark.executor.memory" -> "512M",
       "spark.driver.memory" -> "1024M",

integration-tests/kyuubi-kubernetes-it/src/test/scala/org/apache/kyuubi/kubernetes/test/spark/SparkOnKubernetesTestsSuite.scala

Lines changed: 1 addition & 1 deletion

@@ -51,7 +51,7 @@ abstract class SparkOnKubernetesSuiteBase
     // TODO Support more Spark version
     // Spark official docker image: https://hub.docker.com/r/apache/spark/tags
     KyuubiConf().set("spark.master", s"k8s://$apiServerAddress")
-      .set("spark.kubernetes.container.image", "apache/spark:3.5.5")
+      .set("spark.kubernetes.container.image", "apache/spark:3.5.7")
       .set("spark.kubernetes.container.image.pullPolicy", "IfNotPresent")
       .set("spark.executor.instances", "1")
       .set("spark.executor.memory", "512M")

pom.xml

Lines changed: 6 additions & 6 deletions

@@ -141,7 +141,7 @@
     <failsafe.verion>3.3.2</failsafe.verion>
     <fb303.version>0.9.3</fb303.version>
     <flexmark.version>0.62.2</flexmark.version>
-    <flink.version>1.20.0</flink.version>
+    <flink.version>1.20.3</flink.version>
     <flink.archive.name>flink-${flink.version}-bin-scala_2.12.tgz</flink.archive.name>
     <flink.archive.mirror>${apache.archive.dist}/flink/flink-${flink.version}</flink.archive.mirror>
     <flink.archive.query>?action=download</flink.archive.query>
@@ -207,7 +207,7 @@
     DO NOT forget to change the following properties when change the minor version of Spark:
     `delta.version`, `delta.artifact`, `maven.plugin.scalatest.exclude.tags`
     -->
-    <spark.version>3.5.5</spark.version>
+    <spark.version>3.5.7</spark.version>
     <spark.binary.version>3.5</spark.binary.version>
     <spark.archive.scala.suffix></spark.archive.scala.suffix>
     <spark.archive.name>spark-${spark.version}-bin-hadoop3${spark.archive.scala.suffix}.tgz</spark.archive.name>
@@ -2012,7 +2012,7 @@
         <module>extensions/spark/kyuubi-spark-connector-hive</module>
       </modules>
       <properties>
-        <spark.version>3.4.3</spark.version>
+        <spark.version>3.4.4</spark.version>
         <spark.binary.version>3.4</spark.binary.version>
         <delta.version>2.4.0</delta.version>
         <delta.artifact>delta-core_${scala.binary.version}</delta.artifact>
@@ -2027,7 +2027,7 @@
         <module>extensions/spark/kyuubi-spark-connector-hive</module>
       </modules>
       <properties>
-        <spark.version>3.5.5</spark.version>
+        <spark.version>3.5.7</spark.version>
         <spark.binary.version>3.5</spark.binary.version>
         <delta.version>3.3.1</delta.version>
         <delta.artifact>delta-spark_${scala.binary.version}</delta.artifact>
@@ -2125,14 +2125,14 @@
     <profile>
       <id>flink-1.19</id>
       <properties>
-        <flink.version>1.19.1</flink.version>
+        <flink.version>1.19.3</flink.version>
       </properties>
     </profile>

     <profile>
       <id>flink-1.20</id>
       <properties>
-        <flink.version>1.20.0</flink.version>
+        <flink.version>1.20.3</flink.version>
       </properties>
     </profile>