Commit 7369772

config values

1 parent 55e97ea commit 7369772

File tree

3 files changed: +7, -11 lines changed

README.md

Lines changed: 3 additions & 7 deletions

````diff
@@ -20,7 +20,7 @@ http://spark.apache.org/downloads.html. Or, you can create a distribution from
 source code using `make-distribution.sh`. For example:
 
 ```
-$ git clone git@github.com:apache/spark.git
+$ https://github.com/apache/spark.git
 $ cd spark
 $ ./dev/make-distribution.sh --tgz \
     -Phadoop-2.7 -Pkubernetes -Pkinesis-asl -Phive -Phive-thriftserver
@@ -40,8 +40,6 @@ invoked if the `integration-test` phase is run.
 With Maven, the integration test can be run using the following command:
 
 ```
-$ mvn clean pre-integration-test \
-    -Dspark-distro-tgz=spark/spark-2.3.0-SNAPSHOT-bin.tgz
 $ mvn clean integration-test \
     -Dspark-distro-tgz=spark/spark-2.3.0-SNAPSHOT-bin.tgz
 ```
@@ -52,9 +50,7 @@ In order to run against any cluster, use the following:
 ```sh
 $ mvn clean integration-test \
     -Dspark-distro-tgz=spark/spark-2.3.0-SNAPSHOT-bin.tgz \
-    -DextraScalaTestArgs="-Dspark.kubernetes.test.master=k8s://https://<master> \
-    -Dspark.docker.test.driverImage=<driver-image> \
-    -Dspark.docker.test.executorImage=<executor-image>
+    -DextraScalaTestArgs="-Dspark.kubernetes.test.master=k8s://https://<master>
 ```
 
 # Specify existing docker images via image:tag
@@ -70,5 +66,5 @@ Here is an example:
 ```
 $ mvn clean integration-test \
     -Dspark-distro-tgz=spark/spark-2.3.0-SNAPSHOT-bin.tgz \
-    "-Dspark.kubernetes.test.imageDockerTag=latest"
+    -Dspark.kubernetes.test.imageDockerTag=latest
 ```
````
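The README changes above drop the separate `pre-integration-test` step and the quoting around the tag property, leaving a single `integration-test` invocation. A minimal sketch of the resulting command line (the tarball path mirrors the README's example; this script only prints the command rather than invoking Maven):

```shell
#!/bin/sh
# Sketch: assemble the simplified invocation from the updated README.
# DISTRO_TGZ is illustrative; point it at your locally built distribution.
DISTRO_TGZ="spark/spark-2.3.0-SNAPSHOT-bin.tgz"
CMD="mvn clean integration-test -Dspark-distro-tgz=${DISTRO_TGZ} -Dspark.kubernetes.test.imageDockerTag=latest"
echo "${CMD}"
```

Note that the `-D` system property needs no surrounding quotes here because the value contains no shell metacharacters, which is why the commit removes them.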

integration-test/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/KubernetesSuite.scala

Lines changed: 1 addition & 1 deletion

```diff
@@ -54,9 +54,9 @@ private[spark] class KubernetesSuite extends FunSuite with BeforeAndAfterAll wit
   before {
     sparkAppConf = kubernetesTestComponents.newSparkAppConf()
       .set("spark.kubernetes.driver.label.spark-app-locator", APP_LOCATOR_LABEL)
-      .set(INIT_CONTAINER_DOCKER_IMAGE, tagImage("spark-init"))
       .set(DRIVER_DOCKER_IMAGE, tagImage("spark-driver"))
       .set(EXECUTOR_DOCKER_IMAGE, tagImage("spark-executor"))
+      .set(INIT_CONTAINER_DOCKER_IMAGE, tagImage("spark-init"))
     kubernetesTestComponents.createNamespace()
   }
```

integration-test/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/config.scala

Lines changed: 3 additions & 3 deletions

```diff
@@ -18,7 +18,7 @@ package org.apache.spark.deploy.k8s.integrationtest
 
 package object config {
   val KUBERNETES_TEST_DOCKER_TAG_SYSTEM_PROPERTY = "spark.kubernetes.test.imageDockerTag"
-  val DRIVER_DOCKER_IMAGE = "spark.kubernetes.driver.docker.image"
-  val EXECUTOR_DOCKER_IMAGE = "spark.kubernetes.executor.docker.image"
-  val INIT_CONTAINER_DOCKER_IMAGE = "spark.kubernetes.initcontainer.docker.image"
+  val DRIVER_DOCKER_IMAGE = "spark.kubernetes.driver.container.image"
+  val EXECUTOR_DOCKER_IMAGE = "spark.kubernetes.executor.container.image"
+  val INIT_CONTAINER_DOCKER_IMAGE = "spark.kubernetes.initcontainer.container.image"
 }
```
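The rename in config.scala follows one pattern: the `.docker.image` suffix of each per-role property becomes `.container.image`, while the role segment (`driver`, `executor`, `initcontainer`) is unchanged. A small sketch that derives the new key from the old one for each role (the `<image>` values are placeholders):

```shell
#!/bin/sh
# Sketch: show the old -> new property-key rename for each role.
NEW_KEYS=""
for role in driver executor initcontainer; do
  old="spark.kubernetes.${role}.docker.image"
  # The rename swaps the ".docker.image" suffix for ".container.image".
  new="$(echo "${old}" | sed 's/\.docker\.image$/.container.image/')"
  echo "${old} -> ${new}"
  NEW_KEYS="${NEW_KEYS} ${new}"
done
```

This is why the KubernetesSuite hunk needs no textual change to its `.set(...)` calls: the suite references the `DRIVER_DOCKER_IMAGE`-style constants, so only the string values in config.scala move.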
