Commit b3583fb

kimoonkim authored and foxish committed
Remove unnecessary copy dockerfiles step (#9)
1 parent f8a9dec commit b3583fb

3 files changed: +6 additions, -29 deletions

README.md

Lines changed: 2 additions & 10 deletions
````diff
@@ -27,11 +27,7 @@ top-level dir. For more details, see the related section in
 [building-spark.md](https://github.com/apache/spark/blob/master/docs/building-spark.md#building-a-runnable-distribution)
 
 
-The integration tests also need a local path to the directory that
-contains `Dockerfile`s. In the main spark repo, the path is
-`/spark/resource-managers/kubernetes/docker/src/main/dockerfiles`.
-
-Once you prepare the inputs, the integration tests can be executed with Maven or
+Once you prepare the tarball, the integration tests can be executed with Maven or
 your IDE. Note that when running tests from an IDE, the `pre-integration-test`
 phase must be run every time the Spark main code changes. When running tests
 from the command line, the `pre-integration-test` phase should automatically be
@@ -41,8 +37,7 @@ With Maven, the integration test can be run using the following command:
 
 ```
 $ mvn clean integration-test \
-  -Dspark-distro-tgz=spark/spark-2.3.0-SNAPSHOT-bin.tgz \
-  -Dspark-dockerfiles-dir=spark/resource-managers/kubernetes/docker/src/main/dockerfiles
+  -Dspark-distro-tgz=spark/spark-2.3.0-SNAPSHOT-bin.tgz
 ```
 
 # Running against an arbitrary cluster
@@ -51,7 +46,6 @@ In order to run against any cluster, use the following:
 ```sh
 $ mvn clean integration-test \
   -Dspark-distro-tgz=spark/spark-2.3.0-SNAPSHOT-bin.tgz \
-  -Dspark-dockerfiles-dir=spark/resource-managers/kubernetes/docker/src/main/dockerfiles
   -DextraScalaTestArgs="-Dspark.kubernetes.test.master=k8s://https://<master> -Dspark.docker.test.driverImage=<driver-image> -Dspark.docker.test.executorImage=<executor-image>"
 ```
 
@@ -67,7 +61,6 @@ property `spark.docker.test.persistMinikube` to the test process:
 ```
 $ mvn clean integration-test \
   -Dspark-distro-tgz=spark/spark-2.3.0-SNAPSHOT-bin.tgz \
-  -Dspark-dockerfiles-dir=spark/resource-managers/kubernetes/docker/src/main/dockerfiles
   -DextraScalaTestArgs=-Dspark.docker.test.persistMinikube=true
 ```
 
@@ -85,6 +78,5 @@ is an example:
 ```
 $ mvn clean integration-test \
   -Dspark-distro-tgz=spark/spark-2.3.0-SNAPSHOT-bin.tgz \
-  -Dspark-dockerfiles-dir=spark/resource-managers/kubernetes/docker/src/main/dockerfiles
   "-DextraScalaTestArgs=-Dspark.docker.test.persistMinikube=true -Dspark.docker.test.skipBuildImages=true"
 ```
````
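
With this change, the Dockerfiles ship inside the distribution tarball itself (under `kubernetes/dockerfiles/` once it is unpacked), so the separate `-Dspark-dockerfiles-dir` property is no longer needed. As a quick illustration, here is a hypothetical sanity check, not part of this commit, that the unpacked distro (which the pom places in `${project.build.directory}/spark-distro`) actually carries the Dockerfiles:

```scala
// Hypothetical sanity check (illustrative only): verify that the unpacked
// distribution bundles the Dockerfiles under kubernetes/dockerfiles/.
import java.nio.file.{Files, Paths}

object VerifyDistroDockerfiles {
  def main(args: Array[String]): Unit = {
    // The pre-integration-test phase unpacks the tarball here (per the pom).
    val distro = Paths.get("target", "spark-distro")
    val dockerfiles = distro.resolve("kubernetes/dockerfiles")
    require(Files.isDirectory(dockerfiles),
      s"Expected Dockerfiles under $dockerfiles; build the distribution " +
        "from a Spark version that bundles them in the tarball.")
    println(s"Found Dockerfiles under $dockerfiles")
  }
}
```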

integration-test/pom.xml

Lines changed: 0 additions & 16 deletions
```diff
@@ -139,22 +139,6 @@
           </arguments>
         </configuration>
       </execution>
-      <execution>
-        <!-- TODO: Remove this hack once the upstream is fixed -->
-        <id>copy-dockerfiles-if-missing</id>
-        <phase>pre-integration-test</phase>
-        <goals>
-          <goal>exec</goal>
-        </goals>
-        <configuration>
-          <workingDirectory>${project.build.directory}/spark-distro</workingDirectory>
-          <executable>/bin/sh</executable>
-          <arguments>
-            <argument>-c</argument>
-            <argument>test -d dockerfiles || cp -pr ${spark-dockerfiles-dir} dockerfiles</argument>
-          </arguments>
-        </configuration>
-      </execution>
     </executions>
   </plugin>
   <plugin>
```
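
The execution removed above was a stop-gap for Spark distributions that did not yet bundle the Dockerfiles: it copied them into the unpacked distro only when the `dockerfiles` directory was missing. For reference, a sketch of that same copy-if-missing logic in Scala (the object name and paths are illustrative, not from this repo):

```scala
// Sketch of the removed copy-dockerfiles hack:
//   test -d dockerfiles || cp -pr ${spark-dockerfiles-dir} dockerfiles
import java.nio.file.{Files, Path, Paths, StandardCopyOption}
import scala.collection.JavaConverters._

object CopyDockerfilesIfMissing {
  // Recursively copy src into dst, preserving file attributes (like cp -pr).
  def copyTree(src: Path, dst: Path): Unit = {
    Files.walk(src).iterator().asScala.foreach { p =>
      val target = dst.resolve(src.relativize(p))
      if (Files.isDirectory(p)) Files.createDirectories(target)
      else Files.copy(p, target, StandardCopyOption.COPY_ATTRIBUTES)
    }
  }

  def main(args: Array[String]): Unit = {
    val workDir = Paths.get("target", "spark-distro")
    val dockerfiles = workDir.resolve("dockerfiles")
    // Only copy when the distro does not already ship the Dockerfiles.
    if (!Files.isDirectory(dockerfiles)) {
      copyTree(Paths.get(sys.props("spark-dockerfiles-dir")), dockerfiles)
    }
  }
}
```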

integration-test/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/docker/SparkDockerImageBuilder.scala

Lines changed: 4 additions & 3 deletions
```diff
@@ -32,9 +32,10 @@ private[spark] class SparkDockerImageBuilder
 
   private val DOCKER_BUILD_PATH = SPARK_DISTRO_PATH
   // Dockerfile paths must be relative to the build path.
-  private val BASE_DOCKER_FILE = "dockerfiles/spark-base/Dockerfile"
-  private val DRIVER_DOCKER_FILE = "dockerfiles/driver/Dockerfile"
-  private val EXECUTOR_DOCKER_FILE = "dockerfiles/executor/Dockerfile"
+  private val DOCKERFILES_DIR = "kubernetes/dockerfiles/"
+  private val BASE_DOCKER_FILE = DOCKERFILES_DIR + "spark-base/Dockerfile"
+  private val DRIVER_DOCKER_FILE = DOCKERFILES_DIR + "driver/Dockerfile"
+  private val EXECUTOR_DOCKER_FILE = DOCKERFILES_DIR + "executor/Dockerfile"
   private val TIMEOUT = PatienceConfiguration.Timeout(Span(2, Minutes))
   private val INTERVAL = PatienceConfiguration.Interval(Span(2, Seconds))
   private val dockerHost = dockerEnv.getOrElse("DOCKER_HOST",
```