12 changes: 2 additions & 10 deletions README.md
@@ -27,11 +27,7 @@ top-level dir. For more details, see the related section in
 [building-spark.md](https://github.com/apache/spark/blob/master/docs/building-spark.md#building-a-runnable-distribution)
 
 
-The integration tests also need a local path to the directory that
-contains `Dockerfile`s. In the main spark repo, the path is
-`/spark/resource-managers/kubernetes/docker/src/main/dockerfiles`.
-
-Once you prepare the inputs, the integration tests can be executed with Maven or
+Once you prepare the tarball, the integration tests can be executed with Maven or
 your IDE. Note that when running tests from an IDE, the `pre-integration-test`
 phase must be run every time the Spark main code changes. When running tests
 from the command line, the `pre-integration-test` phase should automatically be
@@ -41,8 +37,7 @@ With Maven, the integration test can be run using the following command:
 
 ```
 $ mvn clean integration-test \
-  -Dspark-distro-tgz=spark/spark-2.3.0-SNAPSHOT-bin.tgz \
-  -Dspark-dockerfiles-dir=spark/resource-managers/kubernetes/docker/src/main/dockerfiles
+  -Dspark-distro-tgz=spark/spark-2.3.0-SNAPSHOT-bin.tgz
 ```
 
 # Running against an arbitrary cluster
@@ -51,7 +46,6 @@ In order to run against any cluster, use the following:
 ```sh
 $ mvn clean integration-test \
   -Dspark-distro-tgz=spark/spark-2.3.0-SNAPSHOT-bin.tgz \
-  -Dspark-dockerfiles-dir=spark/resource-managers/kubernetes/docker/src/main/dockerfiles
   -DextraScalaTestArgs="-Dspark.kubernetes.test.master=k8s://https://<master> -Dspark.docker.test.driverImage=<driver-image> -Dspark.docker.test.executorImage=<executor-image>"
 ```
 
@@ -67,7 +61,6 @@ property `spark.docker.test.persistMinikube` to the test process:
 ```
 $ mvn clean integration-test \
   -Dspark-distro-tgz=spark/spark-2.3.0-SNAPSHOT-bin.tgz \
-  -Dspark-dockerfiles-dir=spark/resource-managers/kubernetes/docker/src/main/dockerfiles
   -DextraScalaTestArgs=-Dspark.docker.test.persistMinikube=true
 ```
 
@@ -85,6 +78,5 @@ is an example:
 ```
 $ mvn clean integration-test \
   -Dspark-distro-tgz=spark/spark-2.3.0-SNAPSHOT-bin.tgz \
-  -Dspark-dockerfiles-dir=spark/resource-managers/kubernetes/docker/src/main/dockerfiles
   "-DextraScalaTestArgs=-Dspark.docker.test.persistMinikube=true -Dspark.docker.test.skipBuildImages=true"
 ```
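With the `spark-dockerfiles-dir` property gone, the only input the tests need is the distribution tarball itself. A minimal sketch of the simplified invocation, assuming a tarball already built per building-spark.md (the path is illustrative):

```shell
# Step 1 (not shown): build a runnable distribution tarball, as described
# in building-spark.md.
# Step 2: pass only the tarball; the Dockerfiles now ship inside it, so no
# separate -Dspark-dockerfiles-dir is required.
TARBALL="spark/spark-2.3.0-SNAPSHOT-bin.tgz"   # illustrative path
CMD="mvn clean integration-test -Dspark-distro-tgz=$TARBALL"
echo "$CMD"
```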
16 changes: 0 additions & 16 deletions integration-test/pom.xml
@@ -139,22 +139,6 @@
         </arguments>
       </configuration>
     </execution>
-    <execution>
-      <!-- TODO: Remove this hack once the upstream is fixed -->
-      <id>copy-dockerfiles-if-missing</id>
-      <phase>pre-integration-test</phase>
-      <goals>
-        <goal>exec</goal>
-      </goals>
-      <configuration>
-        <workingDirectory>${project.build.directory}/spark-distro</workingDirectory>
-        <executable>/bin/sh</executable>
-        <arguments>
-          <argument>-c</argument>
-          <argument>test -d dockerfiles || cp -pr ${spark-dockerfiles-dir} dockerfiles</argument>
-        </arguments>
-      </configuration>
-    </execution>
   </executions>
 </plugin>
 <plugin>
@@ -32,9 +32,10 @@ private[spark] class SparkDockerImageBuilder
 
   private val DOCKER_BUILD_PATH = SPARK_DISTRO_PATH
   // Dockerfile paths must be relative to the build path.
-  private val BASE_DOCKER_FILE = "dockerfiles/spark-base/Dockerfile"
-  private val DRIVER_DOCKER_FILE = "dockerfiles/driver/Dockerfile"
-  private val EXECUTOR_DOCKER_FILE = "dockerfiles/executor/Dockerfile"
+  private val DOCKERFILES_DIR = "kubernetes/dockerfiles/"
+  private val BASE_DOCKER_FILE = DOCKERFILES_DIR + "spark-base/Dockerfile"
+  private val DRIVER_DOCKER_FILE = DOCKERFILES_DIR + "driver/Dockerfile"
+  private val EXECUTOR_DOCKER_FILE = DOCKERFILES_DIR + "executor/Dockerfile"
   private val TIMEOUT = PatienceConfiguration.Timeout(Span(2, Minutes))
   private val INTERVAL = PatienceConfiguration.Interval(Span(2, Seconds))
   private val dockerHost = dockerEnv.getOrElse("DOCKER_HOST",
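The net effect of this change is that the image builder resolves its Dockerfiles under `kubernetes/dockerfiles/` relative to the unpacked distribution, instead of a user-supplied directory. A shell sketch of the layout the builder's relative paths assume (the temp directory stands in for the unpacked distro; everything beyond the paths shown in the diff is illustrative):

```shell
# Emulate the directory layout the image builder expects inside the
# unpacked distribution.
DISTRO="$(mktemp -d)"   # stands in for the spark-distro build path
for role in spark-base driver executor; do
  mkdir -p "$DISTRO/kubernetes/dockerfiles/$role"
  touch "$DISTRO/kubernetes/dockerfiles/$role/Dockerfile"
done

# Each relative path (DOCKERFILES_DIR + "<role>/Dockerfile") resolves
# against the build path; confirm all three Dockerfiles are reachable.
for role in spark-base driver executor; do
  test -f "$DISTRO/kubernetes/dockerfiles/$role/Dockerfile" && echo "$role: found"
done
# prints: spark-base: found / driver: found / executor: found
```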