This repository was archived by the owner on Jan 9, 2020. It is now read-only.

Sync hdfs-kerberos-support to branch-2.2-kubernetes #472

Merged
Changes from all commits
Commits
39 commits
a8330eb
Merge pull request #388 from apache-spark-on-k8s/branch-2.2-kubernetes-g
foxish Jul 25, 2017
64f3ddd
Add missing code blocks (#403)
erikerlandson Jul 28, 2017
bce9b77
Add an entrypoint.sh script to add a passwd entry if one does not exi…
erikerlandson Jul 28, 2017
8ecff61
revert my COPY mods
erikerlandson Jul 29, 2017
702a8f6
Fix bug with null arguments
ifilonenko Aug 1, 2017
2c5d784
Merge pull request #407 from bloomberg/python-testing
foxish Aug 3, 2017
fa67455
Merge pull request #404 from erikerlandson/anonymous-uids
foxish Aug 3, 2017
5fdaa7f
Exclude com.sun.jersey from docker-minimal-bundle. (#420)
mccheah Aug 8, 2017
e3cfaa4
Flag-guard expensive DNS lookup of cluster node full names, part of H…
kimoonkim Aug 8, 2017
bd50627
fixes #389 - increase SparkReadinessWatcher wait time (#419)
erikerlandson Aug 8, 2017
24cd9ee
Initial architecture documentation. (#401)
mccheah Aug 8, 2017
372ae41
Allow configuration to set environment variables on driver and execut…
Aug 9, 2017
410dc9c
version 2.2.0-k8s-0.3.0
erikerlandson Aug 9, 2017
737abdc
bump to 2.2.0-k8s-0.4.0-SNAPSHOT
erikerlandson Aug 9, 2017
a46b4a3
Revert "bump to 2.2.0-k8s-0.4.0-SNAPSHOT"
erikerlandson Aug 10, 2017
ff601a3
Revert "version 2.2.0-k8s-0.3.0"
erikerlandson Aug 10, 2017
19f49d0
version 2.2.0-k8s-0.3.0
erikerlandson Aug 10, 2017
982760c
bump to 2.2.0-k8s-0.4.0-SNAPSHOT
erikerlandson Aug 10, 2017
cb645ca
Update external shuffle service docs
foxish Aug 14, 2017
437eb89
Updated with documentation (#430)
foxish Aug 14, 2017
6ab02e2
Merge pull request #431 from apache-spark-on-k8s/foxish-patch-2
foxish Aug 14, 2017
3b3aeb7
Link to architecture docs (#432)
foxish Aug 14, 2017
6e1d69e
Removed deprecated option from pom (#433)
foxish Aug 14, 2017
c457f10
Support HDFS rack locality (#350)
kimoonkim Aug 17, 2017
4a322ad
Fix license check (#442)
ash211 Aug 18, 2017
f8cf9db
Scalastyle (#446)
ash211 Aug 21, 2017
455317d
Use a secret to mount small files in driver and executors. (#437)
mccheah Aug 21, 2017
58cebd1
Updated devloper doc to include a install step for first time compila…
liyinan926 Aug 21, 2017
e44d81a
Support service account override
kimoonkim Aug 22, 2017
f7b5820
Use a list of environment variables for JVM options. (#444)
mccheah Aug 22, 2017
7959fc5
Fix indentation
kimoonkim Aug 22, 2017
2cb2074
Support executor java options. (#445)
mccheah Aug 23, 2017
0c160f5
Bumping versions to v2.2.0-kubernetes-0.3.0
Aug 24, 2017
d6e922d
Properly wrap getOrElse in a tuple (#458)
mccheah Aug 24, 2017
dca9b04
Merge pull request #460 from sahilprasad/bump-shuffle-version
liyinan926 Aug 24, 2017
e600a07
Merge pull request #451 from kimoonkim/override-service-account
foxish Aug 24, 2017
6177bf8
Add command echoing for better command debugging (#462)
erikerlandson Aug 25, 2017
c6bc19d
Fix conversion from GB to MiB (#470)
ash211 Aug 30, 2017
5c29bf8
Sync'd to branch-2.2-kubernetes
kimoonkim Aug 30, 2017
1 change: 1 addition & 0 deletions README.md
Original file line number Diff line number Diff line change
@@ -10,6 +10,7 @@ This is a collaboratively maintained project working on [SPARK-18278](https://is

- [Usage guide](https://apache-spark-on-k8s.github.io/userdocs/) shows how to run the code
- [Development docs](resource-managers/kubernetes/README.md) shows how to get set up for development
- [Architecture docs](resource-managers/kubernetes/architecture-docs/) shows the high level architecture of Spark on Kubernetes
- Code is primarily located in the [resource-managers/kubernetes](resource-managers/kubernetes) folder

## Why does this fork exist?
2 changes: 1 addition & 1 deletion assembly/pom.xml
@@ -21,7 +21,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion common/network-common/pom.xml
@@ -22,7 +22,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion common/network-shuffle/pom.xml
@@ -22,7 +22,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion common/network-yarn/pom.xml
@@ -22,7 +22,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion common/sketch/pom.xml
@@ -22,7 +22,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion common/tags/pom.xml
@@ -22,7 +22,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion common/unsafe/pom.xml
@@ -22,7 +22,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion conf/kubernetes-resource-staging-server.yaml
@@ -32,7 +32,7 @@ spec:
name: spark-resource-staging-server-config
containers:
- name: spark-resource-staging-server
image: kubespark/spark-resource-staging-server:v2.1.0-kubernetes-0.2.0
image: kubespark/spark-resource-staging-server:v2.2.0-kubernetes-0.3.0
resources:
requests:
cpu: 100m
8 changes: 4 additions & 4 deletions conf/kubernetes-shuffle-service.yaml
@@ -20,14 +20,14 @@ kind: DaemonSet
metadata:
labels:
app: spark-shuffle-service
spark-version: 2.1.0
spark-version: 2.2.0
name: shuffle
spec:
template:
metadata:
labels:
app: spark-shuffle-service
spark-version: 2.1.0
spark-version: 2.2.0
spec:
volumes:
- name: temp-volume
@@ -38,7 +38,7 @@ spec:
# This is an official image that is built
# from the dockerfiles/shuffle directory
# in the spark distribution.
image: kubespark/spark-shuffle:v2.1.0-kubernetes-0.2.0
image: kubespark/spark-shuffle:v2.2.0-kubernetes-0.3.0
imagePullPolicy: IfNotPresent
volumeMounts:
- mountPath: '/tmp'
@@ -51,4 +51,4 @@ spec:
requests:
cpu: "1"
limits:
cpu: "1"
cpu: "1"
2 changes: 1 addition & 1 deletion core/pom.xml
@@ -21,7 +21,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

@@ -639,7 +639,9 @@ object SparkSubmit extends CommandLineUtils {
if (args.isPython) {
childArgs ++= Array("--primary-py-file", args.primaryResource)
childArgs ++= Array("--main-class", "org.apache.spark.deploy.PythonRunner")
childArgs ++= Array("--other-py-files", args.pyFiles)
if (args.pyFiles != null) {
childArgs ++= Array("--other-py-files", args.pyFiles)
}
} else {
childArgs ++= Array("--primary-java-resource", args.primaryResource)
childArgs ++= Array("--main-class", args.mainClass)
66 changes: 48 additions & 18 deletions docs/running-on-kubernetes.md
@@ -17,8 +17,10 @@ cluster, you may setup a test cluster on your local machine using
* You must have appropriate permissions to create and list [pods](https://kubernetes.io/docs/user-guide/pods/),
[ConfigMaps](https://kubernetes.io/docs/tasks/configure-pod-container/configmap/) and
[secrets](https://kubernetes.io/docs/concepts/configuration/secret/) in your cluster. You can verify that
you can list these resources by running `kubectl get pods` `kubectl get configmap`, and `kubectl get secrets` which
you can list these resources by running `kubectl get pods`, `kubectl get configmap`, and `kubectl get secrets` which
should give you a list of pods, configmaps, and secrets (if any), respectively.
* The service account or credentials used by the driver pods must also have appropriate
permissions to edit pod specs.
* You must have a spark distribution with Kubernetes support. This may be obtained from the
[release tarball](https://github.com/apache-spark-on-k8s/spark/releases) or by
[building Spark with Kubernetes support](../resource-managers/kubernetes/README.md#building-spark-with-kubernetes-support).
@@ -36,15 +38,15 @@ If you wish to use pre-built docker images, you may use the images published in
<tr><th>Component</th><th>Image</th></tr>
<tr>
<td>Spark Driver Image</td>
<td><code>kubespark/spark-driver:v2.1.0-kubernetes-0.2.0</code></td>
<td><code>kubespark/spark-driver:v2.2.0-kubernetes-0.3.0</code></td>
</tr>
<tr>
<td>Spark Executor Image</td>
<td><code>kubespark/spark-executor:v2.1.0-kubernetes-0.2.0</code></td>
<td><code>kubespark/spark-executor:v2.2.0-kubernetes-0.3.0</code></td>
</tr>
<tr>
<td>Spark Initialization Image</td>
<td><code>kubespark/spark-init:v2.1.0-kubernetes-0.2.0</code></td>
<td><code>kubespark/spark-init:v2.2.0-kubernetes-0.3.0</code></td>
</tr>
</table>

@@ -80,9 +82,9 @@ are set up as described above:
--kubernetes-namespace default \
--conf spark.executor.instances=5 \
--conf spark.app.name=spark-pi \
--conf spark.kubernetes.driver.docker.image=kubespark/spark-driver:v2.1.0-kubernetes-0.2.0 \
--conf spark.kubernetes.executor.docker.image=kubespark/spark-executor:v2.1.0-kubernetes-0.2.0 \
--conf spark.kubernetes.initcontainer.docker.image=kubespark/spark-init:v2.1.0-kubernetes-0.2.0 \
--conf spark.kubernetes.driver.docker.image=kubespark/spark-driver:v2.2.0-kubernetes-0.3.0 \
--conf spark.kubernetes.executor.docker.image=kubespark/spark-executor:v2.2.0-kubernetes-0.3.0 \
--conf spark.kubernetes.initcontainer.docker.image=kubespark/spark-init:v2.2.0-kubernetes-0.3.0 \
local:///opt/spark/examples/jars/spark_examples_2.11-2.2.0.jar

The Spark master, specified either via passing the `--master` command line argument to `spark-submit` or by setting
@@ -107,6 +109,18 @@ Finally, notice that in the above example we specify a jar with a specific URI w
the location of the example jar that is already in the Docker image. Using dependencies that are on your machine's local
disk is discussed below.

When Kubernetes [RBAC](https://kubernetes.io/docs/admin/authorization/rbac/) is enabled,
the `default` service account used by the driver may not have appropriate pod `edit` permissions
for launching executor pods. We recommend adding another service account, say `spark`, with
the necessary privileges. For example:

kubectl create serviceaccount spark
kubectl create clusterrolebinding spark-edit --clusterrole edit \
--serviceaccount default:spark --namespace default

With this, one can add `--conf spark.kubernetes.authenticate.driver.serviceAccountName=spark` to
the spark-submit command line above to specify the service account to use.
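Before submitting with the new account, it can be worth confirming that the binding took effect. One way is `kubectl auth can-i` with service-account impersonation (a sketch; it assumes the `spark` service account and the `spark-edit` clusterrolebinding were created exactly as above):

```shell
# Ask the API server whether the spark service account may create
# and patch pods in the default namespace.
kubectl auth can-i create pods --namespace default \
  --as system:serviceaccount:default:spark
kubectl auth can-i patch pods --namespace default \
  --as system:serviceaccount:default:spark
# Both commands should print "yes" once the binding is in place.
```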

## Dependency Management

Application dependencies that are being submitted from your machine need to be sent to a **resource staging server**
@@ -129,9 +143,9 @@ and then you can compute the value of Pi as follows:
--kubernetes-namespace default \
--conf spark.executor.instances=5 \
--conf spark.app.name=spark-pi \
--conf spark.kubernetes.driver.docker.image=kubespark/spark-driver:v2.1.0-kubernetes-0.2.0 \
--conf spark.kubernetes.executor.docker.image=kubespark/spark-executor:v2.1.0-kubernetes-0.2.0 \
--conf spark.kubernetes.initcontainer.docker.image=kubespark/spark-init:v2.1.0-kubernetes-0.2.0 \
--conf spark.kubernetes.driver.docker.image=kubespark/spark-driver:v2.2.0-kubernetes-0.3.0 \
--conf spark.kubernetes.executor.docker.image=kubespark/spark-executor:v2.2.0-kubernetes-0.3.0 \
--conf spark.kubernetes.initcontainer.docker.image=kubespark/spark-init:v2.2.0-kubernetes-0.3.0 \
--conf spark.kubernetes.resourceStagingServer.uri=http://<address-of-any-cluster-node>:31000 \
examples/jars/spark_examples_2.11-2.2.0.jar

@@ -170,9 +184,9 @@ If our local proxy were listening on port 8001, we would have our submission loo
--kubernetes-namespace default \
--conf spark.executor.instances=5 \
--conf spark.app.name=spark-pi \
--conf spark.kubernetes.driver.docker.image=kubespark/spark-driver:v2.1.0-kubernetes-0.2.0 \
--conf spark.kubernetes.executor.docker.image=kubespark/spark-executor:v2.1.0-kubernetes-0.2.0 \
--conf spark.kubernetes.initcontainer.docker.image=kubespark/spark-init:v2.1.0-kubernetes-0.2.0 \
--conf spark.kubernetes.driver.docker.image=kubespark/spark-driver:v2.2.0-kubernetes-0.3.0 \
--conf spark.kubernetes.executor.docker.image=kubespark/spark-executor:v2.2.0-kubernetes-0.3.0 \
--conf spark.kubernetes.initcontainer.docker.image=kubespark/spark-init:v2.2.0-kubernetes-0.3.0 \
local:///opt/spark/examples/jars/spark_examples_2.11-2.2.0.jar

Communication between Spark and Kubernetes clusters is performed using the fabric8 kubernetes-client library.
@@ -220,7 +234,7 @@ service because there may be multiple shuffle service instances running in a clu
a way to target a particular shuffle service.

For example, if the shuffle service we want to use is in the default namespace, and
has pods with labels `app=spark-shuffle-service` and `spark-version=2.1.0`, we can
has pods with labels `app=spark-shuffle-service` and `spark-version=2.2.0`, we can
use those tags to target that particular shuffle service at job launch time. In order to run a job with dynamic allocation enabled,
the command may then look like the following:

@@ -235,7 +249,7 @@ the command may then look like the following:
--conf spark.dynamicAllocation.enabled=true \
--conf spark.shuffle.service.enabled=true \
--conf spark.kubernetes.shuffle.namespace=default \
--conf spark.kubernetes.shuffle.labels="app=spark-shuffle-service,spark-version=2.1.0" \
--conf spark.kubernetes.shuffle.labels="app=spark-shuffle-service,spark-version=2.2.0" \
local:///opt/spark/examples/jars/spark_examples_2.11-2.2.0.jar 10 400000 2

## Advanced
@@ -312,9 +326,9 @@ communicate with the resource staging server over TLS. The trustStore can be set
--kubernetes-namespace default \
--conf spark.executor.instances=5 \
--conf spark.app.name=spark-pi \
--conf spark.kubernetes.driver.docker.image=kubespark/spark-driver:v2.1.0-kubernetes-0.2.0 \
--conf spark.kubernetes.executor.docker.image=kubespark/spark-executor:v2.1.0-kubernetes-0.2.0 \
--conf spark.kubernetes.initcontainer.docker.image=kubespark/spark-init:v2.1.0-kubernetes-0.2.0 \
--conf spark.kubernetes.driver.docker.image=kubespark/spark-driver:v2.2.0-kubernetes-0.3.0 \
--conf spark.kubernetes.executor.docker.image=kubespark/spark-executor:v2.2.0-kubernetes-0.3.0 \
--conf spark.kubernetes.initcontainer.docker.image=kubespark/spark-init:v2.2.0-kubernetes-0.3.0 \
--conf spark.kubernetes.resourceStagingServer.uri=https://<address-of-any-cluster-node>:31000 \
--conf spark.ssl.kubernetes.resourceStagingServer.enabled=true \
--conf spark.ssl.kubernetes.resourceStagingServer.clientCertPem=/home/myuser/cert.pem \
@@ -768,6 +782,22 @@ from the other deployment modes. See the [configuration page](configuration.html
<code>myIdentifier</code>. Multiple node selector keys can be added by setting multiple configurations with this prefix.
</td>
</tr>
<tr>
<td><code>spark.executorEnv.[EnvironmentVariableName]</code></td>
<td>(none)</td>
<td>
Add the environment variable specified by <code>EnvironmentVariableName</code> to
the Executor process. The user can specify multiple of these to set multiple environment variables.
</td>
</tr>
<tr>
<td><code>spark.kubernetes.driverEnv.[EnvironmentVariableName]</code></td>
<td>(none)</td>
<td>
Add the environment variable specified by <code>EnvironmentVariableName</code> to
the Driver process. The user can specify multiple of these to set multiple environment variables.
</td>
</tr>
</table>
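As a worked example of the two environment-variable settings documented in the table above, a submission could set a hypothetical `LOG_LEVEL` variable on both the driver and the executors (the master URL, namespace, and example jar follow the earlier examples on this page):

```shell
# Hypothetical: propagate LOG_LEVEL=DEBUG to the driver pod and to
# every executor pod via the two configuration prefixes.
bin/spark-submit \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  --master k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port> \
  --kubernetes-namespace default \
  --conf spark.kubernetes.driverEnv.LOG_LEVEL=DEBUG \
  --conf spark.executorEnv.LOG_LEVEL=DEBUG \
  local:///opt/spark/examples/jars/spark_examples_2.11-2.2.0.jar
```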


2 changes: 1 addition & 1 deletion examples/pom.xml
@@ -21,7 +21,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion external/docker-integration-tests/pom.xml
@@ -22,7 +22,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion external/flume-assembly/pom.xml
@@ -21,7 +21,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion external/flume-sink/pom.xml
@@ -21,7 +21,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion external/flume/pom.xml
@@ -21,7 +21,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion external/java8-tests/pom.xml
@@ -20,7 +20,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion external/kafka-0-10-assembly/pom.xml
@@ -21,7 +21,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion external/kafka-0-10-sql/pom.xml
@@ -21,7 +21,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion external/kafka-0-10/pom.xml
@@ -21,7 +21,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion external/kafka-0-8-assembly/pom.xml
@@ -21,7 +21,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion external/kafka-0-8/pom.xml
@@ -21,7 +21,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion external/kinesis-asl-assembly/pom.xml
@@ -21,7 +21,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

2 changes: 1 addition & 1 deletion external/kinesis-asl/pom.xml
@@ -20,7 +20,7 @@
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<version>2.2.0-k8s-0.3.0-SNAPSHOT</version>
<version>2.2.0-k8s-0.4.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>
