1 file changed, 3 insertions(+), 4 deletions(-)

@@ -5,7 +5,7 @@ title: Running Spark on Kubernetes
* This will become a table of contents (this text will be scraped).
{:toc}

- Spark can run on clusters managed by [Kubernetes](https://kubernetes.io). This feature makes use of the new experimental native
+ Spark can run on clusters managed by [Kubernetes](https://kubernetes.io). This feature makes use of the native
Kubernetes scheduler that has been added to Spark.

# Prerequisites
@@ -71,15 +71,14 @@ To launch Spark Pi in cluster mode,
{% highlight bash %}
$ bin/spark-submit \
+ --master k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port> \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
- --master k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port> \
- --conf spark.kubernetes.namespace=default \
  --conf spark.executor.instances=5 \
  --conf spark.app.name=spark-pi \
  --conf spark.kubernetes.driver.docker.image=<driver-image> \
  --conf spark.kubernetes.executor.docker.image=<executor-image> \
- local:///opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar
+ local:///path/to/examples.jar
{% endhighlight %}
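The `<k8s-apiserver-host>` and `<k8s-apiserver-port>` placeholders are the host and port of the Kubernetes API server for the target cluster. As a rough illustration (assuming `kubectl` is already configured against that cluster), the address can be looked up with `kubectl cluster-info`:

{% highlight bash %}
# Print cluster endpoint information; the first line shows the API server URL,
# e.g. https://192.168.99.100:8443 (a typical minikube address), which would be
# passed to spark-submit as --master k8s://https://192.168.99.100:8443.
$ kubectl cluster-info
{% endhighlight %}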
The Spark master, specified either via passing the `--master` command line argument to `spark-submit` or by setting