This repository was archived by the owner on Jan 9, 2020. It is now read-only.

Commit f414355

foxishash211 authored and committed

Add a section for prerequisites (#171)

* Adding prerequisites
* address comments

1 parent 015f18d · commit f414355

File tree

1 file changed: +8 additions, −2 deletions


docs/running-on-kubernetes.md

Lines changed: 8 additions & 2 deletions
@@ -3,9 +3,15 @@ layout: global
 title: Running Spark on Kubernetes
 ---

-Support for running on [Kubernetes](https://kubernetes.io/) is available in experimental status. The feature set is
+Support for running on [Kubernetes](https://kubernetes.io/docs/whatisk8s/) is available in experimental status. The feature set is
 currently limited and not well-tested. This should not be used in production environments.

+## Prerequisites
+
+* You must have a running Kubernetes cluster with access to it configured using [kubectl](https://kubernetes.io/docs/user-guide/prereqs/). If you do not already have a working Kubernetes cluster, you may set up a test cluster on your local machine using [minikube](https://kubernetes.io/docs/getting-started-guides/minikube/).
+* You must have appropriate permissions to create and list [pods](https://kubernetes.io/docs/user-guide/pods/), [nodes](https://kubernetes.io/docs/admin/node/) and [services](https://kubernetes.io/docs/user-guide/services/) in your cluster. You can verify that you can list these resources by running `kubectl get nodes`, `kubectl get pods` and `kubectl get svc`, which should list the nodes, pods and services (if any), respectively.
+* You must have an extracted Spark distribution with Kubernetes support, or build one from [source](https://github.com/apache-spark-on-k8s/spark).
+
 ## Setting Up Docker Images

 Kubernetes requires users to supply images that can be deployed into containers within pods. The images are built to
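
As a quick sanity check of the prerequisites added in the hunk above, the commands below can be run against a test cluster. This is a minimal sketch: `minikube start` assumes a local minikube installation, and the exact output of the `kubectl get` commands will differ per cluster.

```bash
# Optional: stand up a local test cluster with minikube
# (assumes minikube is installed; skip if you already have a cluster).
minikube start

# Verify that kubectl is configured for the cluster and that you can
# list the resources named in the prerequisites: nodes, pods, services.
kubectl get nodes
kubectl get pods
kubectl get svc
```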
@@ -81,7 +87,7 @@ the driver container as a [secret volume](https://kubernetes.io/docs/user-guide/
 ### Kubernetes Clusters and the authenticated proxy endpoint

 Spark-submit also supports submission through the
-[local kubectl proxy](https://kubernetes.io/docs/user-guide/connecting-to-applications-proxy/). One can use the
+[local kubectl proxy](https://kubernetes.io/docs/user-guide/accessing-the-cluster/#using-kubectl-proxy). One can use the
 authenticating proxy to communicate with the api server directly without passing credentials to spark-submit.

 The local proxy can be started by running:
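
The command following that line falls outside this diff's context window, but it is presumably the standard kubectl proxy invocation. A minimal sketch, assuming the proxy's default listen address of `127.0.0.1:8001`; the `k8s://` master URL in the comment is an assumption about how spark-submit in this fork would target the proxy, not something confirmed by the diff:

```bash
# Start the local authenticated proxy to the Kubernetes API server.
# By default it listens on 127.0.0.1:8001.
kubectl proxy

# spark-submit could then be pointed at the proxy endpoint instead of
# the API server directly, e.g. (assumed master URL format):
#   --master k8s://http://127.0.0.1:8001
```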

0 commit comments
