Commit 9cf0bac
[SPARK-51538] Add `spark-connect-server-with-spark-cluster.yaml` example
### What changes were proposed in this pull request?
This PR aims to add an example that launches a `Spark Connect Server` pod which uses an existing `Spark Cluster`.
### Why are the changes needed?
Since the `Spark Connect Server` pod is launched outside of the `Spark Cluster`, we can expose its service port more easily.
### Does this PR introduce _any_ user-facing change?
No. This is a new example.
### How was this patch tested?
Manual testing.
Launch `Spark K8s Operator`, `Spark Cluster`, and `Spark Connect Server` sequentially.
```
$ helm install spark-kubernetes-operator \
https://nightlies.apache.org/spark/charts/spark-kubernetes-operator-0.1.0-SNAPSHOT.tgz
$ kubectl apply -f examples/prod-cluster-with-three-workers.yaml
$ kubectl apply -f examples/spark-connect-server-with-spark-cluster.yaml
```
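Before moving on, it may help to confirm that everything is running; the expected pod set described in the comment below is illustrative and depends on the example manifests:
```
$ kubectl get pods
# Expect (names are illustrative) the operator pod, the Spark master pod and
# three worker pods from prod-cluster-with-three-workers.yaml, and the
# Spark Connect Server pod, all in Running status.
```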
After exposing the `Spark Master` and `Spark Connect Server` ports (for example, with `kubectl port-forward` as sketched below), connect with the `pyspark` client.
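One way to expose the ports is `kubectl port-forward`; the service names below are placeholders and depend on the names actually created by the example manifests:
```
# Forward the Spark master port and the Spark Connect port to localhost.
# Replace the placeholder service names with the ones created in your cluster.
$ kubectl port-forward svc/<spark-master-service> 7077:7077 &
$ kubectl port-forward svc/<spark-connect-server-service> 15002:15002 &
```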
```
$ bin/pyspark --remote sc://localhost:15002
```
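Once connected, a quick sanity check of the remote session could look like this (illustrative):
```
>>> spark.range(10).count()
10
```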
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes #169 from dongjoon-hyun/SPARK-51538.
Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
1 file changed: `examples/spark-connect-server-with-spark-cluster.yaml` (new file, +30, −0 lines)