Commit 90e4777

[SPARK-53639] Use spark consistently for release-name of Helm installation
### What changes were proposed in this pull request?

This PR aims to use `spark` consistently for the `release-name` of the Helm installation.

### Why are the changes needed?

Although we use `spark` in `README.md` and on the website, as shown below, old legacy names still exist.

https://github.com/apache/spark-kubernetes-operator/blob/cfd5063c595c445eee3d9d94e5dc9d17590ed5d4/README.md?plain=1#L49

We need to unify this. Otherwise, `INSTALLATION FAILED` occurs due to a mismatch of the `meta.helm.sh/release-name` annotation:

```
$ helm install sparkx --create-namespace -f build-tools/helm/spark-kubernetes-operator/values.yaml build-tools/helm/spark-kubernetes-operator/
Error: INSTALLATION FAILED: Unable to continue with install: ServiceAccount "spark-operator" in namespace "default" exists and cannot be imported into the current release: invalid ownership metadata; annotation validation error: key "meta.helm.sh/release-name" must equal "sparkx": current value is "spark"
```

### Does this PR introduce _any_ user-facing change?

No for clean installations. Users with an existing installation need to remove the old Kubernetes objects before installing.

### How was this patch tested?

Pass the CIs.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes apache#323 from dongjoon-hyun/SPARK-53639.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
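For users migrating an existing installation, a minimal sketch of the steps, assuming the old release was installed under the legacy name `spark-kubernetes-operator` (adjust the release name and namespace to match your setup):

```bash
# Remove the release installed under the legacy name, so that its objects
# (e.g. the "spark-operator" ServiceAccount) drop the old
# meta.helm.sh/release-name ownership annotation.
helm uninstall spark-kubernetes-operator

# Reinstall under the unified release name.
helm install spark --create-namespace \
  -f build-tools/helm/spark-kubernetes-operator/values.yaml \
  build-tools/helm/spark-kubernetes-operator/
```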

4 files changed, +6 -6 lines changed


.github/workflows/build_and_test.yml

Lines changed: 3 additions & 3 deletions

```diff
@@ -125,8 +125,8 @@ jobs:
         run: |
           eval $(minikube docker-env)
           ./gradlew buildDockerImage
-          helm install spark-kubernetes-operator --create-namespace -f build-tools/helm/spark-kubernetes-operator/values.yaml build-tools/helm/spark-kubernetes-operator/
-          helm test spark-kubernetes-operator
+          helm install spark --create-namespace -f build-tools/helm/spark-kubernetes-operator/values.yaml build-tools/helm/spark-kubernetes-operator/
+          helm test spark
           # Use remote host's docker image
           minikube docker-env --unset
       - name: Run E2E Test with Dynamic Configuration Disabled
@@ -138,7 +138,7 @@ jobs:
         run: |
           eval $(minikube docker-env)
           ./gradlew buildDockerImage
-          helm install spark-kubernetes-operator --create-namespace -f \
+          helm install spark --create-namespace -f \
             build-tools/helm/spark-kubernetes-operator/values.yaml -f \
             tests/e2e/helm/dynamic-config-values.yaml \
             build-tools/helm/spark-kubernetes-operator/
```

docs/configuration.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -122,7 +122,7 @@ sink.PrometheusPullModelSink
 * Install Spark Operator
 
 ```bash
-helm install spark-kubernetes-operator -f build-tools/helm/spark-kubernetes-operator/values.yaml build-tools/helm/spark-kubernetes-operator/
+helm install spark -f build-tools/helm/spark-kubernetes-operator/values.yaml build-tools/helm/spark-kubernetes-operator/
 ```
 
 * Install Prometheus via Helm Chart
````

docs/operations.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -50,7 +50,7 @@ You can also provide multiple custom values file by using the `-f` flag, the lat
 higher precedence:
 
 ```bash
-helm install spark-kubernetes-operator \
+helm install spark \
   -f build-tools/helm/spark-kubernetes-operator/values.yaml \
   -f my_values.yaml \
   build-tools/helm/spark-kubernetes-operator/
````
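One way to confirm which values won after layering several `-f` files is `helm get values`, a standard Helm command, shown here against the `spark` release name used above:

```bash
# Show the user-supplied values Helm computed for the release,
# i.e. the merge of all -f files, with later files overriding earlier ones.
helm get values spark

# Include the chart's default values in the output as well.
helm get values spark --all
```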

tests/e2e/watched-namespaces/chainsaw-test.yaml

Lines changed: 1 addition & 1 deletion

```diff
@@ -76,7 +76,7 @@ spec:
     - script:
         content: |
           echo "Installing another spark operator in default-2 namespaces, watching on namespace: spark-3"
-          helm install spark-kubernetes-operator -n default-2 --create-namespace -f \
+          helm install spark -n default-2 --create-namespace -f \
             ../../../build-tools/helm/spark-kubernetes-operator/values.yaml -f \
             ../helm/dynamic-config-values-2.yaml \
             ../../../build-tools/helm/spark-kubernetes-operator/
```
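Since Helm release names only need to be unique per namespace, both operators can share the name `spark` after this change. A quick check, using standard Helm commands with the release and namespace names from the test above:

```bash
# List the second operator's release in its own namespace.
helm list -n default-2

# The original release in the default namespace is unaffected.
helm list -n default
```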
