Commit 8acbd77

[SPARK-52253] Support both v1alpha1 and v1beta1
### What changes were proposed in this pull request?

This PR aims to support both `v1alpha1` and `v1beta1`.

- Both `v1alpha1` and `v1beta1` will be accepted, and objects will be stored as `v1beta1`.
- Since the CRD generator always writes a single version, we can no longer use it directly. We must keep all previous versions in the CRD files.
- It is also required to protect these CRDs from accidental and breaking changes now that they are `Beta`.

The added CRD files look like the following. Please note that `v1alpha1` has `storage: false` and `v1beta1` has `storage: true`.

```yaml
# ... APACHE HEADER ...
apiVersion: apiextensions.k8s.io/v1
kind: CustomResourceDefinition
metadata:
  name: sparkapplications.spark.apache.org
spec:
  group: spark.apache.org
  names:
    kind: SparkApplication
    plural: sparkapplications
    shortNames:
    - sparkapp
    singular: sparkapplication
  scope: Namespaced
  versions:
  - name: v1alpha1
    storage: false
    ...
  - name: v1beta1
    storage: true
    ...
```

### Why are the changes needed?

- To support a smooth migration.
- To be clear, there is no schema change from 0.2.0 to 0.3.0 so far; only the versions have changed.

### Does this PR introduce _any_ user-facing change?

No behavior change.

### How was this patch tested?

Pass the CIs. After manual installation:

```
$ kubectl get crds sparkapplications.spark.apache.org -oyaml | yq '.spec.versions[].name'
v1alpha1
v1beta1

$ kubectl get crds sparkapplications.spark.apache.org -oyaml | yq .status.storedVersions
- v1beta1
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #220 from dongjoon-hyun/SPARK-52253.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
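As a sketch of the migration path this dual-version CRD enables (the resource name and namespace below are hypothetical, not from this commit), a manifest written against the legacy `v1alpha1` API can still be applied; because the CRD marks `v1beta1` with `storage: true`, the API server persists the object under `v1beta1`:

```yaml
# Hypothetical manifest: submitted with the legacy v1alpha1 apiVersion.
# The API server accepts it (v1alpha1 is still served) and stores it
# as v1beta1, since that is the CRD's storage version.
apiVersion: spark.apache.org/v1alpha1
kind: SparkApplication
metadata:
  name: example-app        # hypothetical name
  namespace: default       # hypothetical namespace
spec:
  # ... application spec; unchanged between versions, since 0.2.0 -> 0.3.0
  # introduces no schema change, only a new version name.
```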
1 parent add99bb commit 8acbd77

File tree

6 files changed (+31989 −6 lines)


.github/workflows/build_and_test.yml

Lines changed: 0 additions & 2 deletions
```diff
@@ -118,7 +118,6 @@ jobs:
         run: |
           eval $(minikube docker-env)
           ./gradlew buildDockerImage
-          ./gradlew spark-operator-api:relocateGeneratedCRD
           helm install spark-kubernetes-operator --create-namespace -f build-tools/helm/spark-kubernetes-operator/values.yaml build-tools/helm/spark-kubernetes-operator/
           helm test spark-kubernetes-operator
           # Use remote host's docker image
@@ -132,7 +131,6 @@ jobs:
         run: |
           eval $(minikube docker-env)
           ./gradlew buildDockerImage
-          ./gradlew spark-operator-api:relocateGeneratedCRD
           helm install spark-kubernetes-operator --create-namespace -f \
             build-tools/helm/spark-kubernetes-operator/values.yaml -f \
             tests/e2e/helm/dynamic-config-values.yaml \
```

.github/workflows/publish_snapshot_chart.yml

Lines changed: 1 addition & 1 deletion
```diff
@@ -34,7 +34,7 @@ jobs:
           cache: 'gradle'
       - name: Build Operator
         run: |
-          ./gradlew build spark-operator-api:relocateGeneratedCRD -x check --no-daemon
+          ./gradlew build -x check --no-daemon
       - name: Build Chart
         env:
           DIR: 'charts'
```

.gitignore

Lines changed: 0 additions & 1 deletion
```diff
@@ -6,7 +6,6 @@
 .vscode
 /lib/
 target/
-build-tools/helm/spark-kubernetes-operator/crds

 # Gradle Files #
 ################
```

README.md

Lines changed: 0 additions & 2 deletions
````diff
@@ -45,8 +45,6 @@ $ ./gradlew buildDockerImage
 ## Install Helm Chart

 ```bash
-$ ./gradlew spark-operator-api:relocateGeneratedCRD
-
 $ helm install spark -f build-tools/helm/spark-kubernetes-operator/values.yaml build-tools/helm/spark-kubernetes-operator/
 ```
````
