[SPARK-53874] SparkAppDriverConf should respect sparkVersion of SparkApplication CRD
### What changes were proposed in this pull request?
This PR aims to fix `SparkAppDriverConf` to respect `sparkVersion` of `SparkApplication` CRD.
### Why are the changes needed?
This is a long-standing bug from the initial implementation.
- apache#10
Since Apache Spark K8s Operator can launch various Spark versions, `spark-version` label should come from `SparkApplication` CRD's `sparkVersion` field.
However, currently, the Spark version of the compile-time dependency is used for `Driver` resources (like `Driver Pod` and `Driver Service`). We should override this.
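The intended behavior can be sketched as follows. This is a minimal, hypothetical illustration (the method and class names below are assumptions, not the actual operator code): the `spark-version` label should be resolved from the CRD's `sparkVersion` field when it is set, instead of always falling back to the compiled-in Spark version.

```java
// Hypothetical sketch of the label-resolution logic described in this PR.
// Names (SparkVersionLabelResolver, resolve) are illustrative, not the
// actual SparkAppDriverConf implementation.
public class SparkVersionLabelResolver {

  /**
   * Returns the value for the "spark-version" label on Driver resources.
   *
   * @param crdSparkVersion the sparkVersion field from the SparkApplication
   *                        CRD spec; may be null or empty if unset
   * @param compiledVersion the Spark version of the operator's compile-time
   *                        dependency, used only as a fallback
   */
  public static String resolve(String crdSparkVersion, String compiledVersion) {
    if (crdSparkVersion != null && !crdSparkVersion.isEmpty()) {
      // Respect the version declared by the user in the CRD.
      return crdSparkVersion;
    }
    return compiledVersion;
  }
}
```

With this logic, a `SparkApplication` declaring `sparkVersion: 3.5.1` would get `spark-version=3.5.1` on its Driver Pod and Driver Service even if the operator itself was built against a different Spark version.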
### Does this PR introduce _any_ user-facing change?
Yes, this is a bug fix to use the correct version information.
### How was this patch tested?
Pass the CIs.
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes apache#385 from dongjoon-hyun/SPARK-53874.
Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>