Enable authZ compile support for Spark 4.0 and refactor some test methods #7256
Closed
pan3793 wants to merge 2 commits into apache:master from pan3793/authz-refactor
Conversation
pan3793
commented
Nov 24, 2025
```
  protected val sql: String => DataFrame = spark.sql

- protected def doAs[T](user: String, f: => T): T = {
+ protected def doAs[T](user: String, f: => T, unused: String = ""): T = {
```
Member
Author
To keep both overloads, I have to add a dummy parameter; otherwise, the compiler complains:
```
[ERROR] [Error] /Users/chengpan/Projects/apache-kyuubi/extensions/spark/kyuubi-spark-authz/src/test/scala/org/apache/kyuubi/plugin/spark/authz/SparkSessionProvider.scala:94: double definition:
protected def doAs[T](user: String, f: => T): T at line 87 and
protected def doAs[T](user: String)(f: => T): T at line 94
have same type after erasure: (user: String, f: Function0): Object
```
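A standalone sketch (hypothetical names, not the Kyuubi code) of why the two signatures collide: after erasure, a by-name parameter becomes `Function0` whether it sits in the same parameter list or a curried one, so both overloads erase to `(String, Function0): Object`. The defaulted dummy parameter gives the non-curried overload a distinct erased signature:

```scala
object ErasureDemo {
  // These two alone would be a "double definition" after erasure:
  //   def doAs[T](user: String, f: => T): T
  //   def doAs[T](user: String)(f: => T): T
  // Both erase to (String, Function0): Object.

  // Workaround: the defaulted dummy parameter changes the non-curried
  // overload's erased signature to (String, Function0, String): Object.
  def doAs[T](user: String, f: => T, unused: String = ""): T = f

  def doAs[T](user: String)(f: => T): T = f

  def main(args: Array[String]): Unit = {
    assert(doAs("admin", 1 + 1) == 2)    // existing non-curried calls still compile
    assert(doAs("admin") { 2 + 2 } == 4) // curried call enables block syntax
  }
}
```

Here `doAs` just evaluates its body; the real helper would run it as the given user.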
pan3793
commented
Nov 24, 2025
Comment on lines -99 to +109
```
- case (t, "table") => doAs(
-   admin, {
-     val purgeOption =
-       if (isCatalogSupportPurge(
-           spark.sessionState.catalogManager.currentCatalog.name())) {
-         "PURGE"
-       } else ""
-     sql(s"DROP TABLE IF EXISTS $t $purgeOption")
-   })
+ case (t, "table") => doAs(admin) {
+   val purgeOption = if (supportPurge) "PURGE" else ""
+   sql(s"DROP TABLE IF EXISTS $t $purgeOption")
+ }
```
Member
Author
`def doAs[T](user: String)(f: => T)` gives a prettier call-site format here.
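A minimal sketch (hypothetical `DoAsDemo` object, not the Kyuubi helper) of why the curried signature reads better: the second parameter list can be passed as a trailing block, so the call site looks like a built-in control structure.

```scala
object DoAsDemo {
  // Curried by-name parameter; the real helper would execute `f` as `user`,
  // this sketch just evaluates the block.
  def doAs[T](user: String)(f: => T): T = f

  def main(args: Array[String]): Unit = {
    // Call site reads like a control structure rather than a two-argument call:
    val n = doAs("admin") {
      1 + 1
    }
    assert(n == 2)
  }
}
```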
cxzl25
approved these changes
Nov 24, 2025
bowenliang123
approved these changes
Nov 24, 2025
Codecov Report ❌ Patch coverage is

Additional details and impacted files

```
@@ Coverage Diff @@
##           master   #7256   +/-   ##
======================================
  Coverage    0.00%   0.00%
======================================
  Files         696     696
  Lines       43530   43530
  Branches     5883    5883
======================================
  Misses      43530   43530
```
Member
Author
Thanks, merged to master/1.10
pan3793 added a commit that referenced this pull request on Nov 24, 2025
…r some test methods
This PR enables authZ compile support for Spark 4.0. Previously, the following compile command failed:
```
build/mvn -Pspark-4.0 -Pscala-2.13 -pl extensions/spark/kyuubi-spark-authz -am install -DskipTests
```
```
[ERROR] [Error] /Users/chengpan/Projects/apache-kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/rule/rowfilter/FilterDataSourceV2Strategy.scala:19: object Strategy is not a member of package org.apache.spark.sql
[ERROR] [Error] /Users/chengpan/Projects/apache-kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/rule/rowfilter/FilterDataSourceV2Strategy.scala:23: not found: type Strategy
[ERROR] [Error] /Users/chengpan/Projects/apache-kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RangerSparkExtension.scala:58: type mismatch;
found : org.apache.kyuubi.plugin.spark.authz.rule.rowfilter.FilterDataSourceV2Strategy.type
required: v1.StrategyBuilder
(which expands to) org.apache.spark.sql.SparkSession => org.apache.spark.sql.execution.SparkStrategy
[ERROR] three errors found
```
In addition, it refactors two methods in the test helper class `SparkSessionProvider`:
1. Refactor `isCatalogSupportPurge` to an abstract method `supportPurge`, because some UTs do not rely on the current catalog.
2. Add a new helper method `def doAs[T](user: String)(f: => T): T`, so the caller can write:
```
doAs("someone") {
...
}
```
Pass GHA to ensure it breaks nothing; manually tested the Spark 4.0 compile:
```
build/mvn -Pspark-4.0 -Pscala-2.13 -pl extensions/spark/kyuubi-spark-authz -am install -DskipTests
```
No.
Closes #7256 from pan3793/authz-refactor.
Closes #7256
b84cec8 [Cheng Pan] add missing override
ede364f [Cheng Pan] Enable authZ compile support for Spark 4.0 and refactor some test methods
Authored-by: Cheng Pan <chengpan@apache.org>
Signed-off-by: Cheng Pan <chengpan@apache.org>
(cherry picked from commit 8a67796)
Signed-off-by: Cheng Pan <chengpan@apache.org>
Why are the changes needed?

This PR enables authZ compile support for Spark 4.0.

In addition, it refactors two methods in the test helper class `SparkSessionProvider`:
1. Refactor `isCatalogSupportPurge` to an abstract method `supportPurge`, because some UTs do not rely on the current catalog.
2. Add a new helper method `def doAs[T](user: String)(f: => T): T`, so the caller can use the `doAs("someone") { ... }` form.

How was this patch tested?

Pass GHA to ensure it breaks nothing; manually tested the Spark 4.0 compile.

Was this patch authored or co-authored using generative AI tooling?

No.