Commit d356b17

[SPARK-50380][SQL][TESTS][FOLLOWUP] Enable ANSI for conditional branches with error expression test
### What changes were proposed in this pull request?

This is a follow-up to recover the non-ANSI CI.
- #48918

### Why are the changes needed?

The original PR broke the non-ANSI CI because the test case assumes the ANSI setting.
- https://github.com/apache/spark/actions/runs/11964792566
- https://github.com/apache/spark/actions/runs/11982859814

### Does this PR introduce _any_ user-facing change?

No, this is a test-only change.

### How was this patch tested?

Manual tests.

**BEFORE**
```
$ SPARK_ANSI_SQL_MODE=false build/sbt "catalyst/testOnly *.ReorderAssociativeOperatorSuite -- -z SPARK-50380"
...
[info] *** 1 TEST FAILED ***
[error] Failed tests:
[error] 	org.apache.spark.sql.catalyst.optimizer.ReorderAssociativeOperatorSuite
[error] (catalyst / Test / testOnly) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 8 s, completed Nov 23, 2024, 11:50:45 AM
```

**AFTER**
```
$ SPARK_ANSI_SQL_MODE=false build/sbt "catalyst/testOnly *.ReorderAssociativeOperatorSuite -- -z SPARK-50380"
...
[info] ReorderAssociativeOperatorSuite:
[info] - SPARK-50380: conditional branches with error expression (508 milliseconds)
[info] Run completed in 1 second, 413 milliseconds.
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 1, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[success] Total time: 11 s, completed Nov 23, 2024, 11:51:34 AM
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #48943 from dongjoon-hyun/SPARK-50380.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
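For context, the fix relies on the `withSQLConf` helper that `PlanTest` suites inherit: it sets the given conf entries on the active `SQLConf` for the duration of the block and restores the previous values afterwards, so the test no longer depends on the `SPARK_ANSI_SQL_MODE` environment variable used by the CI matrix. A minimal sketch of the pattern — the suite name and assertion are illustrative, not part of this patch:

```scala
import org.apache.spark.sql.catalyst.plans.PlanTest
import org.apache.spark.sql.internal.SQLConf

// Hypothetical suite, only to illustrate the withSQLConf pattern this PR applies.
class AnsiPinningExampleSuite extends PlanTest {
  test("pin ANSI mode for a single test body") {
    withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
      // Inside the block, the active SQLConf reports ANSI on, regardless of
      // the SPARK_ANSI_SQL_MODE value the JVM was launched with.
      assert(SQLConf.get.ansiEnabled)
    }
    // On exit, withSQLConf restores the entry's previous value.
  }
}
```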
1 parent: 656ece1

1 file changed (+13, -10)
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/optimizer/ReorderAssociativeOperatorSuite.scala

Lines changed: 13 additions & 10 deletions
```diff
@@ -24,6 +24,7 @@ import org.apache.spark.sql.catalyst.expressions.aggregate.Count
 import org.apache.spark.sql.catalyst.plans.{Inner, PlanTest}
 import org.apache.spark.sql.catalyst.plans.logical.{LocalRelation, LogicalPlan}
 import org.apache.spark.sql.catalyst.rules.RuleExecutor
+import org.apache.spark.sql.internal.SQLConf
 
 class ReorderAssociativeOperatorSuite extends PlanTest {
 
@@ -109,15 +110,17 @@ class ReorderAssociativeOperatorSuite extends PlanTest {
   }
 
   test("SPARK-50380: conditional branches with error expression") {
-    val originalQuery1 = testRelation.select(If($"a" === 1, 1L, Literal(1).div(0) + $"b")).analyze
-    val optimized1 = Optimize.execute(originalQuery1)
-    comparePlans(optimized1, originalQuery1)
-
-    val originalQuery2 = testRelation.select(
-      If($"a" === 1, 1, ($"b" + Literal(Int.MaxValue)) + 1).as("col")).analyze
-    val optimized2 = Optimize.execute(originalQuery2)
-    val correctAnswer2 = testRelation.select(
-      If($"a" === 1, 1, $"b" + (Literal(Int.MaxValue) + 1)).as("col")).analyze
-    comparePlans(optimized2, correctAnswer2)
+    withSQLConf(SQLConf.ANSI_ENABLED.key -> true.toString) {
+      val originalQuery1 = testRelation.select(If($"a" === 1, 1L, Literal(1).div(0) + $"b")).analyze
+      val optimized1 = Optimize.execute(originalQuery1)
+      comparePlans(optimized1, originalQuery1)
+
+      val originalQuery2 = testRelation.select(
+        If($"a" === 1, 1, ($"b" + Literal(Int.MaxValue)) + 1).as("col")).analyze
+      val optimized2 = Optimize.execute(originalQuery2)
+      val correctAnswer2 = testRelation.select(
+        If($"a" === 1, 1, $"b" + (Literal(Int.MaxValue) + 1)).as("col")).analyze
+      comparePlans(optimized2, correctAnswer2)
+    }
   }
 }
```
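The test is mode-sensitive because `Literal(1).div(0)` only stays an error expression under ANSI semantics: with ANSI off, integer division by zero folds to `null`, the optimizer is free to simplify the branch, and the first `comparePlans` check no longer holds. A hedged spark-shell sketch of the difference, assuming a local `SparkSession` bound to `spark` (not part of the patch):

```scala
// Assumes a running spark-shell, i.e. a SparkSession available as `spark`.
spark.conf.set("spark.sql.ansi.enabled", "false")
spark.sql("SELECT 1 div 0").show()   // non-ANSI: division by zero yields NULL

spark.conf.set("spark.sql.ansi.enabled", "true")
spark.sql("SELECT 1 div 0").show()   // ANSI: throws a DIVIDE_BY_ZERO arithmetic error
```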
