Conversation

@andygrove (Member) commented Dec 23, 2025

Which issue does this PR close?

Workaround for #2965

Rationale for this change

This test is failing most of the time in all PR builds.

What changes are included in this PR?

How are these changes tested?

@andygrove (Member Author) commented:

@manuzhang I think we can just skip the "fuzz" suite and keep the other Spark 4 tests active for macOS
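
For illustration only (this is not the actual change in this PR), here is a minimal sketch of how a matrix `exclude` could skip just the fuzz suite on macOS while the other Spark 4 suites keep running there; the workflow, suite, and build-target names are assumptions:

```yaml
# Hypothetical sketch, not the actual Comet workflow: drop only the
# macOS + fuzz combination from the matrix.
name: spark-4-tests            # placeholder workflow name
on:
  pull_request:
jobs:
  spark-4-tests:
    strategy:
      matrix:
        os: [ubuntu-latest, macos-latest]
        suite: [sql, shuffle, fuzz]      # placeholder suite names
        exclude:
          - os: macos-latest
            suite: fuzz                  # skip only the flaky fuzz suite on macOS
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      - name: Run test suite
        run: make test-${{ matrix.suite }}   # placeholder build target
```

A matrix `exclude` removes only that one os/suite combination, so macOS coverage for the remaining suites is unchanged.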

@codecov-commenter commented Dec 23, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 59.59%. Comparing base (f09f8af) to head (43d928e).
⚠️ Report is 793 commits behind head on main.

Additional details and impacted files
@@             Coverage Diff              @@
##               main    #2966      +/-   ##
============================================
+ Coverage     56.12%   59.59%   +3.47%     
- Complexity      976     1381     +405     
============================================
  Files           119      167      +48     
  Lines         11743    15492    +3749     
  Branches       2251     2568     +317     
============================================
+ Hits           6591     9233    +2642     
- Misses         4012     4959     +947     
- Partials       1140     1300     +160     

☔ View full report in Codecov by Sentry.

@manuzhang (Member) commented:

This failed test looks like a flaky one:

[info] - foreach with error not caused by ForeachWriter *** FAILED *** (10 seconds, 166 milliseconds)
[info]   The code passed to eventually never returned normally. Attempted 649 times over 10.005631396999998 seconds. Last failure message: -1 did not equal 0. (SQLTestUtils.scala:217)
[info]   org.scalatest.exceptions.TestFailedDueToTimeoutException:
[info]   at org.scalatest.enablers.Retrying$$anon$4.tryTryAgain$2(Retrying.scala:219)
[info]   at org.scalatest.enablers.Retrying$$anon$4.retry(Retrying.scala:226)
[info]   at org.scalatest.concurrent.Eventually.eventually(Eventually.scala:348)
[info]   at org.scalatest.concurrent.Eventually.eventually$(Eventually.scala:347)
[info]   at org.apache.spark.sql.execution.streaming.sources.ForeachWriterSuite.eventually(ForeachWriterSuite.scala:33)

@andygrove requested a review from comphead December 23, 2025 15:09
@comphead (Contributor) left a comment:

Thanks @andygrove.
Btw, WDYT about optimizing the test process in the future and running fuzz testing only before a release or when specific files change?

@andygrove (Member Author) commented:

> Thanks @andygrove. Btw, WDYT about optimizing the test process in the future and running fuzz testing only before a release or when specific files change?

That sounds great. Do you have an idea about how we can achieve this?

@andygrove merged commit 97dd7bc into apache:main Dec 23, 2025
233 of 237 checks passed
@andygrove deleted the skip-fuzz-macos-spark4 branch December 23, 2025 16:40
@comphead (Contributor) commented:

> > Thanks @andygrove. Btw, WDYT about optimizing the test process in the future and running fuzz testing only before a release or when specific files change?
>
> That sounds great. Do you have an idea about how we can achieve this?

We can follow the DataFusion example: https://github.com/apache/datafusion/blob/33ac70dd6d634da040cc34abd414425b176a2b99/.github/workflows/extended.yml#L43

There, the extended tests are triggered only when a PR targets a release branch (branch-*) or when it modifies core files under:

      - 'datafusion/physical*/**/*.rs'
      - 'datafusion/expr*/**/*.rs'
      - 'datafusion/optimizer/**/*.rs'

With the recent modularity changes in Comet, could that open the door to this kind of conditional test trigger now?
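
For illustration, a minimal sketch of what such a conditional trigger could look like for Comet; the path filters and the fuzz invocation below are assumptions, not taken from the actual DataFusion or Comet workflows:

```yaml
# Hypothetical sketch: run the extended/fuzz suite only for release branches
# or when files under the assumed core paths change.
name: extended-tests
on:
  push:
    branches:
      - 'branch-*'                      # release branches
  pull_request:
    paths:
      - 'native/**/*.rs'                # assumed native core location
      - 'spark/src/main/**/*.scala'     # assumed Spark integration location
      - '.github/workflows/extended.yml'
jobs:
  fuzz:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run fuzz suite
        run: make test-fuzz              # placeholder build target
```

The key mechanism is the `paths` filter: GitHub only starts the workflow for pull requests that touch a matching file, so routine PRs skip the expensive suite entirely.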
