Update GitHub Actions to run Python 3.13 Tests #35056
Conversation
Codecov Report: ❌ Patch coverage is

Additional details and impacted files:

```
@@            Coverage Diff             @@
##             master   #35056    +/-   ##
============================================
- Coverage     40.20%   40.20%   -0.01%
  Complexity     3386     3386
============================================
  Files          1220     1220
  Lines        186149   186175     +26
  Branches       3523     3523
============================================
+ Hits          74839    74844      +5
- Misses       107955   107976     +21
  Partials       3355     3355
```

Flags with carried forward coverage won't be shown.
This pull request has been marked as stale due to 60 days of inactivity. It will be closed in 1 week if no further activity occurs. If you think that's incorrect or this pull request requires a review, please simply write any comment. If closed, you can revive the PR at any time and @mention a reviewer or discuss it on the [email protected] list. Thank you for your contributions.
The only current hangup is that I can't get a clean precommit run off the branch, but the test failures aren't consistent, so it's hard to tell whether we're really getting breakages or just flakes.
damccorm left a comment
Thanks - just had a few minor comments
sdks/python/apache_beam/runners/interactive/testing/integration/tests/screen_diff_test.py
```diff
 # (TODO): https://github.com/apache/beam/issues/21971
 # Add python 3.10 to dataflow test-suites
-dataflow_precommit_it_task_py_versions=3.9,3.12
+dataflow_precommit_it_task_py_versions=3.9,3.13
```
Could be a follow up (non-blocking here) - it would be nice to refactor 3.9 and 3.13 into max_py_version and min_py_version variables
Agreed
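If that follow-up lands, one possible shape is a sketch like the following (hypothetical property names; `dataflow_min_py_version` and `dataflow_max_py_version` are illustrative, not existing properties, and since gradle.properties itself cannot interpolate values, the composed `3.9,3.13` string would be built in the Gradle scripts that read these properties):

```properties
# Hypothetical refactor: declare the supported version bounds once...
dataflow_min_py_version=3.9
dataflow_max_py_version=3.13
# ...so bumping the max version is a one-line change instead of an
# edit to every task that lists the version pair explicitly.
```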
sdks/python/setup.py
```diff
     'torch',
     'transformers',
 ],
+'p313_ml_test': [
```
Rather than adding a p3XX_ml_test version for every supported python version, could we rename p312_ml_test to something like ml_basic_test? I think ml_test is just this + dill and tensorflow-transform FWIW
I know this is technically breaking, but I would be surprised if anyone is depending on this directly (and we could call this out in CHANGES)
Kind of depends on how we want to go about it, since the datatables dependency does work in 3.12 but does not support 3.13. I'm not sure how many test cases that potentially takes off of 3.12 runs, but I am not opposed to it.
Ok, what if we do something in the middle? We could define an `ml_base` dependency like we do with some other deps (e.g. `dataframe_dependency = [` at line 159 in be11a3e).
Then we can have:

```python
'ml_test': [
    'datatable',
    'dill',
    'tensorflow-transformers',
] + ml_base,
'p312_ml_test': [
    'datatable',
] + ml_base,
'p313_ml_test': ml_base,
```

Thoughts?
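Putting the suggestion above into a self-contained form, the extras composition would look roughly like this (a sketch; the package lists are illustrative, taken from the comments in this thread rather than from Beam's actual `setup.py`):

```python
# Shared ML test dependencies that install on every supported Python.
# The concrete package names here are placeholders for illustration.
ml_base = [
    'torch',
    'transformers',
]

extras_require = {
    # Full suite: base deps plus packages that only work on older Pythons.
    'ml_test': ['datatable', 'dill', 'tensorflow-transformers'] + ml_base,
    # Python 3.12: datatable works, but not the full dependency set.
    'p312_ml_test': ['datatable'] + ml_base,
    # Python 3.13: only the shared base until the rest gain support.
    'p313_ml_test': ml_base,
}

print(extras_require['p313_ml_test'])  # → ['torch', 'transformers']
```

The upside of this layout is that adding support for a new Python version only requires a new one-line entry, and the shared list stays in one place.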
Took a run at it in the latest version.
Clean precommit run - https://github.com/apache/beam/actions/runs/18200511639/job/51817950160
damccorm left a comment
LGTM once checks pass
Checks are failing. Will not request review until checks are succeeding. If you'd like to override that behavior, comment
apache_beam.ml.rag.enrichment.milvus_search_it_test.TestMilvusSearchEnrichment seems to be consistently red on python 3.12.
Remaining test failures look like #36377
Sounds good, thank you.
```diff
 uses: ./.github/actions/setup-environment-action
 with:
-  python-version: 3.12
+  python-version: 3.13
```
Tests are failing due to `sh: 1: python3.12: not found` after this change.
Fixed by #36389.
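For context, the step being changed has roughly this shape in the workflow file (a sketch reconstructed from the diff above; the step `name` and surrounding keys are assumptions, not copied from the actual workflow):

```yaml
- name: Setup environment
  uses: ./.github/actions/setup-environment-action
  with:
    python-version: 3.13
```

Any task that shells out to a pinned `pythonX.Y` binary has to move in lockstep with this input; otherwise the interpreter the task expects won't be installed on the runner, which is exactly the `python3.12: not found` failure mode seen here.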
Updates workflows to publish 3.13 containers and execute test suites against Python 3.13.
Closes #34869
Thank you for your contribution! Follow this checklist to help us incorporate your contribution quickly and easily:

- Mention the appropriate issue in your description (for example: `addresses #123`), if applicable. This will automatically add a link to the pull request in the issue. If you would like the issue to automatically close on merging the pull request, comment `fixes #<ISSUE NUMBER>` instead.
- Update `CHANGES.md` with noteworthy changes.

See the Contributor Guide for more tips on how to make the review process smoother.
To check the build health, please visit https://github.com/apache/beam/blob/master/.test-infra/BUILD_STATUS.md
GitHub Actions Tests Status (on master branch)
See CI.md for more information about GitHub Actions CI or the workflows README to see a list of phrases to trigger workflows.