NO-JIRA: workflows: allow parallelism with concurrency groups by adding Python version matrix in piplock-renewal.yaml
#1714
Conversation
NO-JIRA: workflows: allow parallelism with concurrency groups by adding Python version matrix in `piplock-renewal.yaml`
Warning: Rate limit exceeded. @openshift-ci[bot] has exceeded the limit for the number of commits or files that can be reviewed per hour. Please wait 15 minutes and 11 seconds before requesting another review.
⌛ How to resolve this issue? After the wait time has elapsed, a review can be triggered using the usual review command. We recommend that you space out your commits to avoid hitting the rate limit.
🚦 How do rate limits work? CodeRabbit enforces hourly rate limits for each developer per organization. Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout. Please see our FAQ for further information.
Walkthrough: The concurrency group configuration in the `piplock-renewal.yaml` workflow is updated so that the newly added Python version matrix jobs can run in parallel instead of cancelling one another.
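A minimal sketch of the kind of configuration this change describes, keying the concurrency group on the matrix Python version so runs for different versions no longer cancel each other. Job name, trigger, Python versions, and action versions below are assumptions for illustration, not the contents of the actual `piplock-renewal.yaml`:

```yaml
# Illustrative excerpt only – not the real .github/workflows/piplock-renewal.yaml
name: Pipfile.lock renewal

on:
  workflow_dispatch:  # assumed trigger

jobs:
  renew:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.11", "3.12"]  # assumed matrix values
    # Including the matrix value in the group lets the per-version jobs run in
    # parallel, while a re-run for the same version still cancels its predecessor.
    concurrency:
      group: piplock-renewal-${{ github.ref }}-${{ matrix.python-version }}
      cancel-in-progress: true
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
```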
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~7 minutes
[APPROVALNOTIFIER] This PR is NOT APPROVED.
This pull-request has been approved by:
The full list of commands accepted by this bot can be found here.
Needs approval from an approver in each of these files:
Approvers can indicate their approval by writing `/approve` in a comment.
/hold have to undo the actual piplock renewal
Actionable comments posted: 1
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (12)
- `codeserver/ubi9-python-3.12/Pipfile.lock` is excluded by `!**/*.lock`
- `jupyter/datascience/ubi9-python-3.12/Pipfile.lock` is excluded by `!**/*.lock`
- `jupyter/minimal/ubi9-python-3.12/Pipfile.lock` is excluded by `!**/*.lock`
- `jupyter/pytorch/ubi9-python-3.12/Pipfile.lock` is excluded by `!**/*.lock`
- `jupyter/rocm/pytorch/ubi9-python-3.12/Pipfile.lock` is excluded by `!**/*.lock`
- `jupyter/tensorflow/ubi9-python-3.12/Pipfile.lock` is excluded by `!**/*.lock`
- `jupyter/trustyai/ubi9-python-3.12/Pipfile.lock` is excluded by `!**/*.lock`
- `runtimes/datascience/ubi9-python-3.12/Pipfile.lock` is excluded by `!**/*.lock`
- `runtimes/minimal/ubi9-python-3.12/Pipfile.lock` is excluded by `!**/*.lock`
- `runtimes/pytorch/ubi9-python-3.12/Pipfile.lock` is excluded by `!**/*.lock`
- `runtimes/rocm-pytorch/ubi9-python-3.12/Pipfile.lock` is excluded by `!**/*.lock`
- `runtimes/tensorflow/ubi9-python-3.12/Pipfile.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (12)
- `codeserver/ubi9-python-3.12/requirements.txt` (1 hunk)
- `jupyter/datascience/ubi9-python-3.12/requirements.txt` (2 hunks)
- `jupyter/minimal/ubi9-python-3.12/requirements.txt` (2 hunks)
- `jupyter/pytorch/ubi9-python-3.12/requirements.txt` (2 hunks)
- `jupyter/rocm/pytorch/ubi9-python-3.12/requirements.txt` (2 hunks)
- `jupyter/tensorflow/ubi9-python-3.12/requirements.txt` (2 hunks)
- `jupyter/trustyai/ubi9-python-3.12/requirements.txt` (3 hunks)
- `runtimes/datascience/ubi9-python-3.12/requirements.txt` (2 hunks)
- `runtimes/minimal/ubi9-python-3.12/requirements.txt` (2 hunks)
- `runtimes/pytorch/ubi9-python-3.12/requirements.txt` (2 hunks)
- `runtimes/rocm-pytorch/ubi9-python-3.12/requirements.txt` (2 hunks)
- `runtimes/tensorflow/ubi9-python-3.12/requirements.txt` (2 hunks)
✅ Files skipped from review due to trivial changes (11)
- codeserver/ubi9-python-3.12/requirements.txt
- runtimes/rocm-pytorch/ubi9-python-3.12/requirements.txt
- runtimes/minimal/ubi9-python-3.12/requirements.txt
- jupyter/tensorflow/ubi9-python-3.12/requirements.txt
- runtimes/tensorflow/ubi9-python-3.12/requirements.txt
- runtimes/pytorch/ubi9-python-3.12/requirements.txt
- runtimes/datascience/ubi9-python-3.12/requirements.txt
- jupyter/datascience/ubi9-python-3.12/requirements.txt
- jupyter/trustyai/ubi9-python-3.12/requirements.txt
- jupyter/rocm/pytorch/ubi9-python-3.12/requirements.txt
- jupyter/pytorch/ubi9-python-3.12/requirements.txt
🧰 Additional context used
🧠 Learnings (11): collapsed section of previously recorded CodeRabbit review learnings (dependency pinning, wheel availability, Pipfile.lock regeneration, and CI pipeline conventions), applied to `jupyter/minimal/ubi9-python-3.12/requirements.txt`.
🔇 Additional comments (1)
jupyter/minimal/ubi9-python-3.12/requirements.txt (1)
Lines 324-350: No action needed. debugpy provides a universal "py2.py3-none-any" wheel: the 1.8.16 release ships `debugpy-1.8.16-py2.py3-none-any.whl`, so pip will install that on aarch64, ppc64le, s390x, etc., without falling back to source builds. All supported architectures are covered.
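A quick way to double-check that claim, in the same spirit as the verification scripts further below, might look like this (a sketch, not a command taken from the review):

```bash
#!/usr/bin/env bash
# Check whether debugpy 1.8.16 publishes a universal (py2.py3-none-any) wheel on PyPI.
set -euo pipefail
curl -s https://pypi.org/pypi/debugpy/1.8.16/json |
  jq -r '.urls[].filename' |
  grep 'py2\.py3-none-any\.whl'
```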
rpds-py==0.27.0; python_version >= '3.9' \
--hash=sha256:010c4843a3b92b54373e3d2291a7447d6c3fc29f591772cc2ea0e9f5c1da434b \
--hash=sha256:05284439ebe7d9f5f5a668d4d8a0a1d851d16f7d47c78e1fab968c8ad30cab04 \
--hash=sha256:0665be515767dc727ffa5f74bd2ef60b0ff85dad6bb8f50d91eaa6b5fb226f51 \
--hash=sha256:069e0384a54f427bd65d7fda83b68a90606a3835901aaff42185fcd94f5a9295 \
--hash=sha256:08680820d23df1df0a0260f714d12966bc6c42d02e8055a91d61e03f0c47dda0 \
--hash=sha256:0954e3a92e1d62e83a54ea7b3fdc9efa5d61acef8488a8a3d31fdafbfb00460d \
--hash=sha256:09965b314091829b378b60607022048953e25f0b396c2b70e7c4c81bcecf932e \
--hash=sha256:0c431bfb91478d7cbe368d0a699978050d3b112d7f1d440a41e90faa325557fd \
💡 Verification agent
🧩 Analysis chain
rpds-py 0.27.0 upgrade – verify Rust / wheel compatibility.
0.27.0 switches to rpds 0.18 and updates pyo3 bindings. In previous upgrades this occasionally broke older CPUs (e.g. POWER9) lacking recent GLIBCXX symbols.
Please confirm:
- Wheels are published for every architecture/ABI we build.
- No transitive Rust dependency bumps break musl-based downstream images.
🏁 Script executed:
#!/usr/bin/env bash
# Quick wheel presence check for rpds-py 0.27.0
set -euo pipefail
curl -s https://pypi.org/pypi/rpds-py/0.27.0/json |
jq -r '.releases["0.27.0"][].filename' |
grep -E 'cp312.*(x86_64|aarch64|ppc64le|s390x)' |
sort -u
Length of output: 230
🏁 Script executed:
#!/usr/bin/env bash
set -euo pipefail
# List all published versions of rpds-py
echo "Available rpds-py releases:"
curl -s https://pypi.org/pypi/rpds-py/json |
jq -r '.releases | keys[]' |
sort
version="0.27.0"
echo
echo "All distribution filenames for rpds-py $version:"
curl -s https://pypi.org/pypi/rpds-py/$version/json |
jq -r ".releases[\"$version\"][]?.filename" |
sort
echo
echo "Linux manylinux wheel files for $version (x86_64, aarch64, ppc64le, s390x):"
curl -s https://pypi.org/pypi/rpds-py/$version/json |
jq -r ".releases[\"$version\"][]?.filename" |
grep -E 'manylinux.*(x86_64|aarch64|ppc64le|s390x).*\.whl' |
sort -u
Length of output: 1154
🏁 Script executed:
#!/usr/bin/env bash
set -euo pipefail
for version in 0.27.0 0.26.0; do
echo "=== rpds-py $version distributions ==="
curl -s "https://pypi.org/pypi/rpds-py/$version/json" |
jq -r '.releases["'"$version"'"][] | "\(.filename) \(.packagetype)"' ||
echo " (no releases found)"
echo
done
Length of output: 719
rpds-py 0.27.0 has no prebuilt wheels – source build will be triggered
Our check of https://pypi.org/pypi/rpds-py/0.27.0/json (and 0.26.0) shows no bdist_wheel entries in the urls array, so pip will compile from source:
- Minimal (musl-based) images do not include a Rust toolchain or the C/C++ build deps needed by pyo3→rpds, so the install will fail.
- Even on glibc images, building from source may bump GLIBCXX symbols and break older CPUs.
Action required:
- Either switch back to a version with published manylinux wheels or
- Add the Rust toolchain (cargo, rustc) and necessary musl-gcc/libstdc++ build deps to the Dockerfiles so rpds-py can compile successfully.
🤖 Prompt for AI Agents
In jupyter/minimal/ubi9-python-3.12/requirements.txt around lines 1072 to 1080,
the rpds-py 0.27.0 package lacks prebuilt wheels, causing pip to build from
source which fails on minimal musl-based images without Rust and build
dependencies. To fix this, either revert to a version of rpds-py that provides
manylinux wheels or update the Dockerfiles to install the Rust toolchain (cargo,
rustc) and required musl-gcc/libstdc++ build dependencies so the package can
compile successfully during installation.
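If the second option is taken, the missing build dependencies might be added along these lines. This is a sketch only: the package names are assumptions for a UBI9-based image and are not taken from the actual Dockerfiles.

```bash
# Hypothetical Dockerfile RUN step: install a Rust toolchain and C/C++ build
# dependencies so rpds-py (pyo3-based) can compile from source when no wheel exists.
dnf install -y --setopt=install_weak_deps=0 \
    gcc gcc-c++ rust cargo python3.12-devel libffi-devel && \
    dnf clean all
```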
Force-pushed from 2970f8f to 6aaed8f (Compare)
/unhold undone
/override ci/prow/images
@jiridanek: Overrode contexts on behalf of jiridanek: ci/prow/images. In response to this:
Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.
/lgtm
Description
Before:

After:

How Has This Been Tested?
Merge criteria:
Summary by CodeRabbit