
NO-JIRA: workflows: allow parallelism with concurrency groups by adding Python version matrix in piplock-renewal.yaml #1714


Merged
1 commit merged into opendatahub-io:main from jd_parallel_locking on Aug 7, 2025

Conversation

jiridanek
Member

@jiridanek jiridanek commented Aug 7, 2025

Description

Before:
[Screenshot 2025-08-07 at 1 43 28 PM]

After:
[Screenshot 2025-08-07 at 1 43 36 PM]

How Has This Been Tested?

Merge criteria:

  • The commits are squashed in a cohesive manner and have meaningful messages.
  • Testing instructions have been added in the PR body (for PRs involving changes that are not immediately obvious).
  • The developer has manually tested the changes and verified that the changes work.

Summary by CodeRabbit

  • Chores
    • Updated workflow configuration to improve concurrency control for pipfile lock refresh jobs, allowing separate handling per Python version and branch.
    • Upgraded several Python package dependencies to newer versions across multiple environments for improved stability and security.

Commit: NO-JIRA: workflows: allow parallelism with concurrency groups by adding Python version matrix in `piplock-renewal.yaml`
@jiridanek jiridanek requested a review from atheo89 August 7, 2025 11:45
@jiridanek jiridanek added the tide/merge-method-squash Denotes a PR that should be squashed by tide when it merges. label Aug 7, 2025
@openshift-ci openshift-ci bot requested review from daniellutz and dibryant August 7, 2025 11:45
Contributor

coderabbitai bot commented Aug 7, 2025

Warning

Rate limit exceeded

@openshift-ci[bot] has exceeded the limit for the number of commits or files that can be reviewed per hour. Please wait 15 minutes and 11 seconds before requesting another review.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.


📥 Commits

Reviewing files that changed from the base of the PR and between a821f1a and 2970f8f.

⛔ Files ignored due to path filters (15)
  • codeserver/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • jupyter/datascience/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • jupyter/minimal/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • jupyter/pytorch/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • jupyter/rocm/pytorch/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • jupyter/rocm/tensorflow/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • jupyter/tensorflow/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • jupyter/trustyai/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • runtimes/datascience/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • runtimes/minimal/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • runtimes/pytorch/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • runtimes/rocm-pytorch/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • runtimes/rocm-tensorflow/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • runtimes/tensorflow/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
📒 Files selected for processing (10)
  • codeserver/ubi9-python-3.11/requirements.txt (1 hunks)
  • jupyter/datascience/ubi9-python-3.11/requirements.txt (2 hunks)
  • jupyter/minimal/ubi9-python-3.11/requirements.txt (2 hunks)
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt (4 hunks)
  • jupyter/pytorch/ubi9-python-3.11/requirements.txt (2 hunks)
  • jupyter/rocm/pytorch/ubi9-python-3.11/requirements.txt (2 hunks)
  • jupyter/rocm/tensorflow/ubi9-python-3.11/requirements.txt (2 hunks)
  • jupyter/tensorflow/ubi9-python-3.11/requirements.txt (2 hunks)
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt (3 hunks)
  • runtimes/datascience/ubi9-python-3.11/requirements.txt (2 hunks)

Walkthrough

The concurrency group configuration in the .github/workflows/piplock-renewal.yaml workflow was updated to include both the Python version and the GitHub reference. Additionally, multiple requirements.txt files across various environments were updated to bump package versions for debugpy (from 1.8.15 to 1.8.16), rpds-py (from 0.26.0 to 0.27.0), and safetensors (from 0.5.3 to 0.6.1), along with their corresponding SHA256 hashes.
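
For readers unfamiliar with GitHub Actions concurrency groups, here is a minimal sketch of the shape of this change. It is not copied from the repository: the job name, the listed Python versions, and the cancel-in-progress setting are assumptions made for illustration; only the idea of keying the concurrency group on matrix.python-version and github.ref comes from this PR.

# Hypothetical excerpt in the spirit of .github/workflows/piplock-renewal.yaml;
# the actual job layout and matrix values in the repository may differ.
jobs:
  refresh-pipfile-locks:                    # assumed job name
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.11", "3.12"]    # assumed versions, for illustration
    concurrency:
      # One group per Python version and branch, so matrix legs run in parallel
      # instead of queueing behind (or cancelling) each other.
      group: piplock-renewal-${{ matrix.python-version }}-${{ github.ref }}
      cancel-in-progress: true              # assumed; the real workflow may differ
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}

With a distinct group per Python version and ref, two matrix legs on the same branch no longer share one concurrency slot, which is what allows the lock-renewal jobs to run in parallel.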

Changes

Cohort / File(s) | Change Summary

• Workflow Concurrency Group Update
  .github/workflows/piplock-renewal.yaml
  Modified concurrency group key to include matrix.python-version and github.ref for finer concurrency control.
• Package Version and Hash Updates in Requirements
  codeserver/ubi9-python-3.12/requirements.txt,
  jupyter/datascience/ubi9-python-3.12/requirements.txt,
  jupyter/minimal/ubi9-python-3.12/requirements.txt,
  jupyter/pytorch/ubi9-python-3.12/requirements.txt,
  jupyter/rocm/pytorch/ubi9-python-3.12/requirements.txt,
  jupyter/tensorflow/ubi9-python-3.12/requirements.txt,
  jupyter/trustyai/ubi9-python-3.12/requirements.txt,
  runtimes/datascience/ubi9-python-3.12/requirements.txt,
  runtimes/minimal/ubi9-python-3.12/requirements.txt,
  runtimes/pytorch/ubi9-python-3.12/requirements.txt,
  runtimes/rocm-pytorch/ubi9-python-3.12/requirements.txt,
  runtimes/tensorflow/ubi9-python-3.12/requirements.txt
  Upgraded debugpy from 1.8.15 to 1.8.16 and rpds-py from 0.26.0 to 0.27.0 with updated SHA256 hashes; additionally, upgraded safetensors from 0.5.3 to 0.6.1 in jupyter/trustyai/ubi9-python-3.12/requirements.txt. No other changes made.

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~7 minutes

Possibly related PRs

Suggested labels

size/s

Suggested reviewers

  • paulovmr
  • andyatmiami

Contributor

openshift-ci bot commented Aug 7, 2025

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by:
Once this PR has been reviewed and has the lgtm label, please assign atheo89 for approval. For more information see the Code Review Process.

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@openshift-ci openshift-ci bot added the size/xs label Aug 7, 2025
@jiridanek
Member Author

/hold have to undo the actual piplock renewal

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 6aaed8f and a821f1a.

⛔ Files ignored due to path filters (12)
  • codeserver/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • jupyter/datascience/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • jupyter/minimal/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • jupyter/pytorch/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • jupyter/rocm/pytorch/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • jupyter/tensorflow/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • jupyter/trustyai/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • runtimes/datascience/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • runtimes/minimal/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • runtimes/pytorch/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • runtimes/rocm-pytorch/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • runtimes/tensorflow/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
📒 Files selected for processing (12)
  • codeserver/ubi9-python-3.12/requirements.txt (1 hunks)
  • jupyter/datascience/ubi9-python-3.12/requirements.txt (2 hunks)
  • jupyter/minimal/ubi9-python-3.12/requirements.txt (2 hunks)
  • jupyter/pytorch/ubi9-python-3.12/requirements.txt (2 hunks)
  • jupyter/rocm/pytorch/ubi9-python-3.12/requirements.txt (2 hunks)
  • jupyter/tensorflow/ubi9-python-3.12/requirements.txt (2 hunks)
  • jupyter/trustyai/ubi9-python-3.12/requirements.txt (3 hunks)
  • runtimes/datascience/ubi9-python-3.12/requirements.txt (2 hunks)
  • runtimes/minimal/ubi9-python-3.12/requirements.txt (2 hunks)
  • runtimes/pytorch/ubi9-python-3.12/requirements.txt (2 hunks)
  • runtimes/rocm-pytorch/ubi9-python-3.12/requirements.txt (2 hunks)
  • runtimes/tensorflow/ubi9-python-3.12/requirements.txt (2 hunks)
✅ Files skipped from review due to trivial changes (11)
  • codeserver/ubi9-python-3.12/requirements.txt
  • runtimes/rocm-pytorch/ubi9-python-3.12/requirements.txt
  • runtimes/minimal/ubi9-python-3.12/requirements.txt
  • jupyter/tensorflow/ubi9-python-3.12/requirements.txt
  • runtimes/tensorflow/ubi9-python-3.12/requirements.txt
  • runtimes/pytorch/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
  • jupyter/datascience/ubi9-python-3.12/requirements.txt
  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • jupyter/rocm/pytorch/ubi9-python-3.12/requirements.txt
  • jupyter/pytorch/ubi9-python-3.12/requirements.txt
🧰 Additional context used
🧠 Learnings (11)
📓 Common learnings
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1379
File: .tekton/odh-workbench-jupyter-datascience-cpu-py312-ubi9-push.yaml:14-17
Timestamp: 2025-07-11T11:15:47.424Z
Learning: jiridanek requested GitHub issue creation for CEL filter problem in datascience workbench Tekton pipelines during PR #1379 review. Issue #1383 was successfully created with comprehensive problem description covering both Python 3.11 and 3.12 pipelines incorrectly watching jupyter/minimal directories instead of jupyter/datascience directories, detailed impact analysis of pipeline execution failures, complete solution with before/after code examples, thorough acceptance criteria for path updates and pipeline triggering verification, implementation notes about repository structure alignment, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1379
File: .tekton/odh-workbench-codeserver-datascience-cpu-py312-ubi9-push.yaml:38-142
Timestamp: 2025-07-11T11:14:26.823Z
Learning: jiridanek requested GitHub issue creation for missing PipelineRun timeout configuration during PR #1379 review, affecting 18 out of 32 total PipelineRun files across both Python 3.11 and 3.12 pipelines. A comprehensive issue was created covering resource monopolization risks, complete affected files analysis, standard 8-hour timeout solution, detailed acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1379
File: .tekton/odh-workbench-codeserver-datascience-cpu-py312-ubi9-push.yaml:38-142
Timestamp: 2025-07-11T11:14:26.823Z
Learning: jiridanek requested GitHub issue creation for missing PipelineRun timeout configuration during PR #1379 review, affecting 18 out of 32 total PipelineRun files across both Python 3.11 and 3.12 pipelines. Issue #1380 was successfully created with comprehensive problem description covering resource monopolization risks, complete affected files analysis (18 missing, 14 correct), standard 8-hour timeout solution with code examples, detailed acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:16:05.131Z
Learning: jiridanek requested GitHub issue creation for RStudio py311 Tekton push pipelines during PR #1379 review. Issue #1384 was successfully created covering two RStudio variants (CPU and CUDA) found in manifests/base/params-latest.env, with comprehensive problem description, implementation requirements following the same pattern as other workbench pipelines, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-09T09:12:44.088Z
Learning: jiridanek requested GitHub issue creation for GitHub Actions artifact naming conflict during PR #1357 review, specifically for a failing actions/upload-artifactv4 step with 409 Conflict error. Issue was created with comprehensive problem description covering artifact naming conflicts, root cause analysis of duplicate names in concurrent workflows, four solution options (enhanced naming, overwriting, conditional uploads, matrix-aware naming) with code examples, detailed acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic CI/CD and code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-21T15:06:04.114Z
Learning: jiridanek requested GitHub issue creation for multi-platform dependency locking investigation during PR #1396 review. Issue #1423 was successfully created with comprehensive problem description covering ARM64 wheel availability but being ignored due to AMD64-only dependency locking, root cause analysis of platform-specific pipenv limitations, immediate conditional installation solution, multi-platform locking ecosystem analysis, broader affected areas investigation, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:16:05.131Z
Learning: jiridanek requested GitHub issue creation for adding RStudio py311 Tekton push pipelines during PR #1379 review, referencing existing registry entries in manifests/base/params-latest.env but missing corresponding .tekton pipeline files. A comprehensive issue was created with detailed problem description, implementation requirements following the same pattern as other workbench pipelines, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-08-05T17:24:08.616Z
Learning: jiridanek requested PR review for #1521 covering s390x architecture support improvements, demonstrating continued focus on systematic multi-architecture compatibility enhancements in the opendatahub-io/notebooks repository through clean implementation with centralized configuration, proper CI integration, and architecture-aware testing patterns.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm=~=2.18.1. The solution requires updating Pipfile sources to include https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl and regenerating Pipfile.lock using the piplock-refresh GitHub Action.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:17-18
Timestamp: 2025-08-01T16:10:21.049Z
Learning: jiridanek requested GitHub issue creation for version pinning improvement of micropipenv and uv packages during PR #968 review. Issue #1595 was created with comprehensive problem description covering build reproducibility risks from unpinned package versions across 18 affected Dockerfiles, two solution options (direct pinning vs build arguments), implementation approach with testing requirements, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-26T16:19:31.295Z
Learning: In the opendatahub-io/notebooks repository, the Playwright Docker image version in `.github/workflows/build-notebooks-TEMPLATE.yaml` (format: `mcr.microsoft.com/playwright:v1.53.1-noble`) must always match the `playwright/test` version specified in the `tests/browser/package.json` file. Both versions need to be updated together to maintain consistency between CI/CD pipeline and project dependencies.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-26T16:19:31.295Z
Learning: In the opendatahub-io/notebooks repository, the Playwright Docker image version in `.github/workflows/build-notebooks-TEMPLATE.yaml` (format: `mcr.microsoft.com/playwright:v1.53.1-noble`) must always match the `playwright/test` version specified in the `tests/browser/package.json` file. Both versions need to be updated together to maintain consistency between CI/CD pipeline and project dependencies.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/utils/pip.conf:3-4
Timestamp: 2025-07-08T19:21:11.512Z
Learning: jiridanek requested GitHub issue creation for PYTHONPATH configuration investigation in runtime images during PR #1333 review. Issue #1339 was created with comprehensive problem description covering pip.conf target directory configuration, missing PYTHONPATH exports, potential runtime import failures, multiple investigation areas (PYTHONPATH auditing, pip configuration consistency, runtime testing), solution options with implementation guidance, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1325
File: jupyter/pytorch/ubi9-python-3.12/Pipfile:42-42
Timestamp: 2025-07-09T14:22:14.553Z
Learning: jiridanek requested GitHub issue creation for Pipfile.lock verification script implementation during PR #1325 review, specifically to systematize the manual verification process for dependency version consistency across all lock files using jq. Issue #1367 was created with comprehensive problem description covering manual verification challenges, detailed solution with jq-based verification script, enhanced features for CI integration, clear acceptance criteria, implementation areas breakdown, benefits analysis, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:1-3
Timestamp: 2025-07-03T12:07:19.365Z
Learning: jiridanek consistently requests GitHub issue creation for technical improvements identified during code reviews in opendatahub-io/notebooks, ensuring systematic tracking of code quality enhancements like shell script portability issues with comprehensive descriptions, solution options, and acceptance criteria.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:138-140
Timestamp: 2025-08-01T16:07:58.701Z
Learning: jiridanek prefers architectural solutions that eliminate problems entirely rather than just fixing immediate technical issues. When presented with a pipeline safety concern about micropipenv requirements generation, he suggested removing micropipenv from the build process altogether by using pre-committed requirements.txt files, demonstrating preference for simplification and deterministic builds over complex workarounds.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1600
File: Makefile:0-0
Timestamp: 2025-08-02T08:49:03.735Z
Learning: jiridanek decided to eliminate '+' characters entirely from Makefile target names during PR #1600 review instead of implementing complex substitution workarounds, demonstrating his consistent preference for architectural solutions that eliminate problems at the source rather than adding complexity to handle edge cases.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/utils/process.sh:17-19
Timestamp: 2025-07-03T14:00:00.909Z
Learning: jiridanek efficiently identifies when CodeRabbit review suggestions are already covered by existing comprehensive issues, demonstrating excellent issue management and avoiding duplicate tracking of the same improvements across multiple locations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1151
File: jupyter/tensorflow/ubi9-python-3.12/test/test_notebook.ipynb:31-34
Timestamp: 2025-07-01T07:03:05.385Z
Learning: jiridanek demonstrates excellent pattern recognition for identifying duplicated code issues across the opendatahub-io/notebooks repository. When spotting a potential problem in test notebooks, he correctly assesses that such patterns are likely replicated across multiple similar files rather than being isolated incidents, leading to more effective systematic solutions.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:56-66
Timestamp: 2025-07-02T18:19:49.397Z
Learning: jiridanek consistently creates comprehensive follow-up GitHub issues for security concerns raised during PR reviews in opendatahub-io/notebooks, ensuring systematic tracking and resolution of supply-chain security improvements like GPG signature verification for package repositories.
📚 Learning: trustyai's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with trustyai's visua...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.

Applied to files:

  • jupyter/minimal/ubi9-python-3.12/requirements.txt
📚 Learning: the jupyter-bokeh package was previously pinned to version 3.0.5 in the trustyai notebook image due ...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: The jupyter-bokeh package was previously pinned to version 3.0.5 in the TrustyAI notebook image due to compatibility requirements with TrustyAI components, as indicated by the comment "Should be pinned down to this version in order to be compatible with trustyai" that was removed in this update.

Applied to files:

  • jupyter/minimal/ubi9-python-3.12/requirements.txt
📚 Learning: during pr #968 review, coderabbit initially incorrectly identified 1 legitimate micropipenv usage in...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:17-18
Timestamp: 2025-08-06T15:33:47.175Z
Learning: During PR #968 review, CodeRabbit initially incorrectly identified 1 legitimate micropipenv usage in jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda when analyzing Python 3.12 images for unused dependencies. Upon jiridanek's request for re-verification, comprehensive analysis revealed all 15 Python 3.12 Dockerfiles install micropipenv but none actually use it, making the cleanup scope 100% unnecessary installations with no exceptions to handle.

Applied to files:

  • jupyter/minimal/ubi9-python-3.12/requirements.txt
📚 Learning: jiridanek requested github issue creation for jupyter-client dependency pinning inconsistency during...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • jupyter/minimal/ubi9-python-3.12/requirements.txt
📚 Learning: trustyai explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.tx...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.

Applied to files:

  • jupyter/minimal/ubi9-python-3.12/requirements.txt
📚 Learning: jiridanek indicated that the current practice for runtime-images json files in opendatahub-io/notebo...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1519
File: jupyter/pytorch+llmcompressor/ubi9-python-3.11/runtime-images/llmcompressor-pytorch-ubi9-py311.json:2-9
Timestamp: 2025-07-29T16:00:31.637Z
Learning: jiridanek indicated that the current practice for runtime-images JSON files in opendatahub-io/notebooks has changed significantly from the SHA256 digest pinning pattern, and that rebasing PR #1519 would reveal the new practice which is "something completely different" from the existing pattern.

Applied to files:

  • jupyter/minimal/ubi9-python-3.12/requirements.txt
📚 Learning: grdryn corrected coderabbit's false assessment about cuda companion package wheel availability durin...
Learnt from: grdryn
PR: opendatahub-io/notebooks#1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Applied to files:

  • jupyter/minimal/ubi9-python-3.12/requirements.txt
📚 Learning: jiridanek corrected coderabbit's false positive about urllib3/requests version incompatibility in pr...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/minimal/ubi9-python-3.12/Pipfile:23-27
Timestamp: 2025-07-09T08:12:05.822Z
Learning: jiridanek corrected CodeRabbit's false positive about urllib3/requests version incompatibility in PR #1333 review. Investigation revealed that requests 2.32.3 actually requires `urllib3<3,>=1.21.1` (not `urllib3<2` as incorrectly stated), making urllib3 2.5.0 perfectly compatible. This demonstrates the importance of verifying technical assessments against actual build results and PyPI metadata before creating issues.

Applied to files:

  • jupyter/minimal/ubi9-python-3.12/requirements.txt
📚 Learning: the tensorflow rocm python 3.12 compatibility issue in opendatahub-io/notebooks pr #1259 was caused ...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm=~=2.18.1. The solution requires updating Pipfile sources to include https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Applied to files:

  • jupyter/minimal/ubi9-python-3.12/requirements.txt
📚 Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding pipfiles durin...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Applied to files:

  • jupyter/minimal/ubi9-python-3.12/requirements.txt
🔇 Additional comments (1)
jupyter/minimal/ubi9-python-3.12/requirements.txt (1)

324-350: No action needed—debugpy provides a universal “py2.py3-none-any” wheel

The 1.8.16 release ships a debugpy-1.8.16-py2.py3-none-any.whl, so pip will install that on aarch64, ppc64le, s390x, etc., without falling back to source builds. All supported architectures are covered.

Comment on lines 1072 to 1080
rpds-py==0.27.0; python_version >= '3.9' \
--hash=sha256:010c4843a3b92b54373e3d2291a7447d6c3fc29f591772cc2ea0e9f5c1da434b \
--hash=sha256:05284439ebe7d9f5f5a668d4d8a0a1d851d16f7d47c78e1fab968c8ad30cab04 \
--hash=sha256:0665be515767dc727ffa5f74bd2ef60b0ff85dad6bb8f50d91eaa6b5fb226f51 \
--hash=sha256:069e0384a54f427bd65d7fda83b68a90606a3835901aaff42185fcd94f5a9295 \
--hash=sha256:08680820d23df1df0a0260f714d12966bc6c42d02e8055a91d61e03f0c47dda0 \
--hash=sha256:0954e3a92e1d62e83a54ea7b3fdc9efa5d61acef8488a8a3d31fdafbfb00460d \
--hash=sha256:09965b314091829b378b60607022048953e25f0b396c2b70e7c4c81bcecf932e \
--hash=sha256:0c431bfb91478d7cbe368d0a699978050d3b112d7f1d440a41e90faa325557fd \
Contributor


💡 Verification agent

🧩 Analysis chain

rpds-py 0.27.0 upgrade – verify Rust / wheel compatibility.

0.27.0 switches to rpds 0.18 and updates pyo3 bindings. In previous upgrades this occasionally broke older CPUs (e.g. POWER9) lacking recent GLIBCXX symbols.

Please confirm:

  1. Wheels are published for every architecture/ABI we build.
  2. No transitive Rust dependency bumps break musl-based downstream images.

🏁 Script executed:

#!/usr/bin/env bash
# Quick wheel presence check for rpds-py 0.27.0
set -euo pipefail
curl -s https://pypi.org/pypi/rpds-py/0.27.0/json |
  jq -r '.releases["0.27.0"][].filename' |
  grep -E 'cp312.*(x86_64|aarch64|ppc64le|s390x)' |
  sort -u

Length of output: 230


🏁 Script executed:

#!/usr/bin/env bash
set -euo pipefail

# List all published versions of rpds-py
echo "Available rpds-py releases:"
curl -s https://pypi.org/pypi/rpds-py/json |
  jq -r '.releases | keys[]' |
  sort

version="0.27.0"
echo
echo "All distribution filenames for rpds-py $version:"
curl -s https://pypi.org/pypi/rpds-py/$version/json |
  jq -r ".releases[\"$version\"][]?.filename" |
  sort

echo
echo "Linux manylinux wheel files for $version (x86_64, aarch64, ppc64le, s390x):"
curl -s https://pypi.org/pypi/rpds-py/$version/json |
  jq -r ".releases[\"$version\"][]?.filename" |
  grep -E 'manylinux.*(x86_64|aarch64|ppc64le|s390x).*\\.whl' |
  sort -u

Length of output: 1154


🏁 Script executed:

#!/usr/bin/env bash
set -euo pipefail

for version in 0.27.0 0.26.0; do
  echo "=== rpds-py $version distributions ==="
  curl -s "https://pypi.org/pypi/rpds-py/$version/json" |
    jq -r '.releases["'"$version"'"][] | "\(.filename) \(.packagetype)"' ||
    echo "  (no releases found)"
  echo
done

Length of output: 719


rpds-py 0.27.0 has no prebuilt wheels – source build will be triggered

Our check of https://pypi.org/pypi/rpds-py/0.27.0/json (and 0.26.0) shows no bdist_wheel entries in the urls array, so pip will compile from source:

  • Minimal (musl-based) images do not include a Rust toolchain or the C/C++ build deps needed by pyo3→rpds, so the install will fail.
  • Even on glibc images, building from source may bump GLIBCXX symbols and break older CPUs.

Action required:

  • Either switch back to a version with published manylinux wheels or
  • Add the Rust toolchain (cargo, rustc) and necessary musl-gcc/libstdc++ build deps to the Dockerfiles so rpds-py can compile successfully.
🤖 Prompt for AI Agents
In jupyter/minimal/ubi9-python-3.12/requirements.txt around lines 1072 to 1080,
the rpds-py 0.27.0 package lacks prebuilt wheels, causing pip to build from
source which fails on minimal musl-based images without Rust and build
dependencies. To fix this, either revert to a version of rpds-py that provides
manylinux wheels or update the Dockerfiles to install the Rust toolchain (cargo,
rustc) and required musl-gcc/libstdc++ build dependencies so the package can
compile successfully during installation.

@openshift-ci openshift-ci bot added size/xxl and removed size/xxl labels Aug 7, 2025
@jiridanek jiridanek force-pushed the jd_parallel_locking branch from 2970f8f to 6aaed8f on August 7, 2025 12:07
@jiridanek
Member Author

/unhold undone

@jiridanek
Member Author

/override ci/prow/images
n/a

Contributor

openshift-ci bot commented Aug 7, 2025

@jiridanek: Overrode contexts on behalf of jiridanek: ci/prow/images

In response to this:

/override ci/prow/images
n/a

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.

@atheo89
Member

atheo89 commented Aug 7, 2025

/lgtm

@openshift-ci openshift-ci bot added the lgtm label Aug 7, 2025
@jiridanek jiridanek merged commit 2ff6239 into opendatahub-io:main Aug 7, 2025
12 of 18 checks passed
@jiridanek jiridanek deleted the jd_parallel_locking branch August 7, 2025 12:12
Labels
lgtm size/xs tide/merge-method-squash Denotes a PR that should be squashed by tide when it merges.