
Optimize and Enable Runtime Datascience Images for IBM Z (s390x) (Python 3.11 & 3.12) #1513

Open · wants to merge 3 commits into main

Conversation

@Nash-123 (Contributor) commented Jul 25, 2025

Summary

This PR introduces s390x (IBM Z) support and build optimizations for the runtime-datascience notebook images for Python 3.11 and 3.12.

  • Reduced build time for s390x: ~2–2.3 hours (down from ~4 hours when building natively)
  • No impact on other platforms (x86_64, arm64, etc.)
  • Elyra-compatible: validated for notebook execution

Key Changes

Dockerfile.cpu (Python 3.11 & 3.12)

  • Introduced ARG TARGETARCH for architecture-specific builds
  • Multi-stage build using an s390x-builder stage to compile pyarrow from source on IBM Z
  • Enabled cache mounts for dnf and pip
  • Optimizations for s390x:
    • Installs Rust & Cargo
    • Uses compiler flags (CFLAGS="-O3")
    • Installs the development dependencies required for pyarrow compilation
  • Package installation and build logic gated via TARGETARCH (see the sketch below)
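
For orientation, here is a minimal sketch of the gating pattern described above; the stage name, base image, and package list are illustrative and not the PR's exact diff:

# Builder stage: compile a pyarrow wheel only when targeting s390x.
FROM registry.access.redhat.com/ubi9/python-311 AS s390x-builder
ARG TARGETARCH
USER 0
RUN --mount=type=cache,target=/var/cache/dnf \
    if [ "$TARGETARCH" = "s390x" ]; then \
        dnf install -y gcc gcc-c++ cmake make openssl-devel; \
    fi
RUN --mount=type=cache,target=/root/.cache/pip \
    if [ "$TARGETARCH" = "s390x" ]; then \
        CFLAGS="-O3" pip wheel "pyarrow~=21.0.0" --wheel-dir /tmp/wheels; \
    else \
        mkdir -p /tmp/wheels; \
    fi

# Final stage: install the locally built wheel on s390x; other architectures
# get pyarrow from PyPI via the lock file, so the staging directory is removed.
FROM registry.access.redhat.com/ubi9/python-311
ARG TARGETARCH
COPY --from=s390x-builder /tmp/wheels /tmp/wheels
RUN if [ "$TARGETARCH" = "s390x" ]; then pip install /tmp/wheels/*.whl; fi && \
    rm -rf /tmp/wheels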

Pipfile (Python 3.11 & 3.12)

Architecture-specific markers added to exclude incompatible packages on s390x:

codeflare-sdk = { version = "~=0.30.0", markers = "platform_machine != 's390x'" }
py-spy        = { version = "~=0.4.0", markers = "platform_machine != 's390x'" }
ray           = { version = "~=2.47.1", markers = "platform_machine != 's390x'", extras = ["data", "default"] }
pyarrow       = { version = "~=21.0.0", markers = "platform_machine != 's390x'" }

Regenerated the Pipfile.lock files using the Renewal Action


Validation & Testing

  • Builds completed successfully on an s390x LPAR for Python 3.11 and 3.12
  • Build time reduced to ~2 hours
  • Verified compatibility by deploying on a local OpenShift cluster on an s390x LPAR

Sample Notebook Validation:

Build & Push:

[screenshot: build & push output]

Validate Runtime Image (version 3.11):

+ bin/kubectl exec runtime-runtime-pod -- /bin/sh -c 'curl https://raw.githubusercontent.com/opendatahub-io/elyra/refs/heads/main/etc/generic/requirements-elyra.txt --output req.txt && 				python3 -m pip install -r req.txt > /dev/null && 				curl https://raw.githubusercontent.com/nteract/papermill/main/papermill/tests/notebooks/simple_execute.ipynb --output simple_execute.ipynb && 				python3 -m papermill simple_execute.ipynb output.ipynb > /dev/null'
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  1368  100  1368    0     0   7907      0 --:--:-- --:--:-- --:--:--  7907

[notice] A new release of pip is available: 24.2 -> 25.1.1
[notice] To update, run: pip install --upgrade pip
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   910  100   910    0     0   3791      0 --:--:-- --:--:-- --:--:--  3791
Input Notebook:  simple_execute.ipynb
Output Notebook: output.ipynb
Executing:   0%|          | 0/3 [00:00<?, ?cell/s]Executing notebook with kernel: python3
Executing DESC: 100%|██████████| 3/3 [00:01<00:00,  2.04cell/s]
+ '[' 0 -eq 1 ']'
+ echo '=> Container image runtime-datascience-ubi9-python-3.11 is a suitable Elyra runtime image'
=> Container image runtime-datascience-ubi9-python-3.11 is a suitable Elyra runtime image

Validate Runtime Image (version 3.12):

=> Checking container image runtime-datascience-ubi9-python-3.12 for python3...
+ bin/kubectl exec runtime-pod which python3
+ '[' python3 == python3 ']'
+ echo '=> Checking notebook execution...'
=> Checking notebook execution...
+ bin/kubectl exec runtime-pod -- /bin/sh -c 'curl https://raw.githubusercontent.com/opendatahub-io/elyra/refs/heads/main/etc/generic/requirements-elyra.txt --output req.txt && 				python3 -m pip install -r req.txt > /dev/null && 				curl https://raw.githubusercontent.com/nteract/papermill/main/papermill/tests/notebooks/simple_execute.ipynb --output simple_execute.ipynb && 				python3 -m papermill simple_execute.ipynb output.ipynb > /dev/null'
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  1368  100  1368    0     0   6705      0 --:--:-- --:--:-- --:--:--  6738

[notice] A new release of pip is available: 24.2 -> 25.1.1
[notice] To update, run: pip install --upgrade pip
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   910  100   910    0     0   3939      0 --:--:-- --:--:-- --:--:--  3922
Input Notebook:  simple_execute.ipynb
Output Notebook: output.ipynb
Executing:   0%|          | 0/3 [00:00<?, ?cell/s]Executing notebook with kernel: python3
Executing DESC: 100%|██████████| 3/3 [00:01<00:00,  2.09cell/s]
+ '[' 0 -eq 1 ']'
+ echo '=> Container image runtime-datascience-ubi9-python-3.12 is a suitable Elyra runtime image'
=> Container image runtime-datascience-ubi9-python-3.12 is a suitable Elyra runtime image

cc: @jiridanek

Summary by CodeRabbit

  • New Features

    • Added support for building and running datascience runtimes on the s390x architecture for Python 3.11 and 3.12 images.
    • Introduced architecture-specific package installation and build steps, including custom handling for pyarrow and Rust tooling on s390x.
  • Bug Fixes

    • Excluded incompatible packages (codeflare-sdk, py-spy, ray, and pyarrow) from installation on s390x to prevent build and runtime issues.
  • Chores

    • Improved build caching and permissions handling for better performance and compatibility with OpenShift environments.
    • Updated build configuration to recognize new s390x-compatible runtimes.
    • Updated package installation conditions to exclude s390x architecture, enhancing compatibility and build reliability.
    • Bumped versions of select Python packages in Jupyter requirements for improved stability and security.

@openshift-ci openshift-ci bot requested review from caponetto and paulovmr July 25, 2025 21:39
openshift-ci bot (Contributor) commented Jul 25, 2025

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by:
Once this PR has been reviewed and has the lgtm label, please assign andyatmiami for approval. For more information see the Code Review Process.

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

coderabbitai bot (Contributor) commented Jul 25, 2025

Walkthrough

This change introduces architecture-aware logic to the datascience runtime Dockerfiles and Pipfiles for both Python 3.11 and 3.12. Conditional build steps, package installations, and environment variable settings are now applied based on the TARGETARCH build argument, with special handling for the s390x architecture. The build matrix is also updated to reflect s390x compatibility. Additionally, requirements files and Jupyter environment dependencies are updated with platform-specific installation markers and package version bumps.

Changes

  • Python 3.11 Dockerfile & Pipfile updates
    Files: runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu, runtimes/datascience/ubi9-python-3.11/Pipfile, runtimes/datascience/ubi9-python-3.11/requirements.txt
    Summary: Adds architecture-aware logic for package installation, Rust setup, and wheel building. Excludes several Python packages from s390x builds via Pipfile and requirements.txt markers, replacing Python-version conditions with platform-machine checks.
  • Python 3.12 Dockerfile, Pipfile & requirements updates
    Files: runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu, runtimes/datascience/ubi9-python-3.12/Pipfile, runtimes/datascience/ubi9-python-3.12/requirements.txt
    Summary: Introduces an s390x-specific build stage, conditional package and Rust installation, and platform markers in the Pipfile and requirements.txt to exclude incompatible packages on s390x; updates package version conditions from Python version to platform machine.
  • Build matrix update
    Files: ci/cached-builds/gen_gha_matrix_jobs.py
    Summary: Adds the new datascience runtime targets for Python 3.11 and 3.12 to the s390x-compatible set.
  • Jupyter environment package version bumps
    Files: jupyter/trustyai/ubi9-python-3.11/requirements.txt, jupyter/trustyai/ubi9-python-3.12/requirements.txt, jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
    Summary: Updates huggingface-hub, safetensors, and openai package versions and their hashes in the Jupyter environment requirements files.

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~15 minutes

Suggested labels

lgtm, size/l

Suggested reviewers

  • atheo89
  • jiridanek


openshift-ci bot (Contributor) commented Jul 25, 2025

Hi @Nash-123. Thanks for your PR.

I'm waiting for an opendatahub-io member to verify that this patch is reasonable to test. If it is, they should reply with /ok-to-test on its own line. Until that is done, I will not automatically test new commits in this PR, but the usual testing commands by org members will still work. Regular contributors should join the org to skip this step.

Once the patch is verified, the new status will be reflected by the ok-to-test label.

I understand the commands that are listed here.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.

@@ -435,7 +435,7 @@
--hash=sha256:f26b383144cf2d2c29f01a1e8170f50dacf0eac02d64139dcd709a8ac4eb3cfe \
--hash=sha256:f939a054192ddc596e031e50bb13b657ce318cf13d264f095ce9db7dc6ae81c0 \
--hash=sha256:fd93cc7f3139b6dd7aab2f26a90dde0aa9fc264dbf70f6740d498a70b860b82c
-cryptography==43.0.3; python_version >= '3.7' \
+cryptography==43.0.3; platform_machine != 's390x' \

Check notice

Code scanning / Trivy

openssl: RFC7250 handshakes with unauthenticated servers don't abort as expected (Low, library)

Package: cryptography
Installed Version: 43.0.3
Vulnerability: CVE-2024-12797
Severity: LOW
Fixed Version: 44.0.1
Link: CVE-2024-12797
@openshift-ci openshift-ci bot added size/xxl and removed size/xxl labels Jul 25, 2025
coderabbitai bot (Contributor) left a comment

Actionable comments posted: 13

🔭 Outside diff range comments (1)
runtimes/datascience/ubi9-python-3.12/requirements.txt (1)

438-465: Upgrade cryptography to ≥ 44.0.1 – CVE-2024-12797
Trivy flags 43.0.3 as vulnerable (RFC7250 handshake bypass). The fix is available in 44.0.1.

-cryptography==43.0.3; platform_machine != 's390x' \
+cryptography==44.0.1; platform_machine != 's390x' \
🧹 Nitpick comments (6)
runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu (2)

26-37: Rust bootstrap script executed without checksum or sig-verification.

curl https://sh.rustup.rs | sh is convenient but opaque. To harden the image:

  1. Download the script and verify SHA256 (Rust publishes checksums).
  2. Pin to a known version with RUSTUP_INIT env or --default-toolchain flag.

At minimum, store the expected digest in the Dockerfile and fail the build on mismatch.
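
One way to satisfy both points, sketched with a placeholder digest (the real value comes from the rustup-init.sha256 file that Rust publishes next to each release artifact; the pinned version below is only an example):

ARG RUSTUP_VERSION=1.27.1
ARG RUSTUP_SHA256=<expected-digest-from-rustup-init.sha256>
RUN curl -fsSLO "https://static.rust-lang.org/rustup/archive/${RUSTUP_VERSION}/s390x-unknown-linux-gnu/rustup-init" && \
    # Fail the build if the downloaded installer does not match the pinned digest.
    echo "${RUSTUP_SHA256}  rustup-init" | sha256sum -c - && \
    chmod +x rustup-init && \
    ./rustup-init -y --profile minimal --default-toolchain stable && \
    rm -f rustup-init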


130-143: Wheel copy step leaves an empty directory behind on non-s390x runs.

COPY --from=s390x-builder /tmp/wheels /tmp/wheels copies even when the directory is empty (non-s390x build).
That is fine, but the subsequent find will echo “No pyarrow wheel found!” only when $WHEELS is empty inside the if branch. Because the if branch runs only on s390x, non-s390x builds silently keep the empty directory around.

Add rm -rf /tmp/wheels also in the non-s390x path to avoid leaving junk layers.
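
A minimal sketch of that suggestion (the wheel filename pattern is assumed):

COPY --from=s390x-builder /tmp/wheels /tmp/wheels
RUN if [ "$TARGETARCH" = "s390x" ]; then \
        pip install /tmp/wheels/pyarrow-*.whl; \
    fi && \
    # Remove the staging directory on every architecture, not just s390x.
    rm -rf /tmp/wheels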

jupyter/trustyai/ubi9-python-3.12/requirements.txt (1)

918-920: Confirm huggingface-hub 0.34.1 doesn’t outrun pinned transformers
transformers==4.52.4 currently requires huggingface-hub<1.0. The new 0.34.x branch is still compatible, but keep an eye on future minor bumps; once huggingface-hub reaches 1.0, the constraint will tighten.

runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (3)

26-40: Persist Rust tooling in PATH via ENV, not login-shell profiles.

/etc/profile.d/cargo.sh is sourced only for login shells, so non-interactive processes (e.g. pip building wheels later in the Dockerfile) won’t see /opt/.cargo/bin.
Add an explicit ENV so every layer inherits the correct PATH/CARGO_HOME.

 RUN if [ "$TARGETARCH" = "s390x" ]; then \
@@
     echo 'export GRPC_PYTHON_BUILD_SYSTEM_OPENSSL=1' >> /etc/profile.d/cargo.sh; \
 fi
+# Ensure non-login shells see Rust binaries
+ENV PATH=/opt/.cargo/bin:$PATH \
+    CARGO_HOME=/opt/.cargo \
+    GRPC_PYTHON_BUILD_SYSTEM_OPENSSL=1

52-55: pip cache mount points should be writable by the current user.

The cache target is /root/.cache/pip while the active user is 1001; this will be owned by root and may become read-only for the non-root user, defeating the cache and occasionally failing writes.
Consider either switching back to USER 0 just for this instruction or changing the mount target to $HOME/.cache/pip.
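
BuildKit cache mounts also accept uid/gid/mode options, so a third option is to keep USER 1001 and hand the cache to that user. A sketch, assuming HOME=/opt/app-root/src and an illustrative requirements file:

USER 1001
RUN --mount=type=cache,target=/opt/app-root/src/.cache/pip,uid=1001,gid=0 \
    # The cache directory is now owned by the non-root build user.
    pip install -r requirements.txt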


153-160: Consider running micropipenv install as non-root to avoid FS permission fixes.

Because USER 0 is still active here, the subsequent chmod -R g+w dance is required.
If you adopt the previous suggestion to reset to USER 1001 before this RUN step, you can drop the manual permission fix and reduce layer size/time.
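
A minimal sketch of that suggestion, assuming the Python tree is already group-writable for GID 0 (the usual UBI image convention):

USER 1001
# Running the install as the non-root user avoids the follow-up chmod -R g+w pass.
RUN micropipenv install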

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 9d2bac1 and 8053c7e.

⛔ Files ignored due to path filters (12)
  • codeserver/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • jupyter/datascience/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • jupyter/minimal/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • jupyter/pytorch/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • jupyter/rocm/pytorch/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • jupyter/tensorflow/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • jupyter/trustyai/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • runtimes/datascience/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • runtimes/minimal/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • runtimes/pytorch/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • runtimes/rocm-pytorch/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • runtimes/tensorflow/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
📒 Files selected for processing (16)
  • codeserver/ubi9-python-3.12/requirements.txt (2 hunks)
  • jupyter/datascience/ubi9-python-3.12/requirements.txt (4 hunks)
  • jupyter/minimal/ubi9-python-3.12/requirements.txt (2 hunks)
  • jupyter/pytorch/ubi9-python-3.12/requirements.txt (4 hunks)
  • jupyter/rocm/pytorch/ubi9-python-3.12/requirements.txt (4 hunks)
  • jupyter/tensorflow/ubi9-python-3.12/requirements.txt (5 hunks)
  • jupyter/trustyai/ubi9-python-3.12/requirements.txt (4 hunks)
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu (2 hunks)
  • runtimes/datascience/ubi9-python-3.11/Pipfile (1 hunks)
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (2 hunks)
  • runtimes/datascience/ubi9-python-3.12/Pipfile (1 hunks)
  • runtimes/datascience/ubi9-python-3.12/requirements.txt (24 hunks)
  • runtimes/minimal/ubi9-python-3.12/requirements.txt (1 hunks)
  • runtimes/pytorch/ubi9-python-3.12/requirements.txt (3 hunks)
  • runtimes/rocm-pytorch/ubi9-python-3.12/requirements.txt (3 hunks)
  • runtimes/tensorflow/ubi9-python-3.12/requirements.txt (4 hunks)
🧰 Additional context used
🧠 Learnings (17)
📓 Common learnings
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:50-50
Timestamp: 2025-07-08T19:30:01.738Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in ROCm TensorFlow image during PR #1333 review. Issue #1346 was created with comprehensive problem description covering hardcoded x86_64 architecture breaking multi-arch support, detailed impact analysis, three solution options (runtime detection, BuildKit TARGETARCH integration, hybrid approach) with pros/cons analysis, comprehensive acceptance criteria covering core requirements and testing, phased implementation guidance, related files identification, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T12:31:02.033Z
Learning: jiridanek requested GitHub issue creation for MSSQL repo file hardcoding problem during PR #1320 review. Issue #1363 was created and updated with comprehensive problem description covering hardcoded x86_64 MSSQL repo files breaking multi-architecture builds across 10 affected Dockerfiles (including datascience, CUDA, ROCm, and TrustyAI variants), detailed root cause analysis, three solution options with code examples, clear acceptance criteria for all image types, implementation guidance following established multi-architecture patterns, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:38-38
Timestamp: 2025-07-08T19:30:20.513Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in TensorFlow CUDA runtime image during PR #1333 review. Issue was created with comprehensive problem description covering hardcoded NVARCH limitation, multiple solution options using TARGETARCH build argument with architecture mapping, acceptance criteria for multi-architecture builds, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T12:29:56.162Z
Learning: jiridanek requested GitHub issue creation for OpenShift client architecture mapping problem affecting 29 Dockerfiles during PR #1320 review. Issue was created with comprehensive analysis covering all affected files using $(uname -m) returning 'aarch64' but OpenShift mirror expecting 'arm64', systematic solution using BuildKit TARGETARCH mapping with proper amd64→x86_64 and arm64→arm64 conversion, detailed acceptance criteria, and implementation guidance, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py:619-626
Timestamp: 2025-07-08T19:33:14.340Z
Learning: jiridanek requested GitHub issue creation for Python 3.12 version check bug in bootstrapper.py during PR #1333 review. Issue #1348 was created with comprehensive problem description covering version check exclusion affecting all Python 3.12 runtime images, detailed impact analysis of bootstrapper execution failures, clear solution with code examples, affected files list including all 6 runtime bootstrapper copies, acceptance criteria for testing and verification, implementation notes about code duplication and upstream reporting, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. The automated issue creation failed despite appearing successful, so comprehensive issue content was provided for manual creation covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review. Issue was created with comprehensive problem description covering both Python 3.11 and 3.12 versions, repository pattern analysis showing correct vs incorrect naming, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-09T08:07:24.937Z
Learning: jiridanek requested GitHub issue creation for tensorflow_rocm Python 3.12 compatibility problem during PR #1333 review. Issue #1354 was successfully created with comprehensive problem description covering missing cp312 wheels causing build failures, three solution options (upstream TensorFlow, Python 3.11 only, custom build), clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: When checking PyPI wheel availability for aarch64 architecture, the actual wheel filenames use "manylinux2014_aarch64" pattern, not "linux_aarch64". This caused incorrect analysis in PR #1396 where CodeRabbit falsely claimed CUDA companion packages lacked aarch64 wheels when searching with the wrong pattern. The correct approach is to search for "manylinux2014_aarch64" or use a more generic "aarch64" pattern when verifying wheel availability for ARM64 compatibility.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1340 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, affected files analysis including de-vendor scripts and pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
runtimes/minimal/ubi9-python-3.12/requirements.txt (3)

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1340 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, affected files analysis including de-vendor scripts and pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (17)

Learnt from: grdryn
PR: #1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.

Learnt from: atheo89
PR: #1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.

Learnt from: grdryn
PR: #1320
File: jupyter/minimal/ubi9-python-3.11/Dockerfile.cuda:29-38
Timestamp: 2025-07-04T17:07:52.656Z
Learning: In the opendatahub-io/notebooks repository, modern Docker with BuildKit automatically provides build arguments like TARGETARCH in the global scope for FROM instructions, but these arguments must be explicitly declared with ARG statements inside build stages where they will be used. The ARG declaration should be placed within the stage that uses it, not moved to the global scope, as this is the correct pattern for modern Docker/Podman/Buildah environments.

Learnt from: jiridanek
PR: #1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T12:29:56.162Z
Learning: jiridanek requested GitHub issue creation for OpenShift client architecture mapping problem affecting 29 Dockerfiles during PR #1320 review. Issue was created with comprehensive analysis covering all affected files using $(uname -m) returning 'aarch64' but OpenShift mirror expecting 'arm64', systematic solution using BuildKit TARGETARCH mapping with proper amd64→x86_64 and arm64→arm64 conversion, detailed acceptance criteria, and implementation guidance, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1333
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:38-38
Timestamp: 2025-07-08T19:30:20.513Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in TensorFlow CUDA runtime image during PR #1333 review. Issue was created with comprehensive problem description covering hardcoded NVARCH limitation, multiple solution options using TARGETARCH build argument with architecture mapping, acceptance criteria for multi-architecture builds, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. The automated issue creation failed despite appearing successful, so comprehensive issue content was provided for manual creation covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:50-50
Timestamp: 2025-07-08T19:30:01.738Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in ROCm TensorFlow image during PR #1333 review. Issue #1346 was created with comprehensive problem description covering hardcoded x86_64 architecture breaking multi-arch support, detailed impact analysis, three solution options (runtime detection, BuildKit TARGETARCH integration, hybrid approach) with pros/cons analysis, comprehensive acceptance criteria covering core requirements and testing, phased implementation guidance, related files identification, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. Issue #1364 was created with comprehensive problem description covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T12:31:02.033Z
Learning: jiridanek requested GitHub issue creation for MSSQL repo file hardcoding problem during PR #1320 review. Issue #1363 was created and updated with comprehensive problem description covering hardcoded x86_64 MSSQL repo files breaking multi-architecture builds across 10 affected Dockerfiles (including datascience, CUDA, ROCm, and TrustyAI variants), detailed root cause analysis, three solution options with code examples, clear acceptance criteria for all image types, implementation guidance following established multi-architecture patterns, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. Issue was created with comprehensive problem description covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: grdryn
PR: #1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:54:28.202Z
Learning: In opendatahub-io/notebooks, Python 3.12-based images (e.g., runtime-cuda-pytorch-ubi9-python-3.12) may fail container runtime tests with "libcrypt.so.1 => not found" for MySQL SASL2 plugin libraries if libxcrypt-compat is missing. The solution is to install libxcrypt-compat in the Dockerfile.

Learnt from: jiridanek
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda:192-195
Timestamp: 2025-07-18T19:01:39.811Z
Learning: In the opendatahub-io/notebooks repository, mixing CentOS packages with UBI base images is bad practice that removes supportability and creates "Frankenstein" images according to Red Hat guidance. However, using EPEL packages is acceptable, though it may require extra work with AIPCC for internal Red Hat builds. The official reference is at developers.redhat.com/articles/ubi-faq.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.

Learnt from: jiridanek
PR: #1333
File: runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda:17-25
Timestamp: 2025-07-09T08:07:30.628Z
Learning: jiridanek requested GitHub issue creation for oc client installation permission problem in PyTorch CUDA runtime Dockerfile during PR #1333 review. Issue #1356 was created with comprehensive problem description covering USER 1001 permission conflicts with root-owned /opt/app-root/bin directory, detailed impact analysis of build failures and non-executable binaries, current problematic code snippet, complete solution with user switching approach, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

codeserver/ubi9-python-3.12/requirements.txt (10)

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Learnt from: jiridanek
PR: #1333
File: runtimes/minimal/ubi9-python-3.12/Pipfile:23-27
Timestamp: 2025-07-09T08:12:05.822Z
Learning: jiridanek corrected CodeRabbit's false positive about urllib3/requests version incompatibility in PR #1333 review. Investigation revealed that requests 2.32.3 actually requires urllib3<3,>=1.21.1 (not urllib3<2 as incorrectly stated), making urllib3 2.5.0 perfectly compatible. This demonstrates the importance of verifying technical assessments against actual build results and PyPI metadata before creating issues.

Learnt from: jiridanek
PR: #1333
File: runtimes/minimal/ubi9-python-3.12/Pipfile:23-27
Timestamp: 2025-07-09T08:12:05.822Z
Learning: jiridanek corrected CodeRabbit's false positive about urllib3/requests version incompatibility in PR #1333 review. Investigation revealed that requests 2.32.3 actually requires urllib3<3,>=1.21.1 (not urllib3<2 as incorrectly stated), making urllib3 2.5.0 perfectly compatible. This demonstrates the importance of verifying technical assessments against actual build results and PyPI metadata before creating issues.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm=~=2.18.1. The solution requires updating Pipfile sources to include https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Learnt from: jiridanek
PR: #1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.

jupyter/trustyai/ubi9-python-3.12/requirements.txt (15)

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: The jupyter-bokeh package was previously pinned to version 3.0.5 in the TrustyAI notebook image due to compatibility requirements with TrustyAI components, as indicated by the comment "Should be pinned down to this version in order to be compatible with trustyai" that was removed in this update.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: The jupyter-bokeh pinning to 3.0.5 in TrustyAI notebook image was not due to TrustyAI code compatibility issues, but because the trustyai package itself explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency, causing pip dependency resolution conflicts when trying to upgrade to jupyter-bokeh 4.x.

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Learnt from: jiridanek
PR: #1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm=~=2.18.1. The solution requires updating Pipfile sources to include https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Learnt from: jiridanek
PR: #1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Learnt from: jiridanek
PR: #1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.
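
A quick way to see the limited platform coverage that learning mentions, assuming jq is installed and h5py appears in the lock's default section: count the artifact hashes recorded for h5py. Two hashes means only two files (for example an sdist plus a single platform wheel) were considered at lock time.

jq -r '.default.h5py.hashes | length' Pipfile.lock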

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.
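
A minimal sketch of that containerized locking flow, assuming podman and the registry.access.redhat.com/ubi9/python-312 image; the mount options and the inline pipenv install are illustrative, not the team's exact invocation.

podman run --rm --platform=linux/amd64 -v "$PWD:/work:z" -w /work \
  registry.access.redhat.com/ubi9/python-312:latest \
  bash -c 'pip install pipenv && pipenv lock'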

Learnt from: jiridanek
PR: #1154
File: manifests/base/jupyter-pytorch-notebook-imagestream.yaml:0-0
Timestamp: 2025-06-16T11:06:33.139Z
Learning: In the opendatahub-io/notebooks repository, N-1 versions of images in manifest files (like imagestream.yaml files) should not be updated regularly. The versions of packages like codeflare-sdk in N-1 images are frozen to what was released when the image was moved from N to N-1 version. N-1 images are only updated for security vulnerabilities of packages, not for regular version bumps. This is why the version of packages in N-1 images may be quite old compared to the latest N version.

jupyter/datascience/ubi9-python-3.12/requirements.txt (17)

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.

Learnt from: jiridanek
PR: #1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Learnt from: jiridanek
PR: #1333
File: runtimes/minimal/ubi9-python-3.12/Pipfile:23-27
Timestamp: 2025-07-09T08:12:05.822Z
Learning: jiridanek corrected CodeRabbit's false positive about urllib3/requests version incompatibility in PR #1333 review. Investigation revealed that requests 2.32.3 actually requires urllib3<3,>=1.21.1 (not urllib3<2 as incorrectly stated), making urllib3 2.5.0 perfectly compatible. This demonstrates the importance of verifying technical assessments against actual build results and PyPI metadata before creating issues.
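
The actual constraint can be verified directly from PyPI metadata rather than guessed, assuming curl and jq are available:

curl -s https://pypi.org/pypi/requests/2.32.3/json \
  | jq -r '.info.requires_dist[]' \
  | grep urllib3
# prints: urllib3<3,>=1.21.1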

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock, which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm = "~=2.18.1". The solution requires updating the Pipfile sources to include the https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository, which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl, and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Learnt from: jiridanek
PR: #1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1333
File: runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py:619-626
Timestamp: 2025-07-08T19:33:14.340Z
Learning: jiridanek requested GitHub issue creation for Python 3.12 version check bug in bootstrapper.py during PR #1333 review. Issue #1348 was created with comprehensive problem description covering version check exclusion affecting all Python 3.12 runtime images, detailed impact analysis of bootstrapper execution failures, clear solution with code examples, affected files list including all 6 runtime bootstrapper copies, acceptance criteria for testing and verification, implementation notes about code duplication and upstream reporting, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
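
A hypothetical illustration of that class of bug, not the actual bootstrapper code: a stale upper bound on the supported version range silently rejects Python 3.12.

import sys

def is_supported(v):
    """Buggy variant: the upper bound excludes Python 3.12."""
    return (3, 8) <= v < (3, 12)

def is_supported_fixed(v):
    """Fixed variant: drop the stale upper bound."""
    return v >= (3, 8)

print(is_supported((3, 12)), is_supported_fixed((3, 12)))  # False True
print(is_supported(sys.version_info[:2]), is_supported_fixed(sys.version_info[:2]))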

Learnt from: jiridanek
PR: #1379
File: .tekton/odh-workbench-jupyter-datascience-cpu-py312-ubi9-push.yaml:14-17
Timestamp: 2025-07-11T11:15:47.424Z
Learning: jiridanek requested GitHub issue creation for CEL filter problem in datascience workbench Tekton pipelines during PR #1379 review. Issue #1383 was successfully created with comprehensive problem description covering both Python 3.11 and 3.12 pipelines incorrectly watching jupyter/minimal directories instead of jupyter/datascience directories, detailed impact analysis of pipeline execution failures, complete solution with before/after code examples, thorough acceptance criteria for path updates and pipeline triggering verification, implementation notes about repository structure alignment, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
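
A hedged sketch of the kind of CEL trigger fix that issue describes, following Pipelines-as-Code annotation conventions; the exact expression and paths in the repository's .tekton files may differ.

metadata:
  annotations:
    pipelinesascode.tekton.dev/on-cel-expression: |
      event == "push" && "jupyter/datascience/***".pathChanged()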

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.

Learnt from: jiridanek
PR: #1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Learnt from: jiridanek
PR: #1154
File: manifests/base/jupyter-pytorch-notebook-imagestream.yaml:0-0
Timestamp: 2025-06-16T11:06:33.139Z
Learning: In the opendatahub-io/notebooks repository, N-1 versions of images in manifest files (like imagestream.yaml files) should not be updated regularly. The versions of packages like codeflare-sdk in N-1 images are frozen to what was released when the image was moved from N to N-1 version. N-1 images are only updated for security vulnerabilities of packages, not for regular version bumps. This is why the version of packages in N-1 images may be quite old compared to the latest N version.

Learnt from: jiridanek
PR: #1218
File: jupyter/datascience/ubi9-python-3.11/Pipfile:34-36
Timestamp: 2025-06-28T14:13:27.890Z
Learning: In the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pattern: core jupyterlab package uses exact pinning (==) across all notebook images to ensure UI consistency, while JupyterLab extensions and all server components (jupyter-server, jupyter-server-proxy, jupyter-server-terminals) use compatible release (~=) pinning to allow automatic security updates and bug fixes while maintaining API compatibility.
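
An illustrative Pipfile excerpt of that pinning pattern; the version numbers are examples, not the repository's actual pins.

[packages]
jupyterlab = "==4.2.5"                  # exact pin: identical UI across images
jupyter-server = "~=2.14.2"             # compatible release: fixes flow in automatically
jupyter-server-proxy = "~=4.4.0"
jupyter-server-terminals = "~=0.5.3"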

jupyter/minimal/ubi9-python-3.12/requirements.txt (12)

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.

Learnt from: jiridanek
PR: #1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock, which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm = "~=2.18.1". The solution requires updating the Pipfile sources to include the https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository, which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl, and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: The jupyter-bokeh package was previously pinned to version 3.0.5 in the TrustyAI notebook image due to compatibility requirements with TrustyAI components, as indicated by the comment "Should be pinned down to this version in order to be compatible with trustyai" that was removed in this update.

Learnt from: jiridanek
PR: #1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.

Learnt from: jiridanek
PR: #1333
File: runtimes/minimal/ubi9-python-3.12/Pipfile:23-27
Timestamp: 2025-07-09T08:12:05.822Z
Learning: jiridanek corrected CodeRabbit's false positive about urllib3/requests version incompatibility in PR #1333 review. Investigation revealed that requests 2.32.3 actually requires urllib3<3,>=1.21.1 (not urllib3<2 as incorrectly stated), making urllib3 2.5.0 perfectly compatible. This demonstrates the importance of verifying technical assessments against actual build results and PyPI metadata before creating issues.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: #1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.

Learnt from: jiridanek
PR: #1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.

runtimes/pytorch/ubi9-python-3.12/requirements.txt (16)

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock, which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm = "~=2.18.1". The solution requires updating the Pipfile sources to include the https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository, which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl, and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-08T19:35:49.482Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. After an initial failed attempt, issue #1349 was successfully created with comprehensive problem description covering maintenance overhead and consistency risks from duplicate implementations across 5 Python 3.12 runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:54:28.202Z
Learning: In opendatahub-io/notebooks, Python 3.12-based images (e.g., runtime-cuda-pytorch-ubi9-python-3.12) may fail container runtime tests with "libcrypt.so.1 => not found" for MySQL SASL2 plugin libraries if libxcrypt-compat is missing. The solution is to install libxcrypt-compat in the Dockerfile.
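
A minimal sketch of that fix for a UBI9-based Dockerfile; libxcrypt-compat supplies the legacy libcrypt.so.1 that the MySQL SASL2 plugin libraries link against.

RUN dnf install -y libxcrypt-compat && \
    dnf clean all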

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
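
A minimal sketch of the runtime-detection idea behind that issue: derive the site-packages path from the interpreter via sysconfig instead of hard-coding it per distribution.

SITE_PACKAGES="$(python3 -c 'import sysconfig; print(sysconfig.get_paths()["purelib"])')"
echo "de-vendoring under: ${SITE_PACKAGES}"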

Learnt from: jiridanek
PR: #1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.

Learnt from: jiridanek
PR: #1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Learnt from: jiridanek
PR: #1154
File: manifests/base/jupyter-pytorch-notebook-imagestream.yaml:0-0
Timestamp: 2025-06-16T11:06:33.139Z
Learning: In the opendatahub-io/notebooks repository, N-1 versions of images in manifest files (like imagestream.yaml files) should not be updated regularly. The versions of packages like codeflare-sdk in N-1 images are frozen to what was released when the image was moved from N to N-1 version. N-1 images are only updated for security vulnerabilities of packages, not for regular version bumps. This is why the version of packages in N-1 images may be quite old compared to the latest N version.

jupyter/rocm/pytorch/ubi9-python-3.12/requirements.txt (15)

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock, which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm = "~=2.18.1". The solution requires updating the Pipfile sources to include the https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository, which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl, and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-09T08:07:24.937Z
Learning: jiridanek requested GitHub issue creation for tensorflow_rocm Python 3.12 compatibility problem during PR #1333 review. Issue #1354 was successfully created with comprehensive problem description covering missing cp312 wheels causing build failures, three solution options (upstream TensorFlow, Python 3.11 only, custom build), clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.

Learnt from: jiridanek
PR: #1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-08T19:35:49.482Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. After an initial failed attempt, issue #1349 was successfully created with comprehensive problem description covering maintenance overhead and consistency risks from duplicate implementations across 5 Python 3.12 runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.

Learnt from: jiridanek
PR: #1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Learnt from: jiridanek
PR: #1154
File: manifests/base/jupyter-pytorch-notebook-imagestream.yaml:0-0
Timestamp: 2025-06-16T11:06:33.139Z
Learning: In the opendatahub-io/notebooks repository, N-1 versions of images in manifest files (like imagestream.yaml files) should not be updated regularly. The versions of packages like codeflare-sdk in N-1 images are frozen to what was released when the image was moved from N to N-1 version. N-1 images are only updated for security vulnerabilities of packages, not for regular version bumps. This is why the version of packages in N-1 images may be quite old compared to the latest N version.

runtimes/datascience/ubi9-python-3.11/Pipfile (14)

Learnt from: jiridanek
PR: #1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock, which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm = "~=2.18.1". The solution requires updating the Pipfile sources to include the https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository, which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl, and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1340 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, affected files analysis including de-vendor scripts and pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/utils/pip.conf:3-4
Timestamp: 2025-07-08T19:21:11.512Z
Learning: jiridanek requested GitHub issue creation for PYTHONPATH configuration investigation in runtime images during PR #1333 review. Issue #1339 was created with comprehensive problem description covering pip.conf target directory configuration, missing PYTHONPATH exports, potential runtime import failures, multiple investigation areas (PYTHONPATH auditing, pip configuration consistency, runtime testing), solution options with implementation guidance, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: When checking PyPI wheel availability for aarch64 architecture, the actual wheel filenames use "manylinux2014_aarch64" pattern, not "linux_aarch64". This caused incorrect analysis in PR #1396 where CodeRabbit falsely claimed CUDA companion packages lacked aarch64 wheels when searching with the wrong pattern. The correct approach is to search for "manylinux2014_aarch64" or use a more generic "aarch64" pattern when verifying wheel availability for ARM64 compatibility.
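
A hedged example of searching with the correct pattern, assuming curl and jq are available; aarch64 wheels carry "manylinux2014_aarch64" (or "manylinux_2_NN_aarch64") in their filenames, never "linux_aarch64".

curl -s https://pypi.org/pypi/nvidia-cublas-cu12/12.5.3.2/json \
  | jq -r '.urls[].filename' \
  | grep aarch64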

Learnt from: jiridanek
PR: #1154
File: manifests/base/jupyter-pytorch-notebook-imagestream.yaml:0-0
Timestamp: 2025-06-16T11:06:33.139Z
Learning: In the opendatahub-io/notebooks repository, N-1 versions of images in manifest files (like imagestream.yaml files) should not be updated regularly. The versions of packages like codeflare-sdk in N-1 images are frozen to what was released when the image was moved from N to N-1 version. N-1 images are only updated for security vulnerabilities of packages, not for regular version bumps. This is why the version of packages in N-1 images may be quite old compared to the latest N version.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with extras = ["and-cuda"] can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.
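
An illustrative before/after for that conflict; the version is an example, not the repository's actual pin.

# before: lock-time resolution pulls in platform-specific CUDA wheels
tensorflow = {version = "~=2.19.0", extras = ["and-cuda"]}
# after: CUDA is installed system-wide in the image and found at runtime
tensorflow = {version = "~=2.19.0"}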

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Learnt from: jiridanek
PR: #1218
File: jupyter/datascience/ubi9-python-3.11/Pipfile:34-36
Timestamp: 2025-06-28T14:13:27.890Z
Learning: In the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pattern: core jupyterlab package uses exact pinning (==) across all notebook images to ensure UI consistency, while JupyterLab extensions and all server components (jupyter-server, jupyter-server-proxy, jupyter-server-terminals) use compatible release (~=) pinning to allow automatic security updates and bug fixes while maintaining API compatibility.

runtimes/tensorflow/ubi9-python-3.12/requirements.txt (17)

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock, which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm = "~=2.18.1". The solution requires updating the Pipfile sources to include the https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository, which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl, and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-09T08:07:24.937Z
Learning: jiridanek requested GitHub issue creation for tensorflow_rocm Python 3.12 compatibility problem during PR #1333 review. Issue #1354 was successfully created with comprehensive problem description covering missing cp312 wheels causing build failures, three solution options (upstream TensorFlow, Python 3.11 only, custom build), clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-14T15:36:29.815Z
Learning: ROCm is not supported by upstream TensorFlow, unlike CUDA. ROCm support requires the specialized tensorflow_rocm package, which creates a dependency challenge for Python 3.12 environments since tensorflow_rocm lacks Python 3.12 wheels while regular TensorFlow lacks ROCm support.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:54:28.202Z
Learning: In opendatahub-io/notebooks, Python 3.12-based images (e.g., runtime-cuda-pytorch-ubi9-python-3.12) may fail container runtime tests with "libcrypt.so.1 => not found" for MySQL SASL2 plugin libraries if libxcrypt-compat is missing. The solution is to install libxcrypt-compat in the Dockerfile.

Learnt from: jiridanek
PR: #1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1333
File: runtimes/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-08T19:29:32.006Z
Learning: jiridanek requested GitHub issue creation for investigating TensorFlow "and-cuda" extras usage patterns during PR #1333 review. Issue #1340 was created with comprehensive investigation framework covering platform-specific analysis, deployment scenarios, TensorFlow version compatibility, clear acceptance criteria, testing approach, and implementation timeline, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.

Learnt from: jiridanek
PR: #1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.

Learnt from: jiridanek
PR: #1154
File: manifests/base/jupyter-pytorch-notebook-imagestream.yaml:0-0
Timestamp: 2025-06-16T11:06:33.139Z
Learning: In the opendatahub-io/notebooks repository, N-1 versions of images in manifest files (like imagestream.yaml files) should not be updated regularly. The versions of packages like codeflare-sdk in N-1 images are frozen to what was released when the image was moved from N to N-1 version. N-1 images are only updated for security vulnerabilities of packages, not for regular version bumps. This is why the version of packages in N-1 images may be quite old compared to the latest N version.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Learnt from: jiridanek
PR: #1218
File: jupyter/datascience/ubi9-python-3.11/Pipfile:34-36
Timestamp: 2025-06-28T14:13:27.890Z
Learning: In the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pattern: core jupyterlab package uses exact pinning (==) across all notebook images to ensure UI consistency, while JupyterLab extensions and all server components (jupyter-server, jupyter-server-proxy, jupyter-server-terminals) use compatible release (~=) pinning to allow automatic security updates and bug fixes while maintaining API compatibility.

jupyter/tensorflow/ubi9-python-3.12/requirements.txt (16)

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock, which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm = "~=2.18.1". The solution requires updating the Pipfile sources to include the https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository, which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl, and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Learnt from: jiridanek
PR: #1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-09T08:07:24.937Z
Learning: jiridanek requested GitHub issue creation for tensorflow_rocm Python 3.12 compatibility problem during PR #1333 review. Issue #1354 was successfully created with comprehensive problem description covering missing cp312 wheels causing build failures, three solution options (upstream TensorFlow, Python 3.11 only, custom build), clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.

Learnt from: jiridanek
PR: #1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: The jupyter-bokeh package was previously pinned to version 3.0.5 in the TrustyAI notebook image due to compatibility requirements with TrustyAI components, as indicated by the comment "Should be pinned down to this version in order to be compatible with trustyai" that was removed in this update.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.

Learnt from: jiridanek
PR: #1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.

Learnt from: jiridanek
PR: #1154
File: manifests/base/jupyter-pytorch-notebook-imagestream.yaml:0-0
Timestamp: 2025-06-16T11:06:33.139Z
Learning: In the opendatahub-io/notebooks repository, N-1 versions of images in manifest files (like imagestream.yaml files) should not be updated regularly. The versions of packages like codeflare-sdk in N-1 images are frozen to what was released when the image was moved from N to N-1 version. N-1 images are only updated for security vulnerabilities of packages, not for regular version bumps. This is why the version of packages in N-1 images may be quite old compared to the latest N version.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Learnt from: jiridanek
PR: #1218
File: jupyter/datascience/ubi9-python-3.11/Pipfile:34-36
Timestamp: 2025-06-28T14:13:27.890Z
Learning: In the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pattern: core jupyterlab package uses exact pinning (==) across all notebook images to ensure UI consistency, while JupyterLab extensions and all server components (jupyter-server, jupyter-server-proxy, jupyter-server-terminals) use compatible release (~=) pinning to allow automatic security updates and bug fixes while maintaining API compatibility.

runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu (19)

Learnt from: atheo89
PR: #1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.

Learnt from: grdryn
PR: #1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.

Learnt from: jiridanek
PR: #1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T12:29:56.162Z
Learning: jiridanek requested GitHub issue creation for OpenShift client architecture mapping problem affecting 29 Dockerfiles during PR #1320 review. Issue was created with comprehensive analysis covering all affected files using $(uname -m) returning 'aarch64' but OpenShift mirror expecting 'arm64', systematic solution using BuildKit TARGETARCH mapping with proper amd64→x86_64 and arm64→arm64 conversion, detailed acceptance criteria, and implementation guidance, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
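
A sketch of the mapping that fix implies; the variable names are illustrative:

# BuildKit's TARGETARCH uses Go-style names (amd64, arm64); the OpenShift
# mirror uses x86_64 for Intel, so translate before composing the URL.
case "${TARGETARCH}" in
  amd64) OC_ARCH="x86_64" ;;
  arm64) OC_ARCH="arm64" ;;
  *)     OC_ARCH="$(uname -m)" ;;
esac
echo "fetching oc for ${OC_ARCH}"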

Learnt from: grdryn
PR: #1320
File: jupyter/minimal/ubi9-python-3.11/Dockerfile.cuda:29-38
Timestamp: 2025-07-04T17:07:52.656Z
Learning: In the opendatahub-io/notebooks repository, modern Docker with BuildKit automatically provides build arguments like TARGETARCH in the global scope for FROM instructions, but these arguments must be explicitly declared with ARG statements inside build stages where they will be used. The ARG declaration should be placed within the stage that uses it, not moved to the global scope, as this is the correct pattern for modern Docker/Podman/Buildah environments.
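
A sketch of that declaration pattern (Dockerfile syntax written to a scratch file; the image and stage names are placeholders):

cat > /tmp/Containerfile.sketch <<'EOF'
# Global ARGs such as TARGETARCH are implicitly available to FROM lines,
# but must be re-declared inside any stage that references them.
FROM registry.access.redhat.com/ubi9/python-312 AS builder
ARG TARGETARCH
RUN echo "building for ${TARGETARCH}"
EOF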

Learnt from: jiridanek
PR: #1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. The automated issue creation failed despite appearing successful, so comprehensive issue content was provided for manual creation covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1333
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:38-38
Timestamp: 2025-07-08T19:30:20.513Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in TensorFlow CUDA runtime image during PR #1333 review. Issue was created with comprehensive problem description covering hardcoded NVARCH limitation, multiple solution options using TARGETARCH build argument with architecture mapping, acceptance criteria for multi-architecture builds, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T12:31:02.033Z
Learning: jiridanek requested GitHub issue creation for MSSQL repo file hardcoding problem during PR #1320 review. Issue #1363 was created and updated with comprehensive problem description covering hardcoded x86_64 MSSQL repo files breaking multi-architecture builds across 10 affected Dockerfiles (including datascience, CUDA, ROCm, and TrustyAI variants), detailed root cause analysis, three solution options with code examples, clear acceptance criteria for all image types, implementation guidance following established multi-architecture patterns, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:50-50
Timestamp: 2025-07-08T19:30:01.738Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in ROCm TensorFlow image during PR #1333 review. Issue #1346 was created with comprehensive problem description covering hardcoded x86_64 architecture breaking multi-arch support, detailed impact analysis, three solution options (runtime detection, BuildKit TARGETARCH integration, hybrid approach) with pros/cons analysis, comprehensive acceptance criteria covering core requirements and testing, phased implementation guidance, related files identification, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. Issue was created with comprehensive problem description covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: grdryn
PR: #1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda:192-195
Timestamp: 2025-07-18T19:01:39.811Z
Learning: In the opendatahub-io/notebooks repository, mixing CentOS packages with UBI base images is bad practice that removes supportability and creates "Frankenstein" images according to Red Hat guidance. However, using EPEL packages is acceptable, though it may require extra work with AIPCC for internal Red Hat builds. The official reference is at developers.redhat.com/articles/ubi-faq.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-09T11:22:11.653Z
Learning: When analyzing OpenShift CI/Prow job failures in opendatahub-io/notebooks, jobs that appear to "complete successfully" may actually have failed at the image build stage and never executed any e2e tests. Proper analysis requires checking if image builds succeeded before concluding that tests passed or the job was successful.

Learnt from: jiridanek
PR: #1333
File: runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda:17-25
Timestamp: 2025-07-09T08:07:30.628Z
Learning: jiridanek requested GitHub issue creation for oc client installation permission problem in PyTorch CUDA runtime Dockerfile during PR #1333 review. Issue #1356 was created with comprehensive problem description covering USER 1001 permission conflicts with root-owned /opt/app-root/bin directory, detailed impact analysis of build failures and non-executable binaries, current problematic code snippet, complete solution with user switching approach, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:54:28.202Z
Learning: In opendatahub-io/notebooks, Python 3.12-based images (e.g., runtime-cuda-pytorch-ubi9-python-3.12) may fail container runtime tests with "libcrypt.so.1 => not found" for MySQL SASL2 plugin libraries if libxcrypt-compat is missing. The solution is to install libxcrypt-compat in the Dockerfile.
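
A sketch of that fix as it would appear in an image build (run as root during the build; the verification step is optional):

dnf install -y libxcrypt-compat && dnf clean all
ldconfig -p | grep libcrypt.so.1   # confirm the legacy soname now resolves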

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.
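
A sketch of the download-plus-verification approach; the mirror layout shown here is an assumption used only to illustrate the shape of the check:

curl -fsSLO https://mirror.openshift.com/pub/openshift-v4/x86_64/clients/ocp/stable/openshift-client-linux.tar.gz
curl -fsSL https://mirror.openshift.com/pub/openshift-v4/x86_64/clients/ocp/stable/sha256sum.txt \
  | grep ' openshift-client-linux\.tar\.gz$' | sha256sum -c -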

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).

runtimes/datascience/ubi9-python-3.12/Pipfile (15)

Learnt from: jiridanek
PR: #1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
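
A sketch of the runtime detection that issue proposes, with chmod as one hypothetical consumer of the detected path:

SITE_PACKAGES="$(python3 -c 'import sysconfig; print(sysconfig.get_paths()["purelib"])')"
chmod -R g+w "${SITE_PACKAGES}"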

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm ~= 2.18.1. The solution requires updating Pipfile sources to include the https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1340 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, affected files analysis including de-vendor scripts and pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: When checking PyPI wheel availability for aarch64 architecture, the actual wheel filenames use "manylinux2014_aarch64" pattern, not "linux_aarch64". This caused incorrect analysis in PR #1396 where CodeRabbit falsely claimed CUDA companion packages lacked aarch64 wheels when searching with the wrong pattern. The correct approach is to search for "manylinux2014_aarch64" or use a more generic "aarch64" pattern when verifying wheel availability for ARM64 compatibility.
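
A sketch of the corrected check against the PyPI JSON API, using one of the pinned versions from the learning above:

curl -s https://pypi.org/pypi/nvidia-cublas-cu12/12.5.3.2/json \
  | python3 -c 'import json,sys; print("\n".join(u["filename"] for u in json.load(sys.stdin)["urls"]))' \
  | grep manylinux2014_aarch64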

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/utils/pip.conf:3-4
Timestamp: 2025-07-08T19:21:11.512Z
Learning: jiridanek requested GitHub issue creation for PYTHONPATH configuration investigation in runtime images during PR #1333 review. Issue #1339 was created with comprehensive problem description covering pip.conf target directory configuration, missing PYTHONPATH exports, potential runtime import failures, multiple investigation areas (PYTHONPATH auditing, pip configuration consistency, runtime testing), solution options with implementation guidance, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1154
File: manifests/base/jupyter-pytorch-notebook-imagestream.yaml:0-0
Timestamp: 2025-06-16T11:06:33.139Z
Learning: In the opendatahub-io/notebooks repository, N-1 versions of images in manifest files (like imagestream.yaml files) should not be updated regularly. The versions of packages like codeflare-sdk in N-1 images are frozen to what was released when the image was moved from N to N-1 version. N-1 images are only updated for security vulnerabilities of packages, not for regular version bumps. This is why the version of packages in N-1 images may be quite old compared to the latest N version.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with extras = ["and-cuda"] can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.

Learnt from: jiridanek
PR: #1218
File: jupyter/datascience/ubi9-python-3.11/Pipfile:34-36
Timestamp: 2025-06-28T14:13:27.890Z
Learning: In the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pattern: core jupyterlab package uses exact pinning (==) across all notebook images to ensure UI consistency, while JupyterLab extensions and all server components (jupyter-server, jupyter-server-proxy, jupyter-server-terminals) use compatible release (~=) pinning to allow automatic security updates and bug fixes while maintaining API compatibility.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.

runtimes/datascience/ubi9-python-3.12/requirements.txt (30)

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm ~= 2.18.1. The solution requires updating Pipfile sources to include the https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1340 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, affected files analysis including de-vendor scripts and pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:54:28.202Z
Learning: In opendatahub-io/notebooks, Python 3.12-based images (e.g., runtime-cuda-pytorch-ubi9-python-3.12) may fail container runtime tests with "libcrypt.so.1 => not found" for MySQL SASL2 plugin libraries if libxcrypt-compat is missing. The solution is to install libxcrypt-compat in the Dockerfile.

Learnt from: jiridanek
PR: #1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Learnt from: jiridanek
PR: #1154
File: manifests/base/jupyter-pytorch-notebook-imagestream.yaml:0-0
Timestamp: 2025-06-16T11:06:33.139Z
Learning: In the opendatahub-io/notebooks repository, N-1 versions of images in manifest files (like imagestream.yaml files) should not be updated regularly. The versions of packages like codeflare-sdk in N-1 images are frozen to what was released when the image was moved from N to N-1 version. N-1 images are only updated for security vulnerabilities of packages, not for regular version bumps. This is why the version of packages in N-1 images may be quite old compared to the latest N version.

Learnt from: jiridanek
PR: #1218
File: jupyter/datascience/ubi9-python-3.11/Pipfile:34-36
Timestamp: 2025-06-28T14:13:27.890Z
Learning: In the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pattern: core jupyterlab package uses exact pinning (==) across all notebook images to ensure UI consistency, while JupyterLab extensions and all server components (jupyter-server, jupyter-server-proxy, jupyter-server-terminals) use compatible release (~=) pinning to allow automatic security updates and bug fixes while maintaining API compatibility.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: The jupyter-bokeh package was previously pinned to version 3.0.5 in the TrustyAI notebook image due to compatibility requirements with TrustyAI components, as indicated by the comment "Should be pinned down to this version in order to be compatible with trustyai" that was removed in this update.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: The jupyter-bokeh pinning to 3.0.5 in TrustyAI notebook image was not due to TrustyAI code compatibility issues, but because the trustyai package itself explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency, causing pip dependency resolution conflicts when trying to upgrade to jupyter-bokeh 4.x.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with extras = ["and-cuda"] can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.

Learnt from: grdryn
PR: #1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-14T15:36:29.815Z
Learning: ROCm is not supported by upstream TensorFlow, unlike CUDA. ROCm support requires the specialized tensorflow_rocm package, which creates a dependency challenge for Python 3.12 environments since tensorflow_rocm lacks Python 3.12 wheels while regular TensorFlow lacks ROCm support.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.

Learnt from: jiridanek
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda:192-195
Timestamp: 2025-07-18T19:01:39.811Z
Learning: In the opendatahub-io/notebooks repository, mixing CentOS packages with UBI base images is bad practice that removes supportability and creates "Frankenstein" images according to Red Hat guidance. However, using EPEL packages is acceptable, though it may require extra work with AIPCC for internal Red Hat builds. The official reference is at developers.redhat.com/articles/ubi-faq.

Learnt from: jiridanek
PR: #1197
File: runtimes/minimal/ubi9-python-3.11/requirements.txt:395-405
Timestamp: 2025-06-26T15:28:35.416Z
Learning: psutil version 7.x is compatible with UBI9, CentOS Stream 9, and RHEL 9 platforms in the opendatahub-io/notebooks repository. The upgrade from psutil 5.x to 7.x has been validated for these environments.

Learnt from: jiridanek
PR: #1463
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:15-15
Timestamp: 2025-07-22T14:31:20.315Z
Learning: PyTorch 2.6.0 wheels with CUDA 12.4 support (torch-2.6.0+cu124) are available on the official PyTorch CUDA index at https://download.pytorch.org/whl/cu124/ for Python versions 3.9, 3.10, 3.11, 3.12, and 3.13 on both Linux x86_64 and Windows AMD64 platforms.
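
A sketch of installing from that index; per PEP 440, ==2.6.0 also matches the local build tag 2.6.0+cu124 served there:

pip install torch==2.6.0 --index-url https://download.pytorch.org/whl/cu124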

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: When checking PyPI wheel availability for aarch64 architecture, the actual wheel filenames use "manylinux2014_aarch64" pattern, not "linux_aarch64". This caused incorrect analysis in PR #1396 where CodeRabbit falsely claimed CUDA companion packages lacked aarch64 wheels when searching with the wrong pattern. The correct approach is to search for "manylinux2014_aarch64" or use a more generic "aarch64" pattern when verifying wheel availability for ARM64 compatibility.

Learnt from: jiridanek
PR: #1325
File: jupyter/datascience/ubi9-python-3.11/Pipfile:32-32
Timestamp: 2025-07-07T14:54:57.907Z
Learning: odh-elyra uses universal wheels (py3-none-any) rather than platform-specific wheels, making it compatible with all supported Python versions through a single wheel file.
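
A sketch demonstrating the universal-wheel property: requesting an arbitrary platform still yields the single py3-none-any wheel (the platform tag below is chosen only for illustration):

pip download odh-elyra --no-deps --only-binary=:all: \
    --platform manylinux2014_s390x --python-version 3.12 -d /tmp/elyra-wheel
ls /tmp/elyra-wheel   # expect exactly one *-py3-none-any.whl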

jupyter/pytorch/ubi9-python-3.12/requirements.txt (17)

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Learnt from: jiridanek
PR: #1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm ~= 2.18.1. The solution requires updating Pipfile sources to include the https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: The jupyter-bokeh package was previously pinned to version 3.0.5 in the TrustyAI notebook image due to compatibility requirements with TrustyAI components, as indicated by the comment "Should be pinned down to this version in order to be compatible with trustyai" that was removed in this update.

Learnt from: jiridanek
PR: #1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review. Issue was created with comprehensive problem description covering both Python 3.11 and 3.12 versions, repository pattern analysis showing correct vs incorrect naming, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1333
File: runtimes/minimal/ubi9-python-3.12/Pipfile:23-27
Timestamp: 2025-07-09T08:12:05.822Z
Learning: jiridanek corrected CodeRabbit's false positive about urllib3/requests version incompatibility in PR #1333 review. Investigation revealed that requests 2.32.3 actually requires urllib3<3,>=1.21.1 (not urllib3<2 as incorrectly stated), making urllib3 2.5.0 perfectly compatible. This demonstrates the importance of verifying technical assessments against actual build results and PyPI metadata before creating issues.
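
A sketch of the verification step that would have avoided the false positive, reading the requirement spec straight from PyPI metadata:

curl -s https://pypi.org/pypi/requests/2.32.3/json \
  | python3 -c 'import json,sys; print([r for r in json.load(sys.stdin)["info"]["requires_dist"] if "urllib3" in r])'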

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.

Learnt from: jiridanek
PR: #1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Learnt from: jiridanek
PR: #1154
File: manifests/base/jupyter-pytorch-notebook-imagestream.yaml:0-0
Timestamp: 2025-06-16T11:06:33.139Z
Learning: In the opendatahub-io/notebooks repository, N-1 versions of images in manifest files (like imagestream.yaml files) should not be updated regularly. The versions of packages like codeflare-sdk in N-1 images are frozen to what was released when the image was moved from N to N-1 version. N-1 images are only updated for security vulnerabilities of packages, not for regular version bumps. This is why the version of packages in N-1 images may be quite old compared to the latest N version.

Learnt from: jiridanek
PR: #1218
File: jupyter/datascience/ubi9-python-3.11/Pipfile:34-36
Timestamp: 2025-06-28T14:13:27.890Z
Learning: In the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pattern: core jupyterlab package uses exact pinning (==) across all notebook images to ensure UI consistency, while JupyterLab extensions and all server components (jupyter-server, jupyter-server-proxy, jupyter-server-terminals) use compatible release (~=) pinning to allow automatic security updates and bug fixes while maintaining API compatibility.

runtimes/rocm-pytorch/ubi9-python-3.12/requirements.txt (15)

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm ~= 2.18.1. The solution requires updating Pipfile sources to include the https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-09T08:07:24.937Z
Learning: jiridanek requested GitHub issue creation for tensorflow_rocm Python 3.12 compatibility problem during PR #1333 review. Issue #1354 was successfully created with comprehensive problem description covering missing cp312 wheels causing build failures, three solution options (upstream TensorFlow, Python 3.11 only, custom build), clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-08T19:35:49.482Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. After an initial failed attempt, issue #1349 was successfully created with comprehensive problem description covering maintenance overhead and consistency risks from duplicate implementations across 5 Python 3.12 runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-14T15:36:29.815Z
Learning: ROCm is not supported by upstream TensorFlow, unlike CUDA. ROCm support requires the specialized tensorflow_rocm package, which creates a dependency challenge for Python 3.12 environments since tensorflow_rocm lacks Python 3.12 wheels while regular TensorFlow lacks ROCm support.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.
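A minimal sketch of that workflow (image tag, mount options and exact flags are assumptions, not the team's precise invocation):

# Resolve dependencies inside a UBI9 Python 3.12 container so host packages cannot leak in.
podman run --rm --platform=linux/amd64 \
  -v "$PWD":/work:Z -w /work \
  registry.access.redhat.com/ubi9/python-312:latest \
  bash -c "pip install pipenv && pipenv lock"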

🪛 Checkov (3.2.334)
runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu

[MEDIUM] 14-24: Ensure that packages with untrusted or missing GPG signatures are not used by dnf, tdnf, or yum via the '--nogpgcheck' option

(CKV2_DOCKER_9)

runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu

[MEDIUM] 14-23: Ensure that packages with untrusted or missing GPG signatures are not used by dnf, tdnf, or yum via the '--nogpgcheck' option

(CKV2_DOCKER_9)

🪛 GitHub Actions: Code static analysis
runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu

[warning] 27-27: Hadolint SC2016 info: Expressions don't expand in single quotes, use double quotes for that.


[warning] 69-69: Hadolint DL3002 warning: Last USER should not be root.


[warning] 139-139: Hadolint DL3002 warning: Last USER should not be root.

runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu

[warning] 26-26: Hadolint SC2016 info: Expressions don't expand in single quotes, use double quotes for that.


[warning] 64-64: Hadolint DL3002 warning: Last USER should not be root.


[warning] 128-128: Hadolint DL3002 warning: Last USER should not be root.

🪛 GitHub Check: Trivy
runtimes/datascience/ubi9-python-3.12/requirements.txt

[notice] 438-438: openssl: RFC7250 handshakes with unauthenticated servers don't abort as expected
Package: cryptography
Installed Version: 43.0.3
Vulnerability CVE-2024-12797
Severity: LOW
Fixed Version: 44.0.1
Link: CVE-2024-12797

⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (20)
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-runtime-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/arm64, false) / build
🔇 Additional comments (33)
runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu (1)

90-101: Hard-coding -march=z14 may break forward compatibility.

Compiling C/C++ with -march=z14 means binaries will refuse to run on pre-z14 hardware and may miss newer ISA extensions on z15/z16.
If you only need z14-level tuning, drop the explicit -march flag and use -mtune=z14 instead.
Alternatively, detect the target CPU model at build time and compile for the least common denominator.
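The difference is easy to demonstrate on an s390x builder with plain GCC (the probe program is illustrative):

echo 'int main(void) { return 0; }' > probe.c
# -march raises the hardware floor: the binary may use z14-only instructions.
gcc -O3 -march=z14 -o probe_march probe.c
# -mtune only changes instruction scheduling: the binary still runs on older IBM Z.
gcc -O3 -mtune=z14 -o probe_mtune probe.c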

runtimes/minimal/ubi9-python-3.12/requirements.txt (1)

307-309: comm bump to 0.2.3 looks correct.

The new version remains within the compatible range required by ipykernel (>=0.1.3) and ipython (>=0.1.0). Hashes match the files on PyPI. No further action needed.

codeserver/ubi9-python-3.12/requirements.txt (2)

429-431: narwhals 1.48.1 universal wheel available – no action required

A pure-Python universal wheel (narwhals-1.48.1-py3-none-any.whl) and a source archive are published on PyPI, so all build platforms (x86_64, s390x, ppc64le, etc.) will install the wheel without triggering a source build. The micro-bump is safe to merge as-is.
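The claim is cheap to re-verify against the PyPI JSON API (requires curl and jq):

curl -s https://pypi.org/pypi/narwhals/1.48.1/json | jq -r '.urls[].filename'
# expect narwhals-1.48.1-py3-none-any.whl plus the sdist tarball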


116-118: comm==0.2.3 pin resolves cleanly with no conflicts

The dry-run pip install succeeded:

Would install comm-0.2.3 debugpy-1.8.15 ipykernel-6.29.5 jupyter_client-8.6.3 jupyter_core-5.8.1 nest_asyncio-1.6.0 psutil-7.0.0 python-dateutil-2.9.0.post0 pyzmq-27.0.0 six-1.17.0 tornado-6.5.1

No version clashes were reported. You can safely proceed with the bump.

jupyter/minimal/ubi9-python-3.12/requirements.txt (2)

469-471: Patch-level upgrade of GitPython (3.1.44 ➜ 3.1.45) safely fits the existing smmap==5.0.2 constraint.

GitPython 3.1.45 still advertises the requirement smmap>=3.0.1,<6, so the current pin is valid. No further action required.


316-318: Replace <new-image> placeholder and re-run the runtime smoke test

It looks like the initial check didn’t run because the <new-image> placeholder wasn’t replaced with your built image tag. Please rerun the following (substituting your actual image name) to confirm the bump to comm==0.2.3 works correctly with ipykernel:

#!/bin/bash
docker run --rm -i <your-image-name> python - <<'PY'
import comm, ipykernel, sys
print("comm", comm.__version__)
print("ipykernel", ipykernel.__version__)
sys.exit(0 if comm.__version__ == "0.2.3" else 1)
PY

• If that passes, manually launch a notebook in JupyterLab to ensure kernels start and cells execute as expected.
• Otherwise, please report any failures here.

runtimes/pytorch/ubi9-python-3.12/requirements.txt (3)

381-383: comm patch-level bump looks good

Nothing else in the stack pins an earlier comm version, so upgrading to 0.2.3 should be a safe, drop-in replacement.


1191-1193: narwhals micro-bump acknowledged

1.48.1 contains only bug-fixes; no issues foreseen.


687-738: Verify s390x support for grpcio==1.74.0

We ran the following check and confirmed there are no pre-built s390x wheels for grpcio 1.74.0, so it will be compiled from source on s390x:

# Verify no s390x wheels exist
curl -s https://pypi.org/pypi/grpcio/1.74.0/json |
  jq -r '.urls[].filename' | grep -i 's390x' || echo "No s390x wheels"
# ⇒ No s390x wheels

Please confirm:

  • The s390x-builder stage in runtimes/pytorch/ubi9-python-3.12/Dockerfile installs a toolchain (clang, cmake, openssl-devel, etc.) that meets grpcio 1.74.0’s minimum CMake requirement.
  • The full s390x image build still completes within the ~2 h window.

If you encounter build failures or a significant time regression, consider pinning grpcio to the last known working 1.51.* release behind a TARGETARCH conditional.

jupyter/trustyai/ubi9-python-3.12/requirements.txt (3)

431-433: comm 0.2.3 bump looks fine
Pure-Python patch release, no ABI/binary wheels to worry about. 👍


848-899: Check grpcio==1.74.0 for non-x86 wheels before merging
grpcio still publishes wheels only for x86_64 & aarch64. Builds from source require Bazel + C++ toolchain, which will fail inside the minimal UBI9 image and especially on s390x where no wheel exists.

If the s390x build path introduced elsewhere in this PR does not exclude/install-from-source successfully, the image will break.

-grpcio==1.74.0; python_version >= '3.9'
+# Exclude on s390x until upstream ships a wheel
+grpcio==1.74.0; python_version >= '3.9' and platform_machine != "s390x"

1484-1486: narwhals patch update (1.48.0 → 1.48.1) is trivial
No breaking changes reported; safe to ship.

jupyter/pytorch/ubi9-python-3.12/requirements.txt (3)

431-433: comm bump to 0.2.3 is in-sync with IPython 9.4 / JupyterLab 4.2.
The new ipykernel stack expects ≥0.2.3, so the upgrade removes a latent mismatch.


1434-1436: narwhals patch release (1.48.1) ok.
No breaking API changes; only pandas-compat fixes.


789-791: No direct GitPython sub-command usage found
I ran a comprehensive search for GitPython API calls (.git.checkout, .git.reset, .git.update, .git.pull, .git.push, .git.branch, .git.commit as well as repo.git(...)) across all Python files and found no occurrences. The bump to 3.1.45 therefore has no direct impact on existing code.

Please manually verify before merging:

  • Any dynamic calls like repo.git(...) that may not match the above patterns.
  • Shell scripts, notebook cells or other image-level scripts invoking git via subprocess.

Once those are confirmed, upgrading to GitPython 3.1.45 is safe.
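A hedged starting point for that manual pass (these patterns are heuristics and will not catch every dynamic call):

# Dynamic GitPython dispatch such as repo.git(...) or attribute lookups on .git.
grep -rnE 'repo\.git\(|\.git\.[a-z_]+\(' --include='*.py' .
# Shell-outs to git from scripts or notebook helpers.
grep -rnE 'subprocess\.(run|call|check_output|Popen)\(.*git' --include='*.py' .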

jupyter/datascience/ubi9-python-3.12/requirements.txt (3)

427-429: Upgrade of comm is fine – ✅

Minor patch bump from 0.2.2 → 0.2.3 is fully compatible with the current ipywidgets 8.1.x / ipykernel 6.30.x stack. Nothing to do here.


785-787: gitpython bump looks good

3.1.45 only contains maintenance fixes and a CVE-free changelog.
No breaking API changes for downstream users detected.


1424-1426: narwhals micro-upgrade is safe

1.48.1 only adjusts pyarrow compatibility metadata – no behavioural changes. 👍

jupyter/tensorflow/ubi9-python-3.12/requirements.txt (3)

433-435: Version bump to comm 0.2.3 LGTM

Straightforward micro-release; no downstream incompatibilities are known.


797-799: Update to GitPython 3.1.45 looks good

The release only contains bug & security fixes; API is unchanged, so the upgrade is safe.


1509-1511: narwhals micro-upgrade to 1.48.1 is fine

No native extensions are involved; safe to merge.

runtimes/tensorflow/ubi9-python-3.12/requirements.txt (3)

383-385: Minor library bump looks safe

comm 0.2.3 is a pure-Python patch release with only small bug-fixes; no follow-up action needed.


1269-1271: Patch-level narwhals upgrade LGTM

narwhals 1.48.1 is a micro-release; changelog shows only typo & doc fixes. Safe to merge.


1418-1509: optree 0.17.0 may be a breaking change – validate downstream consumers

optree jumps from 0.16 → 0.17 and the release notes mention API surface tweaks (tree_flatten_one_level, renamed constants).
While this TensorFlow image doesn’t bundle JAX, packages such as jaxopt, equinox, optax, or flax (if installed by users) import optree. An incompatible bump will surface as runtime ImportError.

Action: run the unit-/integration tests or at least import-smoke any packages that depend on optree in this image. If breakage appears, add an upper-bound (<0.17) until the stack is rebuilt.
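An import-smoke pass could look like this (the image tag and module list are placeholders to adapt to what the image actually ships):

docker run --rm -i <runtime-image> python - <<'PY'
# Fail fast if the optree 0.17.0 bump broke any consumer baked into the image.
for mod in ("optree", "tensorflow"):
    __import__(mod)
    print(mod, "imports cleanly")
PY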

runtimes/rocm-pytorch/ubi9-python-3.12/requirements.txt (3)

381-383: Minor bump looks safe

comm 0.2.3 is a patch-level upgrade; no known breaking changes.
No further action required.


1191-1193: Patch upgrade acknowledged

narwhals 1.48.1 is a tiny bug-fix over 1.48.0. No compatibility issues expected.


687-738: ⚠️ Manual check required: grpcio==1.74.0 cp312 manylinux wheel availability

CI attempts to fetch metadata from PyPI failed (SSL/network issues), so we can’t confirm whether grpcio 1.74.0 includes pre-built wheels for Python 3.12 on manylinux.
Please manually verify on PyPI that grpcio 1.74.0 provides cp312-*manylinux* wheels.

• File runtimes/rocm-pytorch/ubi9-python-3.12/requirements.txt, around line 687: grpcio==1.74.0

If those wheels are missing, consider pinning back to 1.73.1 or adding the necessary build dependencies to avoid a source build fallback.
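When PyPI is reachable, the check reduces to one pipeline (requires curl and jq):

curl -s https://pypi.org/pypi/grpcio/1.74.0/json \
  | jq -r '.urls[].filename' \
  | grep -E 'cp312.+manylinux' || echo "no cp312 manylinux wheels"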

jupyter/rocm/pytorch/ubi9-python-3.12/requirements.txt (3)

431-433: comm bumped to v0.2.3 – looks good

The new version is still fully compatible with ipykernel 6.30.x/ipython 9.x already present in this environment.
No action required.


789-791: GitPython 3.1.45 brings CVE fixes – solid update

The upgrade is a drop-in replacement with no breaking API changes; 👍 for picking up the security patch.


1435-1437: narwhals micro-bump to 1.48.1 – no concerns

Patch release contains only bug-fixes; safe to adopt.

runtimes/datascience/ubi9-python-3.12/requirements.txt (2)

1620-1663: Confirm pyarrow presence for s390x runtime
pyarrow is excluded here (platform_machine != 's390x') but later required in notebooks.
Ensure the Dockerfile explicitly installs the pre-built wheel produced in the s390x-builder stage; otherwise users on IBM Z will hit ModuleNotFoundError: pyarrow.

If not already done, add the following to the runtime stage after the requirements install:

COPY --from=s390x-builder /tmp/pyarrow*.whl /tmp/
RUN python -m pip install --no-deps /tmp/pyarrow*.whl
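With that in place, a post-build import check (image tag assumed) confirms the wheel actually landed:

docker run --rm <s390x-runtime-image> \
  python -c 'import pyarrow; print("pyarrow", pyarrow.__version__)'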

2318-2320: six==1.17.0 is available on PyPI – no change needed

A check against the PyPI API confirms that version 1.17.0 of the six package is published and is the current latest release. The existing line in runtimes/datascience/ubi9-python-3.12/requirements.txt is therefore correct and does not need to be downgraded to 1.16.0.

Likely an incorrect or invalid review comment.
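For the record, the underlying check reduces to:

curl -s https://pypi.org/pypi/six/json | jq -r '.info.version'
# ⇒ 1.17.0 at the time of this review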

runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (1)

57-60: Use ${TARGETARCH} mapping instead of $(uname -m) when downloading oc.

$(uname -m) returns aarch64 on ARM, but the OpenShift mirror expects arm64; similar mismatches broke multi-arch images before (see issue #1363).
A simple case statement keeps downloads working on all targets, including the new s390x.

-RUN curl -L https://mirror.openshift.com/pub/openshift-v4/$(uname -m)/clients/ocp/stable/openshift-client-linux.tar.gz \
+RUN arch_dir="$TARGETARCH" && \
+    [ "$arch_dir" = "amd64" ] && arch_dir="x86_64" || true && \
+    curl -L https://mirror.openshift.com/pub/openshift-v4/${arch_dir}/clients/ocp/stable/openshift-client-linux.tar.gz \
⛔ Skipped due to learnings
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T12:29:56.162Z
Learning: jiridanek requested GitHub issue creation for OpenShift client architecture mapping problem affecting 29 Dockerfiles during PR #1320 review. Issue was created with comprehensive analysis covering all affected files using $(uname -m) returning 'aarch64' but OpenShift mirror expecting 'arm64', systematic solution using BuildKit TARGETARCH mapping with proper amd64→x86_64 and arm64→arm64 conversion, detailed acceptance criteria, and implementation guidance, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T14:02:39.880Z
Learning: grdryn corrected CodeRabbit's false assessment about OpenShift mirror server architecture support during PR #1320 review. The OpenShift mirror server (https://mirror.openshift.com/pub/openshift-v4/) actually supports both aarch64 and arm64 directory structures, making the current code using $(uname -m) work correctly across architectures without modification. This demonstrates the importance of verifying external service capabilities before flagging issues.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. Issue #1364 was created with comprehensive problem description covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. The automated issue creation failed despite appearing successful, so comprehensive issue content was provided for manual creation covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:50-50
Timestamp: 2025-07-08T19:30:01.738Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in ROCm TensorFlow image during PR #1333 review. Issue #1346 was created with comprehensive problem description covering hardcoded x86_64 architecture breaking multi-arch support, detailed impact analysis, three solution options (runtime detection, BuildKit TARGETARCH integration, hybrid approach) with pros/cons analysis, comprehensive acceptance criteria covering core requirements and testing, phased implementation guidance, related files identification, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. Issue was created with comprehensive problem description covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.

Comment on lines 14 to 24
RUN --mount=type=cache,target=/var/cache/dnf \
    echo "Building for architecture: ${TARGETARCH}" && \
    PACKAGES="mesa-libGL skopeo libxcrypt-compat" && \
    # Additional dev tools only for s390x
    if [ "$TARGETARCH" = "s390x" ]; then \
        PACKAGES="$PACKAGES gcc gcc-c++ make openssl-devel autoconf automake libtool cmake python3-devel pybind11-devel openblas-devel unixODBC-devel openssl c-ares-devel zlib-devel"; \
    fi && \
    if [ -n "$PACKAGES" ]; then \
        dnf install -y --nogpgcheck --allowerasing --nobest $PACKAGES && \
        dnf clean all && rm -rf /var/cache/yum; \
    fi

🛠️ Refactor suggestion

Avoid --nogpgcheck; enforce signature verification (or justify why it must be disabled).

Using --nogpgcheck disables RPM signature validation for all packages, which weakens supply-chain security and is exactly what CKV2_DOCKER_9 flags here.
If the Red Hat repos really contain unsigned RPMs, add --repo filters limited to the affected repos or explicitly import the required GPG keys instead of disabling verification globally.

🧰 Tools
🪛 Checkov (3.2.334)

[MEDIUM] 14-24: Ensure that packages with untrusted or missing GPG signatures are not used by dnf, tdnf, or yum via the '--nogpgcheck' option

(CKV2_DOCKER_9)

🤖 Prompt for AI Agents
In runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu around lines 14 to 24,
the use of --nogpgcheck disables RPM signature verification for all packages,
weakening security. To fix this, remove the --nogpgcheck flag and ensure the
necessary GPG keys are imported before installing packages. If some repos have
unsigned RPMs, limit the no-gpg-check option to those specific repos using
--repo filters instead of disabling verification globally.
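A sketch of the suggested direction (the key path is the standard UBI9 location; the scoped repo name is a placeholder):

# Import the Red Hat release key, then install with signature verification left on.
rpm --import /etc/pki/rpm-gpg/RPM-GPG-KEY-redhat-release
dnf install -y --allowerasing --nobest $PACKAGES
# If a single third-party repo really ships unsigned RPMs, scope the exception to it:
# dnf install -y --repo=<unsigned-repo> --nogpgcheck <package>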

Comment on lines +66 to +119
FROM base AS s390x-builder

ARG TARGETARCH
USER 0
WORKDIR /tmp/build-wheels

# Build pyarrow optimized for s390x
RUN --mount=type=cache,target=/root/.cache/pip \
    --mount=type=cache,target=/root/.cache/dnf \
    if [ "$TARGETARCH" = "s390x" ]; then \
        # Install build dependencies (shared for pyarrow and onnx)
        dnf install -y cmake make gcc-c++ pybind11-devel wget && \
        dnf clean all && \
        # Build and collect pyarrow wheel
        git clone --depth 1 https://github.com/apache/arrow.git && \
        cd arrow/cpp && \
        mkdir release && cd release && \
        cmake -DCMAKE_BUILD_TYPE=Release \
              -DCMAKE_INSTALL_PREFIX=/usr/local \
              -DARROW_PYTHON=ON \
              -DARROW_PARQUET=ON \
              -DARROW_ORC=ON \
              -DARROW_FILESYSTEM=ON \
              -DARROW_JSON=ON \
              -DARROW_CSV=ON \
              -DARROW_DATASET=ON \
              -DARROW_DEPENDENCY_SOURCE=BUNDLED \
              -DARROW_WITH_LZ4=OFF \
              -DARROW_WITH_ZSTD=OFF \
              -DARROW_WITH_SNAPPY=OFF \
              -DARROW_BUILD_TESTS=OFF \
              -DARROW_BUILD_BENCHMARKS=OFF \
              .. && \
        make -j$(nproc) VERBOSE=1 && \
        make install -j$(nproc) && \
        cd ../../python && \
        pip install --no-cache-dir -r requirements-build.txt && \
        PYARROW_WITH_PARQUET=1 \
        PYARROW_WITH_DATASET=1 \
        PYARROW_WITH_FILESYSTEM=1 \
        PYARROW_WITH_JSON=1 \
        PYARROW_WITH_CSV=1 \
        PYARROW_PARALLEL=$(nproc) \
        python setup.py build_ext --build-type=release --bundle-arrow-cpp bdist_wheel && \
        mkdir -p /tmp/wheels && \
        cp dist/pyarrow-*.whl /tmp/wheels/ && \
        # Ensure wheels directory exists and has content
        ls -la /tmp/wheels/; \
    else \
        # Create empty wheels directory for non-s390x
        mkdir -p /tmp/wheels; \
    fi


⚠️ Potential issue

git is missing – arrow clone will fail on s390x.

The builder installs cmake make gcc-c++ pybind11-devel wget but forgets git, yet immediately executes git clone (line 80).
Without git, the build will error out early and pyarrow wheels won’t be produced.

-        dnf install -y cmake make gcc-c++ pybind11-devel wget && \
+        dnf install -y cmake make gcc-c++ git pybind11-devel wget && \
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
FROM base AS s390x-builder

ARG TARGETARCH
USER 0
WORKDIR /tmp/build-wheels

# Build pyarrow optimized for s390x
RUN --mount=type=cache,target=/root/.cache/pip \
    --mount=type=cache,target=/root/.cache/dnf \
    if [ "$TARGETARCH" = "s390x" ]; then \
        # Install build dependencies (shared for pyarrow and onnx)
        dnf install -y cmake make gcc-c++ git pybind11-devel wget && \
        dnf clean all && \
        # Build and collect pyarrow wheel
        git clone --depth 1 https://github.com/apache/arrow.git && \
        cd arrow/cpp && \
        mkdir release && cd release && \
        cmake -DCMAKE_BUILD_TYPE=Release \
              -DCMAKE_INSTALL_PREFIX=/usr/local \
              -DARROW_PYTHON=ON \
              -DARROW_PARQUET=ON \
              -DARROW_ORC=ON \
              -DARROW_FILESYSTEM=ON \
              -DARROW_JSON=ON \
              -DARROW_CSV=ON \
              -DARROW_DATASET=ON \
              -DARROW_DEPENDENCY_SOURCE=BUNDLED \
              -DARROW_WITH_LZ4=OFF \
              -DARROW_WITH_ZSTD=OFF \
              -DARROW_WITH_SNAPPY=OFF \
              -DARROW_BUILD_TESTS=OFF \
              -DARROW_BUILD_BENCHMARKS=OFF \
              .. && \
        make -j$(nproc) VERBOSE=1 && \
        make install -j$(nproc) && \
        cd ../../python && \
        pip install --no-cache-dir -r requirements-build.txt && \
        PYARROW_WITH_PARQUET=1 \
        PYARROW_WITH_DATASET=1 \
        PYARROW_WITH_FILESYSTEM=1 \
        PYARROW_WITH_JSON=1 \
        PYARROW_WITH_CSV=1 \
        PYARROW_PARALLEL=$(nproc) \
        python setup.py build_ext --build-type=release --bundle-arrow-cpp bdist_wheel && \
        mkdir -p /tmp/wheels && \
        cp dist/pyarrow-*.whl /tmp/wheels/ && \
        # Ensure wheels directory exists and has content
        ls -la /tmp/wheels/; \
    else \
        # Create empty wheels directory for non-s390x
        mkdir -p /tmp/wheels; \
    fi
🧰 Tools
🪛 GitHub Actions: Code static analysis

[warning] 69-69: Hadolint DL3002 warning: Last USER should not be root.

🤖 Prompt for AI Agents
In runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu around lines 66 to 118,
the build step for pyarrow on s390x installs several dependencies but omits git,
which causes the git clone command to fail. To fix this, add git to the list of
packages installed by dnf in the RUN command before cloning the arrow
repository, ensuring the git clone operation succeeds and the pyarrow wheel is
built correctly.

Comment on lines +139 to +148
USER 0
# Copy wheels from build stage (s390x only)
COPY --from=s390x-builder /tmp/wheels /tmp/wheels
RUN if [ "$TARGETARCH" = "s390x" ]; then \
        pip install --no-cache-dir /tmp/wheels/*.whl && rm -rf /tmp/wheels; \
    else \
        echo "Skipping wheel install for $TARGETARCH"; \
    fi


⚠️ Potential issue

Final image runs as root – switch back to user 1001.

USER 0 is set for the wheel copy, but never reset, so the resulting runtime image violates best-practice (Hadolint DL3002) and OpenShift will forcibly remap the UID anyway.

 RUN if [ "$TARGETARCH" = "s390x" ]; then \
     pip install --no-cache-dir /tmp/wheels/*.whl && rm -rf /tmp/wheels; \
 else \
     echo "Skipping wheel install for $TARGETARCH"; \
 fi
+
+# Drop privileges
+USER 1001
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
USER 0
# Copy wheels from build stage (s390x only)
COPY --from=s390x-builder /tmp/wheels /tmp/wheels
RUN if [ "$TARGETARCH" = "s390x" ]; then \
        pip install --no-cache-dir /tmp/wheels/*.whl && rm -rf /tmp/wheels; \
    else \
        echo "Skipping wheel install for $TARGETARCH"; \
    fi

# Drop privileges
USER 1001
🧰 Tools
🪛 GitHub Actions: Code static analysis

[warning] 139-139: Hadolint DL3002 warning: Last USER should not be root.

🤖 Prompt for AI Agents
In runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu around lines 139 to 147,
the Dockerfile switches to root user with USER 0 to copy and install wheels but
does not switch back to user 1001 afterward. To fix this, add a line setting
USER 1001 after the wheel installation block to ensure the final image runs as a
non-root user, complying with best practices and OpenShift requirements.

Comment on lines +21 to +25
# Exclude these packages on s390x architecture due to compatibility
codeflare-sdk = {version = "~=0.29.0", markers = "platform_machine != 's390x'"}
py-spy = {version = "~=0.4.0", markers = "platform_machine != 's390x'"}
ray = {version = "~=2.46.0", markers = "platform_machine != 's390x'", extras = ["data", "default"]}
pyarrow = {version = "~=21.0.0", markers = "platform_machine != 's390x'"}

💡 Verification agent

🧩 Analysis chain

Confirm pyarrow version stays in sync between Pipfile and Dockerfile install path

The pyarrow dependency is omitted on s390x via a marker, while the Dockerfile compiles & installs the wheel manually for that architecture.
Please double-check that the version baked in the Dockerfile (pyarrow-X.Y.Z-*.whl) is kept identical to the ~=21.0.0 pin here; otherwise users will see diverging behaviour across architectures.

If the Dockerfile currently hard-codes another version, either:

  1. Align the pin below and regenerate the lock, or
  2. Pass ${PYARROW_VERSION} through build-args so the two sources can never drift.

🏁 Script executed:

#!/bin/bash
set -e

# Locate Dockerfile(s) for the datascience UBI9 Python 3.12 runtime
dockerfiles=$(find runtimes/datascience/ubi9-python-3.12 -type f -name 'Dockerfile*')

echo "Checking for pyarrow version pins in the following Dockerfile(s):"
echo "$dockerfiles"
echo

# Search each Dockerfile for hard-coded pyarrow wheel installs or build args
for df in $dockerfiles; do
  echo "=== $df ==="
  grep -En "pyarrow-.*\\.whl|PYARROW_VERSION" "$df" \
    || echo "No pyarrow reference found"
  echo
done

Length of output: 725


🏁 Script executed:

#!/bin/bash
# Display the first 200 lines of the CPU Dockerfile to inspect how the pyarrow wheel is built/pinned
sed -n '1,200p' runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu

Length of output: 6576


Pin PyArrow Version in s390x Builder to Match Pipfile

The s390x-builder stage currently clones Apache Arrow’s main branch, resulting in a wheel version that will drift from the ~=21.0.0 pin in your Pipfile. To ensure consistency across architectures, update the Dockerfile to accept or define a PYARROW_VERSION build-arg and clone the corresponding tag.

Files to update:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (wheel-builder stage)

Suggested diff:

--- a/runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
+++ b/runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
@@
 FROM base AS s390x-builder
 
+ARG PYARROW_VERSION="21.0.0"
 ARG TARGETARCH 
 USER 0
 WORKDIR /tmp/build-wheels
 
 # Build pyarrow optimized for s390x
 RUN --mount=type=cache,target=/root/.cache/pip \
     --mount=type=cache,target=/root/.cache/dnf \
     if [ "$TARGETARCH" = "s390x" ]; then \
         # Install build dependencies...
-        git clone --depth 1 https://github.com/apache/arrow.git && \
+        git clone --branch "apache-arrow-${PYARROW_VERSION}" --depth 1 https://github.com/apache/arrow.git && \
         cd arrow/cpp && \
         mkdir release && cd release && \
         cmake ... && \

With this change you can:

  1. Build with a matching version:
    docker build . --build-arg PYARROW_VERSION=21.0.0 …
  2. Avoid any drift between the marker-pinned Pipfile dependency and the manually built wheel.
🤖 Prompt for AI Agents
In runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu within the wheel-builder
stage, the Apache Arrow source is cloned from the main branch causing version
drift from the PyArrow version pinned in the Pipfile (~=21.0.0). Modify the
Dockerfile to accept a build argument PYARROW_VERSION and use it to clone the
corresponding Apache Arrow tag instead of the main branch. This ensures the
built wheel version matches the Pipfile pin and maintains consistency across
architectures.

Comment on lines 699 to 770
grpcio==1.74.0; python_version >= '3.9' \
--hash=sha256:0f87bddd6e27fc776aacf7ebfec367b6d49cad0455123951e4488ea99d9b9b8f \
--hash=sha256:136b53c91ac1d02c8c24201bfdeb56f8b3ac3278668cbb8e0ba49c88069e1bdc \
--hash=sha256:1733969040989f7acc3d94c22f55b4a9501a30f6aaacdbccfaba0a3ffb255ab7 \
--hash=sha256:176d60a5168d7948539def20b2a3adcce67d72454d9ae05969a2e73f3a0feee7 \
--hash=sha256:1a2b06afe2e50ebfd46247ac3ba60cac523f54ec7792ae9ba6073c12daf26f0a \
--hash=sha256:1bf949792cee20d2078323a9b02bacbbae002b9e3b9e2433f2741c15bdeba1c4 \
--hash=sha256:22b834cef33429ca6cc28303c9c327ba9a3fafecbf62fae17e9a7b7163cc43ac \
--hash=sha256:2918948864fec2a11721d91568effffbe0a02b23ecd57f281391d986847982f6 \
--hash=sha256:2bc2d7d8d184e2362b53905cb1708c84cb16354771c04b490485fa07ce3a1d89 \
--hash=sha256:2f609a39f62a6f6f05c7512746798282546358a37ea93c1fcbadf8b2fed162e3 \
--hash=sha256:3601274bc0523f6dc07666c0e01682c94472402ac2fd1226fd96e079863bfa49 \
--hash=sha256:3b03d8f2a07f0fea8c8f74deb59f8352b770e3900d143b3d1475effcb08eec20 \
--hash=sha256:3d14e3c4d65e19d8430a4e28ceb71ace4728776fd6c3ce34016947474479683f \
--hash=sha256:42f8fee287427b94be63d916c90399ed310ed10aadbf9e2e5538b3e497d269bc \
--hash=sha256:4bc5fca10aaf74779081e16c2bcc3d5ec643ffd528d9e7b1c9039000ead73bae \
--hash=sha256:4e4181bfc24413d1e3a37a0b7889bea68d973d4b45dd2bc68bb766c140718f82 \
--hash=sha256:55b453812fa7c7ce2f5c88be3018fb4a490519b6ce80788d5913f3f9d7da8c7b \
--hash=sha256:566b9395b90cc3d0d0c6404bc8572c7c18786ede549cdb540ae27b58afe0fb91 \
--hash=sha256:5f251c355167b2360537cf17bea2cf0197995e551ab9da6a0a59b3da5e8704f9 \
--hash=sha256:60d2d48b0580e70d2e1954d0d19fa3c2e60dd7cbed826aca104fff518310d1c5 \
--hash=sha256:64229c1e9cea079420527fa8ac45d80fc1e8d3f94deaa35643c381fa8d98f362 \
--hash=sha256:655726919b75ab3c34cdad39da5c530ac6fa32696fb23119e36b64adcfca174a \
--hash=sha256:662456c4513e298db6d7bd9c3b8df6f75f8752f0ba01fb653e252ed4a59b5a5d \
--hash=sha256:68c8ebcca945efff9d86d8d6d7bfb0841cf0071024417e2d7f45c5e46b5b08eb \
--hash=sha256:69e1a8180868a2576f02356565f16635b99088da7df3d45aaa7e24e73a054e31 \
--hash=sha256:6bab67d15ad617aff094c382c882e0177637da73cbc5532d52c07b4ee887a87b \
--hash=sha256:7d95d71ff35291bab3f1c52f52f474c632db26ea12700c2ff0ea0532cb0b5854 \
--hash=sha256:80d1f4fbb35b0742d3e3d3bb654b7381cd5f015f8497279a1e9c21ba623e01b1 \
--hash=sha256:834988b6c34515545b3edd13e902c1acdd9f2465d386ea5143fb558f153a7176 \
--hash=sha256:8533e6e9c5bd630ca98062e3a1326249e6ada07d05acf191a77bc33f8948f3d8 \
--hash=sha256:85bd5cdf4ed7b2d6438871adf6afff9af7096486fcf51818a81b77ef4dd30907 \
--hash=sha256:86ad489db097141a907c559988c29718719aa3e13370d40e20506f11b4de0d11 \
--hash=sha256:885912559974df35d92219e2dc98f51a16a48395f37b92865ad45186f294096c \
--hash=sha256:8efe72fde5500f47aca1ef59495cb59c885afe04ac89dd11d810f2de87d935d4 \
--hash=sha256:8f7b5882fb50632ab1e48cb3122d6df55b9afabc265582808036b6e51b9fd6b7 \
--hash=sha256:9e7c4389771855a92934b2846bd807fc25a3dfa820fd912fe6bd8136026b2707 \
--hash=sha256:9e912d3c993a29df6c627459af58975b2e5c897d93287939b9d5065f000249b5 \
--hash=sha256:a8f0302f9ac4e9923f98d8e243939a6fb627cd048f5cd38595c97e38020dffce \
--hash=sha256:b6a73b2ba83e663b2480a90b82fdae6a7aa6427f62bf43b29912c0cfd1aa2bfa \
--hash=sha256:c14e803037e572c177ba54a3e090d6eb12efd795d49327c5ee2b3bddb836bf01 \
--hash=sha256:c3d7bd6e3929fd2ea7fbc3f562e4987229ead70c9ae5f01501a46701e08f1ad9 \
--hash=sha256:c98e0b7434a7fa4e3e63f250456eaef52499fba5ae661c58cc5b5477d11e7182 \
--hash=sha256:cce634b10aeab37010449124814b05a62fb5f18928ca878f1bf4750d1f0c815b \
--hash=sha256:e154d230dc1bbbd78ad2fdc3039fa50ad7ffcf438e4eb2fa30bce223a70c7486 \
--hash=sha256:e1ea6176d7dfd5b941ea01c2ec34de9531ba494d541fe2057c904e601879f249 \
--hash=sha256:e759f9e8bc908aaae0412642afe5416c9f983a80499448fcc7fab8692ae044c3 \
--hash=sha256:e8978003816c7b9eabe217f88c78bc26adc8f9304bf6a594b02e5a49b2ef9c11 \
--hash=sha256:ecde9ab49f58433abe02f9ed076c7b5be839cf0153883a6d23995937a82392fa \
--hash=sha256:f6ec94f0e50eb8fa1744a731088b966427575e40c2944a980049798b127a687e \
--hash=sha256:fd3c71aeee838299c5887230b8a1822795325ddfea635edd82954c1eaa831e24 \
--hash=sha256:fe0f540750a13fd8e5da4b3eaba91a785eea8dca5ccd2bc2ffe978caa403090e

💡 Verification agent

🧩 Analysis chain

Confirm grpcio 1.74.0 wheels exist for every target architecture before the build matrix breaks
The image set now builds for x86_64, arm64 and (after this PR) s390x. grpcio historically lags aarch64 & s390x wheel uploads, and 1.74.0 is the first release after the 1.73.x line. A missing wheel forces a slow source build inside the container and explodes build time (especially on s390x).

Run the quick check below; expect to see filenames for all needed arches (x86_64, aarch64, s390x) and cp312:

If any architecture is missing, pin back to 1.73.1 for that arch or add an architecture marker to avoid a source compile.


🏁 Script executed:

#!/bin/bash
# Inspect available wheels for grpcio 1.74.0
curl -s https://pypi.org/pypi/grpcio/1.74.0/json |
  jq -r '.urls[].filename' |
  grep -E 'cp312|cp311' |
  grep -E 'manylinux_2_17_(x86_64|aarch64|s390x)'

Length of output: 403


Missing s390x wheel for grpcio 1.74.0
The check shows only x86_64 and aarch64 cp311/cp312 wheels; there is no manylinux_2_17_s390x build. Without an s390x wheel, the image will fall back to a slow source compile on that architecture.

Action required:

  • Pin grpcio to 1.73.1 for s390x builds, or
  • Add an environment marker (e.g. ; platform_machine != "s390x") to prevent installing 1.74.0 on s390x.
🤖 Prompt for AI Agents
In runtimes/tensorflow/ubi9-python-3.12/requirements.txt around lines 699 to
750, grpcio version 1.74.0 lacks a prebuilt wheel for the s390x architecture,
causing slow source compilation on s390x. To fix this, modify the grpcio
dependency to pin version 1.73.1 specifically for s390x builds or add an
environment marker to exclude version 1.74.0 from installing on s390x platforms,
ensuring faster and compatible installations.

@Nash-123 Nash-123 force-pushed the runtime-datascience-s390x branch from 8053c7e to aab32bb on July 28, 2025 20:12
@Nash-123 Nash-123 force-pushed the runtime-datascience-s390x branch from aab32bb to a881362 on July 28, 2025 20:16
@openshift-ci openshift-ci bot added size/l and removed size/l labels Jul 28, 2025
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

♻️ Duplicate comments (5)
runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (3)

22-24: Remove blanket --nogpgcheck; keep RPM signature verification intact
Disabling GPG checks for every package weakens the supply-chain guarantees we get from UBI. If you really need to bypass GPG for a specific third-party repo, scope it with --repo=<name> instead of disabling verification globally.

-        dnf install -y --nogpgcheck --allowerasing --nobest $PACKAGES && \
+        dnf install -y --allowerasing --nobest $PACKAGES && \

75-80: git is still missing – arrow clone will fail
The build step immediately calls git clone, but git is not in the dependency list. The wheel build will error on s390x.

-        dnf install -y cmake make gcc-c++ pybind11-devel wget && \
+        dnf install -y cmake make gcc-c++ git pybind11-devel wget && \

138-145: Final image still runs as root – switch back to the non-privileged user
USER 0 is set to install wheels but never reset, triggering Hadolint DL3002 and violating OpenShift best-practice.

 RUN if [ "$TARGETARCH" = "s390x" ]; then \
     pip install --no-cache-dir /tmp/wheels/*.whl && rm -rf /tmp/wheels; \
 else \
     echo "Skipping wheel install for $TARGETARCH"; \
 fi
+
+# Drop privileges for the final image
+USER 1001
runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu (2)

14-23: Drop --nogpgcheck: supply-chain risk persists
The flag is still present even after the earlier review. Red Hat RPMs are signed; disabling the check weakens integrity guarantees and is flagged by Checkov CKV2_DOCKER_9.

-    dnf install -y --nogpgcheck --allowerasing --nobest $PACKAGES && \
+    dnf install -y --allowerasing --nobest $PACKAGES && \

127-142: Image still ends with USER root
The last USER instruction is at line 127; Hadolint DL3002 remains. Switch back to 1001 after package installs & permission fixes.

@@
     chmod -R g+w /opt/app-root/lib/python3.11/site-packages && \
     fix-permissions /opt/app-root -P
+
+# Security best practice: drop root
+USER 1001
🧹 Nitpick comments (2)
runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (2)

92-95: Re-enable compression libraries for Parquet/Arrow usability (optional)
Disabling LZ4, ZSTD and Snappy drastically limits real-world Parquet usage. These libs build cleanly on s390x and only add a few MB. Consider leaving them ON (they are bundled anyway).

-              -DARROW_WITH_LZ4=OFF \
-              -DARROW_WITH_ZSTD=OFF \
-              -DARROW_WITH_SNAPPY=OFF \
+              -DARROW_WITH_LZ4=ON \
+              -DARROW_WITH_ZSTD=ON \
+              -DARROW_WITH_SNAPPY=ON \
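If the codecs are re-enabled, a small round-trip run inside the built image would confirm they work end to end (a sketch, assuming pyarrow is installed):

python - <<'PY'
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table({"x": [1, 2, 3]})
for codec in ("lz4", "zstd", "snappy"):
    path = f"/tmp/t_{codec}.parquet"
    pq.write_table(table, path, compression=codec)
    assert pq.read_table(path).equals(table)
    print(codec, "round-trips OK")
PY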

56-59: Prefer ${TARGETARCH} over $(uname -m) when downloading the oc client
Inside BuildKit cross-platform builds uname -m returns the emulated architecture, but using ${TARGETARCH} is clearer and avoids surprises if this Dockerfile is ever built without QEMU.

-RUN curl -L https://mirror.openshift.com/pub/openshift-v4/$(uname -m)/clients/ocp/stable/openshift-client-linux.tar.gz \
+RUN curl -L https://mirror.openshift.com/pub/openshift-v4/${TARGETARCH}/clients/ocp/stable/openshift-client-linux.tar.gz \

Note: for amd64 → x86_64 and arm64 → aarch64 you may need a small mapping case-statement.
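The mapping case-statement could look like this (directory names assumed to follow the mirror's layout):

case "$TARGETARCH" in
    amd64) arch_dir="x86_64" ;;
    arm64) arch_dir="aarch64" ;;
    *)     arch_dir="$TARGETARCH" ;;  # s390x and ppc64le match the mirror as-is
esac
curl -L "https://mirror.openshift.com/pub/openshift-v4/${arch_dir}/clients/ocp/stable/openshift-client-linux.tar.gz" \
    -o /tmp/openshift-client-linux.tar.gz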

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between aab32bb and a881362.

📒 Files selected for processing (4)
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu (3 hunks)
  • runtimes/datascience/ubi9-python-3.11/Pipfile (1 hunks)
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (3 hunks)
  • runtimes/datascience/ubi9-python-3.12/Pipfile (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
  • runtimes/datascience/ubi9-python-3.12/Pipfile
  • runtimes/datascience/ubi9-python-3.11/Pipfile
🧰 Additional context used
🧠 Learnings (3)
📓 Common learnings
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:50-50
Timestamp: 2025-07-08T19:30:01.738Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in ROCm TensorFlow image during PR #1333 review. Issue #1346 was created with comprehensive problem description covering hardcoded x86_64 architecture breaking multi-arch support, detailed impact analysis, three solution options (runtime detection, BuildKit TARGETARCH integration, hybrid approach) with pros/cons analysis, comprehensive acceptance criteria covering core requirements and testing, phased implementation guidance, related files identification, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T12:31:02.033Z
Learning: jiridanek requested GitHub issue creation for MSSQL repo file hardcoding problem during PR #1320 review. Issue #1363 was created and updated with comprehensive problem description covering hardcoded x86_64 MSSQL repo files breaking multi-architecture builds across 10 affected Dockerfiles (including datascience, CUDA, ROCm, and TrustyAI variants), detailed root cause analysis, three solution options with code examples, clear acceptance criteria for all image types, implementation guidance following established multi-architecture patterns, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:38-38
Timestamp: 2025-07-08T19:30:20.513Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in TensorFlow CUDA runtime image during PR #1333 review. Issue was created with comprehensive problem description covering hardcoded NVARCH limitation, multiple solution options using TARGETARCH build argument with architecture mapping, acceptance criteria for multi-architecture builds, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T12:29:56.162Z
Learning: jiridanek requested GitHub issue creation for OpenShift client architecture mapping problem affecting 29 Dockerfiles during PR #1320 review. Issue was created with comprehensive analysis covering all affected files using $(uname -m) returning 'aarch64' but OpenShift mirror expecting 'arm64', systematic solution using BuildKit TARGETARCH mapping with proper amd64→x86_64 and arm64→arm64 conversion, detailed acceptance criteria, and implementation guidance, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py:619-626
Timestamp: 2025-07-08T19:33:14.340Z
Learning: jiridanek requested GitHub issue creation for Python 3.12 version check bug in bootstrapper.py during PR #1333 review. Issue #1348 was created with comprehensive problem description covering version check exclusion affecting all Python 3.12 runtime images, detailed impact analysis of bootstrapper execution failures, clear solution with code examples, affected files list including all 6 runtime bootstrapper copies, acceptance criteria for testing and verification, implementation notes about code duplication and upstream reporting, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. The automated issue creation failed despite appearing successful, so comprehensive issue content was provided for manual creation covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review. Issue was created with comprehensive problem description covering both Python 3.11 and 3.12 versions, repository pattern analysis showing correct vs incorrect naming, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: When checking PyPI wheel availability for aarch64 architecture, the actual wheel filenames use "manylinux2014_aarch64" pattern, not "linux_aarch64". This caused incorrect analysis in PR #1396 where CodeRabbit falsely claimed CUDA companion packages lacked aarch64 wheels when searching with the wrong pattern. The correct approach is to search for "manylinux2014_aarch64" or use a more generic "aarch64" pattern when verifying wheel availability for ARM64 compatibility.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:54:28.202Z
Learning: In opendatahub-io/notebooks, Python 3.12-based images (e.g., runtime-cuda-pytorch-ubi9-python-3.12) may fail container runtime tests with "libcrypt.so.1 => not found" for MySQL SASL2 plugin libraries if `libxcrypt-compat` is missing. The solution is to install `libxcrypt-compat` in the Dockerfile.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (28)

Learnt from: grdryn
PR: #1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.
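
A minimal Dockerfile sketch of the scoping rule described above (the base image is an arbitrary placeholder):

FROM registry.access.redhat.com/ubi9/ubi AS builder
# BuildKit provides TARGETARCH implicitly for FROM lines, but it must be
# re-declared with ARG before it can be used inside this stage:
ARG TARGETARCH
RUN echo "Building for ${TARGETARCH}"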

Learnt from: atheo89
PR: #1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.

Learnt from: grdryn
PR: #1320
File: jupyter/minimal/ubi9-python-3.11/Dockerfile.cuda:29-38
Timestamp: 2025-07-04T17:07:52.656Z
Learning: In the opendatahub-io/notebooks repository, modern Docker with BuildKit automatically provides build arguments like TARGETARCH in the global scope for FROM instructions, but these arguments must be explicitly declared with ARG statements inside build stages where they will be used. The ARG declaration should be placed within the stage that uses it, not moved to the global scope, as this is the correct pattern for modern Docker/Podman/Buildah environments.

Learnt from: jiridanek
PR: #1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T12:29:56.162Z
Learning: jiridanek requested GitHub issue creation for OpenShift client architecture mapping problem affecting 29 Dockerfiles during PR #1320 review. Issue was created with comprehensive analysis covering all affected files using $(uname -m) returning 'aarch64' but OpenShift mirror expecting 'arm64', systematic solution using BuildKit TARGETARCH mapping with proper amd64→x86_64 and arm64→arm64 conversion, detailed acceptance criteria, and implementation guidance, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
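
A hedged sketch of the TARGETARCH mapping described above; the mirror URL layout is taken from the learning, and the download path is illustrative:

ARG TARGETARCH
RUN case "${TARGETARCH}" in \
      amd64) OC_ARCH=x86_64 ;; \
      arm64) OC_ARCH=arm64 ;; \
      *)     OC_ARCH="${TARGETARCH}" ;; \
    esac && \
    curl -L "https://mirror.openshift.com/pub/openshift-v4/${OC_ARCH}/clients/ocp/stable/openshift-client-linux.tar.gz" \
      -o /tmp/openshift-client-linux.tar.gz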

Learnt from: jiridanek
PR: #1333
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:38-38
Timestamp: 2025-07-08T19:30:20.513Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in TensorFlow CUDA runtime image during PR #1333 review. Issue was created with comprehensive problem description covering hardcoded NVARCH limitation, multiple solution options using TARGETARCH build argument with architecture mapping, acceptance criteria for multi-architecture builds, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:50-50
Timestamp: 2025-07-08T19:30:01.738Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in ROCm TensorFlow image during PR #1333 review. Issue #1346 was created with comprehensive problem description covering hardcoded x86_64 architecture breaking multi-arch support, detailed impact analysis, three solution options (runtime detection, BuildKit TARGETARCH integration, hybrid approach) with pros/cons analysis, comprehensive acceptance criteria covering core requirements and testing, phased implementation guidance, related files identification, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. The automated issue creation failed despite appearing successful, so comprehensive issue content was provided for manual creation covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. Issue #1364 was created with comprehensive problem description covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T12:31:02.033Z
Learning: jiridanek requested GitHub issue creation for MSSQL repo file hardcoding problem during PR #1320 review. Issue #1363 was created and updated with comprehensive problem description covering hardcoded x86_64 MSSQL repo files breaking multi-architecture builds across 10 affected Dockerfiles (including datascience, CUDA, ROCm, and TrustyAI variants), detailed root cause analysis, three solution options with code examples, clear acceptance criteria for all image types, implementation guidance following established multi-architecture patterns, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. Issue was created with comprehensive problem description covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: grdryn
PR: #1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Learnt from: jiridanek
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda:192-195
Timestamp: 2025-07-18T19:01:39.811Z
Learning: In the opendatahub-io/notebooks repository, mixing CentOS packages with UBI base images is bad practice that removes supportability and creates "Frankenstein" images according to Red Hat guidance. However, using EPEL packages is acceptable, though it may require extra work with AIPCC for internal Red Hat builds. The official reference is at developers.redhat.com/articles/ubi-faq.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/utils/pip.conf:3-4
Timestamp: 2025-07-08T19:21:11.512Z
Learning: jiridanek requested GitHub issue creation for PYTHONPATH configuration investigation in runtime images during PR #1333 review. Issue #1339 was created with comprehensive problem description covering pip.conf target directory configuration, missing PYTHONPATH exports, potential runtime import failures, multiple investigation areas (PYTHONPATH auditing, pip configuration consistency, runtime testing), solution options with implementation guidance, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.

Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/run-nginx.sh:18-23
Timestamp: 2025-07-03T16:17:23.065Z
Learning: jiridanek requested GitHub issue creation for shell script variable quoting security concern in codeserver/ubi9-python-3.12/run-nginx.sh during PR #1269 review. The issue covers unquoted variables NB_PREFIX, NOTEBOOK_ARGS, and BASE_URL that pose security risks including command injection, word-splitting vulnerabilities, and globbing issues. A comprehensive issue was created with detailed problem description, security concerns, solution with code examples, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.

Learnt from: jiridanek
PR: #1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:4-4
Timestamp: 2025-07-04T05:49:10.314Z
Learning: jiridanek directs base image pinning security concerns to existing comprehensive issue #1242 "Improve Docker FROM image versioning by avoiding :latest tags" rather than addressing them in individual PRs, continuing the established pattern of systematic security and quality tracking in opendatahub-io/notebooks.

Learnt from: jiridanek
PR: #1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:132-134
Timestamp: 2025-07-08T19:07:58.135Z
Learning: jiridanek requested GitHub issue creation for Docker build robustness problem with rm glob patterns during PR #1306 review. Issue was created covering 11 affected Dockerfiles with fragile rm commands that fail on empty globs, comprehensive problem description, multiple solution options, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T14:02:39.880Z
Learning: grdryn corrected CodeRabbit's false assessment about OpenShift mirror server architecture support during PR #1320 review. The OpenShift mirror server (https://mirror.openshift.com/pub/openshift-v4/) actually supports both aarch64 and arm64 directory structures, making the current code using $(uname -m) work correctly across architectures without modification. This demonstrates the importance of verifying external service capabilities before flagging issues.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).

Learnt from: jiridanek
PR: #1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:39:23.433Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. Issue #1373 was successfully created on the second attempt with comprehensive problem description covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1333
File: runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda:17-25
Timestamp: 2025-07-09T08:07:30.628Z
Learning: jiridanek requested GitHub issue creation for oc client installation permission problem in PyTorch CUDA runtime Dockerfile during PR #1333 review. Issue #1356 was created with comprehensive problem description covering USER 1001 permission conflicts with root-owned /opt/app-root/bin directory, detailed impact analysis of build failures and non-executable binaries, current problematic code snippet, complete solution with user switching approach, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/passwd.template:14-14
Timestamp: 2025-07-03T16:28:55.757Z
Learning: jiridanek requested GitHub issue creation for passwd template validation in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/passwd.template during PR #1269 review. Issue #1318 was created with comprehensive analysis of nss_wrapper approach advantages over OpenShift's native user management, including consistent user identity, application compatibility, and debugging benefits. The current approach works with OpenShift's random UID assignment by dynamically setting USER_ID and GROUP_ID to actual runtime values while providing enhanced user attributes through nss_wrapper.

Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user:4-9
Timestamp: 2025-07-03T16:05:35.448Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user during PR #1269 review. A comprehensive issue was created covering silent failures, unquoted variable expansions, missing template validation, and strict mode implementation with detailed problem descriptions, phased acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.

Learnt from: jiridanek
PR: #1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:54:28.202Z
Learning: In opendatahub-io/notebooks, Python 3.12-based images (e.g., runtime-cuda-pytorch-ubi9-python-3.12) may fail container runtime tests with "libcrypt.so.1 => not found" for MySQL SASL2 plugin libraries if libxcrypt-compat is missing. The solution is to install libxcrypt-compat in the Dockerfile.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-09T11:22:11.653Z
Learning: When analyzing OpenShift CI/Prow job failures in opendatahub-io/notebooks, jobs that appear to "complete successfully" may actually have failed at the image build stage and never executed any e2e tests. Proper analysis requires checking if image builds succeeded before concluding that tests passed or the job was successful.

runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu (28)

Learnt from: atheo89
PR: #1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.

Learnt from: grdryn
PR: #1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.

Learnt from: jiridanek
PR: #1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T12:29:56.162Z
Learning: jiridanek requested GitHub issue creation for OpenShift client architecture mapping problem affecting 29 Dockerfiles during PR #1320 review. Issue was created with comprehensive analysis covering all affected files using $(uname -m) returning 'aarch64' but OpenShift mirror expecting 'arm64', systematic solution using BuildKit TARGETARCH mapping with proper amd64→x86_64 and arm64→arm64 conversion, detailed acceptance criteria, and implementation guidance, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: grdryn
PR: #1320
File: jupyter/minimal/ubi9-python-3.11/Dockerfile.cuda:29-38
Timestamp: 2025-07-04T17:07:52.656Z
Learning: In the opendatahub-io/notebooks repository, modern Docker with BuildKit automatically provides build arguments like TARGETARCH in the global scope for FROM instructions, but these arguments must be explicitly declared with ARG statements inside build stages where they will be used. The ARG declaration should be placed within the stage that uses it, not moved to the global scope, as this is the correct pattern for modern Docker/Podman/Buildah environments.

Learnt from: jiridanek
PR: #1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. The automated issue creation failed despite appearing successful, so comprehensive issue content was provided for manual creation covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T12:31:02.033Z
Learning: jiridanek requested GitHub issue creation for MSSQL repo file hardcoding problem during PR #1320 review. Issue #1363 was created and updated with comprehensive problem description covering hardcoded x86_64 MSSQL repo files breaking multi-architecture builds across 10 affected Dockerfiles (including datascience, CUDA, ROCm, and TrustyAI variants), detailed root cause analysis, three solution options with code examples, clear acceptance criteria for all image types, implementation guidance following established multi-architecture patterns, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1333
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:38-38
Timestamp: 2025-07-08T19:30:20.513Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in TensorFlow CUDA runtime image during PR #1333 review. Issue was created with comprehensive problem description covering hardcoded NVARCH limitation, multiple solution options using TARGETARCH build argument with architecture mapping, acceptance criteria for multi-architecture builds, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:50-50
Timestamp: 2025-07-08T19:30:01.738Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in ROCm TensorFlow image during PR #1333 review. Issue #1346 was created with comprehensive problem description covering hardcoded x86_64 architecture breaking multi-arch support, detailed impact analysis, three solution options (runtime detection, BuildKit TARGETARCH integration, hybrid approach) with pros/cons analysis, comprehensive acceptance criteria covering core requirements and testing, phased implementation guidance, related files identification, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: grdryn
PR: #1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.

Learnt from: jiridanek
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda:192-195
Timestamp: 2025-07-18T19:01:39.811Z
Learning: In the opendatahub-io/notebooks repository, mixing CentOS packages with UBI base images is bad practice that removes supportability and creates "Frankenstein" images according to Red Hat guidance. However, using EPEL packages is acceptable, though it may require extra work with AIPCC for internal Red Hat builds. The official reference is at developers.redhat.com/articles/ubi-faq.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).

Learnt from: jiridanek
PR: #1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T14:02:39.880Z
Learning: grdryn corrected CodeRabbit's false assessment about OpenShift mirror server architecture support during PR #1320 review. The OpenShift mirror server (https://mirror.openshift.com/pub/openshift-v4/) actually supports both aarch64 and arm64 directory structures, making the current code using $(uname -m) work correctly across architectures without modification. This demonstrates the importance of verifying external service capabilities before flagging issues.

Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/run-nginx.sh:18-23
Timestamp: 2025-07-03T16:17:23.065Z
Learning: jiridanek requested GitHub issue creation for shell script variable quoting security concern in codeserver/ubi9-python-3.12/run-nginx.sh during PR #1269 review. The issue covers unquoted variables NB_PREFIX, NOTEBOOK_ARGS, and BASE_URL that pose security risks including command injection, word-splitting vulnerabilities, and globbing issues. A comprehensive issue was created with detailed problem description, security concerns, solution with code examples, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.

Learnt from: jiridanek
PR: #1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:55-56
Timestamp: 2025-07-03T08:22:25.348Z
Learning: jiridanek directs security concerns raised during PR reviews to existing comprehensive security issues rather than addressing them in individual PRs. Issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" serves as the central tracking issue for all binary download security concerns across the opendatahub-io/notebooks repository.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:54:28.202Z
Learning: In opendatahub-io/notebooks, Python 3.12-based images (e.g., runtime-cuda-pytorch-ubi9-python-3.12) may fail container runtime tests with "libcrypt.so.1 => not found" for MySQL SASL2 plugin libraries if libxcrypt-compat is missing. The solution is to install libxcrypt-compat in the Dockerfile.

Learnt from: jiridanek
PR: #1333
File: runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda:17-25
Timestamp: 2025-07-09T08:07:30.628Z
Learning: jiridanek requested GitHub issue creation for oc client installation permission problem in PyTorch CUDA runtime Dockerfile during PR #1333 review. Issue #1356 was created with comprehensive problem description covering USER 1001 permission conflicts with root-owned /opt/app-root/bin directory, detailed impact analysis of build failures and non-executable binaries, current problematic code snippet, complete solution with user switching approach, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user:4-9
Timestamp: 2025-07-03T16:05:35.448Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user during PR #1269 review. A comprehensive issue was created covering silent failures, unquoted variable expansions, missing template validation, and strict mode implementation with detailed problem descriptions, phased acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.

Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/passwd.template:14-14
Timestamp: 2025-07-03T16:28:55.757Z
Learning: jiridanek requested GitHub issue creation for passwd template validation in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/passwd.template during PR #1269 review. Issue #1318 was created with comprehensive analysis of nss_wrapper approach advantages over OpenShift's native user management, including consistent user identity, application compatibility, and debugging benefits. The current approach works with OpenShift's random UID assignment by dynamically setting USER_ID and GROUP_ID to actual runtime values while providing enhanced user attributes through nss_wrapper.

Learnt from: jiridanek
PR: #1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:135-136
Timestamp: 2025-07-04T05:52:49.464Z
Learning: jiridanek requested GitHub issue creation for improving fragile sed-based Jupyter kernel display_name modification in jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu during PR #1306 review. Issue #1321 was created with comprehensive problem description covering JSON corruption risks, greedy regex patterns, maintenance burden, and proposed Python-based JSON parsing solution with detailed acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.

Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T16:04:22.695Z
Learning: jiridanek requested GitHub issue creation for shell script quality improvements in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh during PR #1269 review. Issue #1307 was created with comprehensive problem description covering variable scoping issues, POSIX compliance concerns, multiple solution options, acceptance criteria, implementation guidance with code examples, testing approaches, and proper context linking, continuing the established pattern of systematic code quality improvements.

Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.

Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T12:08:47.691Z
Learning: jiridanek requests GitHub issue creation for shell script quality improvements identified during PR #1269 review, specifically for POSIX compliance and security issues in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh. Issue #1275 was created with comprehensive problem descriptions, acceptance criteria, implementation guidance, and proper context linking.

Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. Issue #1298 was successfully created with comprehensive problem description covering race conditions, failure detection, orphaned processes, and signal handling, along with multiple solution options, phased acceptance criteria, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-09T11:22:11.653Z
Learning: When analyzing OpenShift CI/Prow job failures in opendatahub-io/notebooks, jobs that appear to "complete successfully" may actually have failed at the image build stage and never executed any e2e tests. Proper analysis requires checking if image builds succeeded before concluding that tests passed or the job was successful.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.

🪛 Checkov (3.2.334)
runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu

[MEDIUM] 14-24: Ensure that packages with untrusted or missing GPG signatures are not used by dnf, tdnf, or yum via the '--nogpgcheck' option

(CKV2_DOCKER_9)

runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu

[MEDIUM] 14-23: Ensure that packages with untrusted or missing GPG signatures are not used by dnf, tdnf, or yum via the '--nogpgcheck' option

(CKV2_DOCKER_9)
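
For reference, CKV2_DOCKER_9 is satisfied by leaving GPG verification on; a minimal sketch (the package name is an example only):

# flagged:   RUN dnf install -y --nogpgcheck libxcrypt-compat
# compliant (signatures are verified by default):
RUN dnf install -y libxcrypt-compat && \
    dnf clean all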

🪛 GitHub Actions: Code static analysis
runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu

[warning] 27-27: Hadolint SC2016 info: Expressions don't expand in single quotes, use double quotes for that.


[warning] 68-68: Hadolint DL3002 warning: Last USER should not be root.


[warning] 138-138: Hadolint DL3002 warning: Last USER should not be root.

runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu

[warning] 26-26: Hadolint SC2016 info: Expressions don't expand in single quotes, use double quotes for that.


[warning] 63-63: Hadolint DL3002 warning: Last USER should not be root.


[warning] 127-127: Hadolint DL3002 warning: Last USER should not be root.
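
DL3002 is normally resolved by dropping privileges after the root-only steps; a minimal sketch, assuming the UID 1001 convention these images already use:

USER root
RUN dnf install -y libxcrypt-compat && dnf clean all
# switch back so the image does not end on root
USER 1001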

⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: build (runtime-datascience-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
🔇 Additional comments (1)
runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu (1)

52-56: s390x client available – no gating needed
A HEAD request against https://mirror.openshift.com/pub/openshift-v4/s390x/clients/ocp/stable/openshift-client-linux.tar.gz returns HTTP 200, confirming the s390x archive exists. The current curl … $(uname -m) … invocation works correctly on s390x, so no additional TARGETARCH check or RPM-based fallback is required.

Likely an incorrect or invalid review comment.
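
The mirror check can be reproduced with a plain HEAD request, e.g.:

curl -sI https://mirror.openshift.com/pub/openshift-v4/s390x/clients/ocp/stable/openshift-client-linux.tar.gz | head -n 1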

Comment on lines +25 to +37
# Install Rust and set environment variables (s390x only)
RUN if [ "$TARGETARCH" = "s390x" ]; then \
mkdir -p /opt/.cargo && \
export HOME=/root && \
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs -o rustup-init.sh && \
chmod +x rustup-init.sh && \
CARGO_HOME=/opt/.cargo ./rustup-init.sh -y --no-modify-path && \
rm -f rustup-init.sh && \
chown -R 1001:0 /opt/.cargo && \
echo 'export PATH=/opt/.cargo/bin:$PATH' >> /etc/profile.d/cargo.sh && \
echo 'export CARGO_HOME=/opt/.cargo' >> /etc/profile.d/cargo.sh && \
echo 'export GRPC_PYTHON_BUILD_SYSTEM_OPENSSL=1' >> /etc/profile.d/cargo.sh; \
fi

🛠️ Refactor suggestion

Execute remote scripts only with explicit checksums
rustup-init.sh is downloaded and executed without verifying a checksum or signature. This is a classic supply-chain attack vector and violates the project’s own #1241 guidance on binary verification.

-    curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs -o rustup-init.sh && \
+    curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs -o rustup-init.sh && \
+    echo "<SHA256  expected>  rustup-init.sh" | sha256sum -c - && \

Consider pinning to the official static install script hash (updated per release) or installing the rust-toolset RPM once it appears for s390x.

🧰 Tools
🪛 GitHub Actions: Code static analysis

[warning] 26-26: Hadolint SC2016 info: Expressions don't expand in single quotes, use double quotes for that.

🤖 Prompt for AI Agents
In runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu around lines 25 to 37,
the rustup-init.sh script is downloaded and executed without verifying its
checksum, which is a security risk. To fix this, add a step to download the
official checksum for the rustup-init.sh script and verify it matches the
downloaded file before execution. Alternatively, consider installing Rust via a
trusted package like the rust-toolset RPM for s390x once available, to avoid
executing remote scripts directly.
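
A fuller sketch of the verification step this suggestion asks for; <expected-sha256> is a placeholder that would have to be copied from the rustup release actually pinned:

RUN if [ "$TARGETARCH" = "s390x" ]; then \
        curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs -o rustup-init.sh && \
        # <expected-sha256> is a placeholder for the published hash of the pinned release
        echo "<expected-sha256>  rustup-init.sh" | sha256sum -c - && \
        chmod +x rustup-init.sh && \
        CARGO_HOME=/opt/.cargo ./rustup-init.sh -y --no-modify-path && \
        rm -f rustup-init.sh; \
    fi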

Comment on lines +69 to +109
RUN --mount=type=cache,target=/root/.cache/pip \
--mount=type=cache,target=/root/.cache/dnf \
if [ "$TARGETARCH" = "s390x" ]; then \
echo "Building pyarrow wheel for s390x..." && \
dnf install -y cmake make gcc-c++ pybind11-devel wget && \
dnf clean all && \
git clone --depth 1 https://github.com/apache/arrow.git && \
cd arrow/cpp && \
mkdir release && cd release && \
cmake -DCMAKE_BUILD_TYPE=Release \
-DCMAKE_INSTALL_PREFIX=/usr/local \
-DARROW_PYTHON=ON \
-DARROW_PARQUET=ON \
-DARROW_ORC=ON \
-DARROW_FILESYSTEM=ON \
-DARROW_JSON=ON \
-DARROW_CSV=ON \
-DARROW_DATASET=ON \
-DARROW_DEPENDENCY_SOURCE=BUNDLED \
-DCMAKE_CXX_FLAGS="-O3 -march=z14 -mtune=z14" \
-DCMAKE_C_FLAGS="-O3 -march=z14 -mtune=z14" \
.. && \
make -j$(nproc) && \
make install && \
cd ../../python && \
pip install --no-cache-dir -U pip wheel setuptools && \
pip install --no-cache-dir -r requirements-build.txt && \
export PYARROW_PARALLEL=$(nproc) && \
export ARROW_BUILD_TYPE=release && \
CFLAGS="-O3 -march=z14 -mtune=z14" \
CXXFLAGS="-O3 -march=z14 -mtune=z14" \
python setup.py build_ext --build-type=release --bundle-arrow-cpp bdist_wheel && \
mkdir -p /tmp/wheels && \
cp dist/pyarrow-*.whl /tmp/wheels/ && \
ls -la /tmp/wheels/; \
else \
echo "Not s390x, skipping wheel build" && mkdir -p /tmp/wheels; \
fi

🛠️ Refactor suggestion

Unpinned Arrow git clone hurts reproducibility
git clone --depth 1 https://github.com/apache/arrow.git fetches whatever HEAD happens to be at build time, making builds non-reproducible and brittle.

-        git clone --depth 1 https://github.com/apache/arrow.git && \
+        ARROW_TAG=apache-arrow-16.1.0 && \
+        git clone --branch ${ARROW_TAG} --depth 1 https://github.com/apache/arrow.git && \

Pin to a known tag/revision and add a comment when updating to a newer release.

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
RUN --mount=type=cache,target=/root/.cache/pip \
--mount=type=cache,target=/root/.cache/dnf \
if [ "$TARGETARCH" = "s390x" ]; then \
echo "Building pyarrow wheel for s390x..." && \
dnf install -y cmake make gcc-c++ pybind11-devel wget && \
dnf clean all && \
ARROW_TAG=apache-arrow-16.1.0 && \
git clone --branch ${ARROW_TAG} --depth 1 https://github.com/apache/arrow.git && \
cd arrow/cpp && \
mkdir release && cd release && \
cmake -DCMAKE_BUILD_TYPE=Release \
-DCMAKE_INSTALL_PREFIX=/usr/local \
-DARROW_PYTHON=ON \
-DARROW_PARQUET=ON \
-DARROW_ORC=ON \
-DARROW_FILESYSTEM=ON \
-DARROW_JSON=ON \
-DARROW_CSV=ON \
-DARROW_DATASET=ON \
-DARROW_DEPENDENCY_SOURCE=BUNDLED \
-DCMAKE_CXX_FLAGS="-O3 -march=z14 -mtune=z14" \
-DCMAKE_C_FLAGS="-O3 -march=z14 -mtune=z14" \
.. && \
make -j$(nproc) && \
make install && \
cd ../../python && \
pip install --no-cache-dir -U pip wheel setuptools && \
pip install --no-cache-dir -r requirements-build.txt && \
export PYARROW_PARALLEL=$(nproc) && \
export ARROW_BUILD_TYPE=release && \
CFLAGS="-O3 -march=z14 -mtune=z14" \
CXXFLAGS="-O3 -march=z14 -mtune=z14" \
python setup.py build_ext --build-type=release --bundle-arrow-cpp bdist_wheel && \
mkdir -p /tmp/wheels && \
cp dist/pyarrow-*.whl /tmp/wheels/ && \
ls -la /tmp/wheels/; \
else \
echo "Not s390x, skipping wheel build" && mkdir -p /tmp/wheels; \
fi
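
A quick smoke test right after staging the wheel can catch a broken build before it propagates to later stages (a sketch; assumes the builder's Python environment can install the wheel):

RUN if [ "$TARGETARCH" = "s390x" ]; then \
        pip install --no-cache-dir /tmp/wheels/pyarrow-*.whl && \
        python -c "import pyarrow; print('pyarrow', pyarrow.__version__)"; \
    fi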
🤖 Prompt for AI Agents
In runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu around lines 69 to 107, the git clone command for Apache Arrow uses the default branch HEAD, which makes builds non-reproducible and brittle. To fix this, pin the clone to a specific known tag or commit hash, either by adding a checkout command after cloning or by cloning directly with a tag reference. Also add a comment recording the pinned version and instructions for updating it, to keep the build clear and reproducible.
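
Where a tag is not precise enough, the clone can be pinned to an exact commit. A shallow clone cannot reach arbitrary commits, so a full clone plus checkout is used here (a sketch; the SHA is a placeholder, not a real Arrow revision):

# Pin Arrow to an exact commit; replace the placeholder with a vetted SHA
ARROW_COMMIT=0000000000000000000000000000000000000000 && \
git clone https://github.com/apache/arrow.git && \
cd arrow && \
git checkout --detach "${ARROW_COMMIT}"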

@openshift-ci openshift-ci bot added size/xxl and removed size/l labels Jul 28, 2025
--hash=sha256:fd907ae12cd483cd83e414b12941c632a969171bf90fc937d0c9f268a31cafff \
--hash=sha256:fd914713266421b7536de2bfa8181aa8c699432b6763a0ea64195ebe28bff6a9 \
--hash=sha256:fde6c716d51c04b1c25d0b90364d0be954624a0ee9d60e23e850e8d48353d07a
cryptography==43.0.3; platform_machine != 's390x' \

Check notice

Code scanning / Trivy

openssl: RFC7250 handshakes with unauthenticated servers don't abort as expected (Low, library)

Package: cryptography
Installed Version: 43.0.3
Vulnerability: CVE-2024-12797
Severity: LOW
Fixed Version: 44.0.1
Link: CVE-2024-12797
@openshift-ci openshift-ci bot added size/xxl and removed size/xxl labels Jul 28, 2025
Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

🔭 Outside diff range comments (1)
runtimes/datascience/ubi9-python-3.11/requirements.txt (1)

453-480: Bump cryptography – CVE-2024-12797 still open

Trivy flags cryptography 43.0.3 as vulnerable (RFC7250 handshake bypass); version ≥ 44.0.1 carries the fix. The s390x marker only excludes the package on that architecture, so every other image still ships the vulnerable build.

-cryptography==43.0.3; platform_machine != 's390x' \
+cryptography==44.0.1; platform_machine != 's390x' \

Remember to regenerate the hashes via scripts/sync-requirements-txt.sh after updating the Pipfile.lock.
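
As a general sanity check, environment markers like the one above can be evaluated directly to confirm which platforms a pin applies to (a sketch, assuming python3 with the packaging library available):

python3 -c '
from packaging.markers import Marker
m = Marker("platform_machine != \"s390x\"")
print(m.evaluate({"platform_machine": "s390x"}))   # False: requirement skipped on s390x
print(m.evaluate({"platform_machine": "x86_64"}))  # True: requirement applies elsewhere
'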

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between a881362 and fac7504.

⛔ Files ignored due to path filters (14)
  • codeserver/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • jupyter/datascience/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • jupyter/minimal/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • jupyter/pytorch/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • jupyter/rocm/pytorch/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • jupyter/rocm/tensorflow/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • jupyter/tensorflow/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • jupyter/trustyai/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • runtimes/datascience/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • runtimes/minimal/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • runtimes/pytorch/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • runtimes/rocm-pytorch/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • runtimes/rocm-tensorflow/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • runtimes/tensorflow/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
📒 Files selected for processing (14)
  • codeserver/ubi9-python-3.11/requirements.txt (2 hunks)
  • jupyter/datascience/ubi9-python-3.11/requirements.txt (4 hunks)
  • jupyter/minimal/ubi9-python-3.11/requirements.txt (2 hunks)
  • jupyter/pytorch/ubi9-python-3.11/requirements.txt (4 hunks)
  • jupyter/rocm/pytorch/ubi9-python-3.11/requirements.txt (4 hunks)
  • jupyter/rocm/tensorflow/ubi9-python-3.11/requirements.txt (4 hunks)
  • jupyter/tensorflow/ubi9-python-3.11/requirements.txt (5 hunks)
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt (5 hunks)
  • runtimes/datascience/ubi9-python-3.11/requirements.txt (23 hunks)
  • runtimes/minimal/ubi9-python-3.11/requirements.txt (1 hunks)
  • runtimes/pytorch/ubi9-python-3.11/requirements.txt (3 hunks)
  • runtimes/rocm-pytorch/ubi9-python-3.11/requirements.txt (3 hunks)
  • runtimes/rocm-tensorflow/ubi9-python-3.11/requirements.txt (3 hunks)
  • runtimes/tensorflow/ubi9-python-3.11/requirements.txt (4 hunks)
✅ Files skipped from review due to trivial changes (7)
  • runtimes/minimal/ubi9-python-3.11/requirements.txt
  • jupyter/minimal/ubi9-python-3.11/requirements.txt
  • runtimes/pytorch/ubi9-python-3.11/requirements.txt
  • runtimes/rocm-pytorch/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
  • jupyter/rocm/pytorch/ubi9-python-3.11/requirements.txt
  • jupyter/rocm/tensorflow/ubi9-python-3.11/requirements.txt
🧰 Additional context used
🧠 Learnings (8)
📓 Common learnings
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:50-50
Timestamp: 2025-07-08T19:30:01.738Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in ROCm TensorFlow image during PR #1333 review. Issue #1346 was created with comprehensive problem description covering hardcoded x86_64 architecture breaking multi-arch support, detailed impact analysis, three solution options (runtime detection, BuildKit TARGETARCH integration, hybrid approach) with pros/cons analysis, comprehensive acceptance criteria covering core requirements and testing, phased implementation guidance, related files identification, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T12:31:02.033Z
Learning: jiridanek requested GitHub issue creation for MSSQL repo file hardcoding problem during PR #1320 review. Issue #1363 was created and updated with comprehensive problem description covering hardcoded x86_64 MSSQL repo files breaking multi-architecture builds across 10 affected Dockerfiles (including datascience, CUDA, ROCm, and TrustyAI variants), detailed root cause analysis, three solution options with code examples, clear acceptance criteria for all image types, implementation guidance following established multi-architecture patterns, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:38-38
Timestamp: 2025-07-08T19:30:20.513Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in TensorFlow CUDA runtime image during PR #1333 review. Issue was created with comprehensive problem description covering hardcoded NVARCH limitation, multiple solution options using TARGETARCH build argument with architecture mapping, acceptance criteria for multi-architecture builds, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T12:29:56.162Z
Learning: jiridanek requested GitHub issue creation for OpenShift client architecture mapping problem affecting 29 Dockerfiles during PR #1320 review. Issue was created with comprehensive analysis covering all affected files using $(uname -m) returning 'aarch64' but OpenShift mirror expecting 'arm64', systematic solution using BuildKit TARGETARCH mapping with proper amd64→x86_64 and arm64→arm64 conversion, detailed acceptance criteria, and implementation guidance, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py:619-626
Timestamp: 2025-07-08T19:33:14.340Z
Learning: jiridanek requested GitHub issue creation for Python 3.12 version check bug in bootstrapper.py during PR #1333 review. Issue #1348 was created with comprehensive problem description covering version check exclusion affecting all Python 3.12 runtime images, detailed impact analysis of bootstrapper execution failures, clear solution with code examples, affected files list including all 6 runtime bootstrapper copies, acceptance criteria for testing and verification, implementation notes about code duplication and upstream reporting, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. The automated issue creation failed despite appearing successful, so comprehensive issue content was provided for manual creation covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-09T08:07:24.937Z
Learning: jiridanek requested GitHub issue creation for tensorflow_rocm Python 3.12 compatibility problem during PR #1333 review. Issue #1354 was successfully created with comprehensive problem description covering missing cp312 wheels causing build failures, three solution options (upstream TensorFlow, Python 3.11 only, custom build), clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: When checking PyPI wheel availability for aarch64 architecture, the actual wheel filenames use "manylinux2014_aarch64" pattern, not "linux_aarch64". This caused incorrect analysis in PR #1396 where CodeRabbit falsely claimed CUDA companion packages lacked aarch64 wheels when searching with the wrong pattern. The correct approach is to search for "manylinux2014_aarch64" or use a more generic "aarch64" pattern when verifying wheel availability for ARM64 compatibility.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:54:28.202Z
Learning: In opendatahub-io/notebooks, Python 3.12-based images (e.g., runtime-cuda-pytorch-ubi9-python-3.12) may fail container runtime tests with "libcrypt.so.1 => not found" for MySQL SASL2 plugin libraries if `libxcrypt-compat` is missing. The solution is to install `libxcrypt-compat` in the Dockerfile.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
codeserver/ubi9-python-3.11/requirements.txt (6)

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Learnt from: jiridanek
PR: #1333
File: runtimes/minimal/ubi9-python-3.12/Pipfile:23-27
Timestamp: 2025-07-09T08:12:05.822Z
Learning: jiridanek corrected CodeRabbit's false positive about urllib3/requests version incompatibility in PR #1333 review. Investigation revealed that requests 2.32.3 actually requires urllib3<3,>=1.21.1 (not urllib3<2 as incorrectly stated), making urllib3 2.5.0 perfectly compatible. This demonstrates the importance of verifying technical assessments against actual build results and PyPI metadata before creating issues.

Learnt from: jiridanek
PR: #1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm=~=2.18.1. The solution requires updating Pipfile sources to include https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.

runtimes/datascience/ubi9-python-3.11/requirements.txt (15)

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1340 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, affected files analysis including de-vendor scripts and pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.

Learnt from: jiridanek
PR: #1218
File: jupyter/datascience/ubi9-python-3.11/Pipfile:34-36
Timestamp: 2025-06-28T14:13:27.890Z
Learning: In the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pattern: core jupyterlab package uses exact pinning (==) across all notebook images to ensure UI consistency, while JupyterLab extensions and all server components (jupyter-server, jupyter-server-proxy, jupyter-server-terminals) use compatible release (~=) pinning to allow automatic security updates and bug fixes while maintaining API compatibility.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: The jupyter-bokeh package was previously pinned to version 3.0.5 in the TrustyAI notebook image due to compatibility requirements with TrustyAI components, as indicated by the comment "Should be pinned down to this version in order to be compatible with trustyai" that was removed in this update.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: The jupyter-bokeh pinning to 3.0.5 in TrustyAI notebook image was not due to TrustyAI code compatibility issues, but because the trustyai package itself explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency, causing pip dependency resolution conflicts when trying to upgrade to jupyter-bokeh 4.x.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with extras = ["and-cuda"] can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).

Learnt from: jiridanek
PR: #1154
File: manifests/base/jupyter-pytorch-notebook-imagestream.yaml:0-0
Timestamp: 2025-06-16T11:06:33.139Z
Learning: In the opendatahub-io/notebooks repository, N-1 versions of images in manifest files (like imagestream.yaml files) should not be updated regularly. The versions of packages like codeflare-sdk in N-1 images are frozen to what was released when the image was moved from N to N-1 version. N-1 images are only updated for security vulnerabilities of packages, not for regular version bumps. This is why the version of packages in N-1 images may be quite old compared to the latest N version.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-14T15:36:29.815Z
Learning: ROCm is not supported by upstream TensorFlow, unlike CUDA. ROCm support requires the specialized tensorflow_rocm package, which creates a dependency challenge for Python 3.12 environments since tensorflow_rocm lacks Python 3.12 wheels while regular TensorFlow lacks ROCm support.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.

Learnt from: jiridanek
PR: #1197
File: runtimes/minimal/ubi9-python-3.11/requirements.txt:395-405
Timestamp: 2025-06-26T15:28:35.416Z
Learning: psutil version 7.x is compatible with UBI9, CentOS Stream 9, and RHEL 9 platforms in the opendatahub-io/notebooks repository. The upgrade from psutil 5.x to 7.x has been validated for these environments.

Learnt from: jiridanek
PR: #1463
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:15-15
Timestamp: 2025-07-22T14:31:20.315Z
Learning: PyTorch 2.6.0 wheels with CUDA 12.4 support (torch-2.6.0+cu124) are available on the official PyTorch CUDA index at https://download.pytorch.org/whl/cu124/ for Python versions 3.9, 3.10, 3.11, 3.12, and 3.13 on both Linux x86_64 and Windows AMD64 platforms.

Learnt from: jiridanek
PR: #1325
File: jupyter/datascience/ubi9-python-3.11/Pipfile:32-32
Timestamp: 2025-07-07T14:54:57.907Z
Learning: odh-elyra uses universal wheels (py3-none-any) rather than platform-specific wheels, making it compatible with all supported Python versions through a single wheel file.

jupyter/datascience/ubi9-python-3.11/requirements.txt (15)

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.

Learnt from: jiridanek
PR: #1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: The jupyter-bokeh package was previously pinned to version 3.0.5 in the TrustyAI notebook image due to compatibility requirements with TrustyAI components, as indicated by the comment "Should be pinned down to this version in order to be compatible with trustyai" that was removed in this update.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm=~=2.18.1. The solution requires updating Pipfile sources to include https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Learnt from: jiridanek
PR: #1333
File: runtimes/minimal/ubi9-python-3.12/Pipfile:23-27
Timestamp: 2025-07-09T08:12:05.822Z
Learning: jiridanek corrected CodeRabbit's false positive about urllib3/requests version incompatibility in PR #1333 review. Investigation revealed that requests 2.32.3 actually requires urllib3<3,>=1.21.1 (not urllib3<2 as incorrectly stated), making urllib3 2.5.0 perfectly compatible. This demonstrates the importance of verifying technical assessments against actual build results and PyPI metadata before creating issues.

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.

Learnt from: jiridanek
PR: #1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Learnt from: jiridanek
PR: #1154
File: manifests/base/jupyter-pytorch-notebook-imagestream.yaml:0-0
Timestamp: 2025-06-16T11:06:33.139Z
Learning: In the opendatahub-io/notebooks repository, N-1 versions of images in manifest files (like imagestream.yaml files) should not be updated regularly. The versions of packages like codeflare-sdk in N-1 images are frozen to what was released when the image was moved from N to N-1 version. N-1 images are only updated for security vulnerabilities of packages, not for regular version bumps. This is why the version of packages in N-1 images may be quite old compared to the latest N version.

runtimes/rocm-tensorflow/ubi9-python-3.11/requirements.txt (14)

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm=~=2.18.1. The solution requires updating Pipfile sources to include https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-09T08:07:24.937Z
Learning: jiridanek requested GitHub issue creation for tensorflow_rocm Python 3.12 compatibility problem during PR #1333 review. Issue #1354 was successfully created with comprehensive problem description covering missing cp312 wheels causing build failures, three solution options (upstream TensorFlow, Python 3.11 only, custom build), clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-14T15:36:29.815Z
Learning: ROCm is not supported by upstream TensorFlow, unlike CUDA. ROCm support requires the specialized tensorflow_rocm package, which creates a dependency challenge for Python 3.12 environments since tensorflow_rocm lacks Python 3.12 wheels while regular TensorFlow lacks ROCm support.

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-08T19:35:49.482Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. After an initial failed attempt, issue #1349 was successfully created with comprehensive problem description covering maintenance overhead and consistency risks from duplicate implementations across 5 Python 3.12 runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Learnt from: jiridanek
PR: #1154
File: manifests/base/jupyter-pytorch-notebook-imagestream.yaml:0-0
Timestamp: 2025-06-16T11:06:33.139Z
Learning: In the opendatahub-io/notebooks repository, N-1 versions of images in manifest files (like imagestream.yaml files) should not be updated regularly. The versions of packages like codeflare-sdk in N-1 images are frozen to what was released when the image was moved from N to N-1 version. N-1 images are only updated for security vulnerabilities of packages, not for regular version bumps. This is why the version of packages in N-1 images may be quite old compared to the latest N version.

jupyter/tensorflow/ubi9-python-3.11/requirements.txt (13)

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm=~=2.18.1. The solution requires updating Pipfile sources to include https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.

Learnt from: grdryn
PR: #1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-09T08:07:24.937Z
Learning: jiridanek requested GitHub issue creation for tensorflow_rocm Python 3.12 compatibility problem during PR #1333 review. Issue #1354 was successfully created with comprehensive problem description covering missing cp312 wheels causing build failures, three solution options (upstream TensorFlow, Python 3.11 only, custom build), clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Learnt from: jiridanek
PR: #1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.

Learnt from: jiridanek
PR: #1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review. Issue was created with comprehensive problem description covering both Python 3.11 and 3.12 versions, repository pattern analysis showing correct vs incorrect naming, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run pipenv lock inside UBI9 containers with specific platform arguments (--platform=linux/amd64 --python-version 3.12) to avoid host OS dependency conflicts when generating Pipfile.lock files.

🪛 GitHub Check: Trivy
runtimes/datascience/ubi9-python-3.11/requirements.txt

[notice] 453-453: openssl: RFC7250 handshakes with unauthenticated servers don't abort as expected
Package: cryptography
Installed Version: 43.0.3
Vulnerability CVE-2024-12797
Severity: LOW
Fixed Version: 44.0.1
Link: CVE-2024-12797

⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (19)
  • GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (rocm-jupyter-tensorflow-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.11, 3.11, linux/arm64, false) / build
  • GitHub Check: build (runtime-minimal-ubi9-python-3.11, 3.11, linux/s390x, false) / build
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-pytorch-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (codeserver-ubi9-python-3.11, 3.11, linux/arm64, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (jupyter-trustyai-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (codeserver-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (runtime-minimal-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (rocm-runtime-pytorch-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (jupyter-datascience-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.11, 3.11, linux/amd64, false) / build
🔇 Additional comments (24)
codeserver/ubi9-python-3.11/requirements.txt (3)

116-118: Patch-level bump to comm looks safe.

comm 0.2.3 is a micro-release with only bug-fixes; no backward-incompatible changes are reported upstream.
No action needed.


119-130: contourpy marker tightened to Python ≥ 3.11 – double-check cross-image consistency.

This file targets a 3.11 base image, so the stricter marker is fine here.
However, the same requirement is duplicated in many other requirements.txt / Pipfile combinations across the repo. Please make sure that:

  1. All Python 3.10 images (if any remain) still receive a compatible version (e.g. keep 1.3.2 or bump with a looser marker).
  2. The lock-file regeneration step used --python-version 3.11 to avoid accidentally excluding wheels for the 3.11 build matrix.

No code change required if those points have already been verified.


444-446: Verify compatibility of the narwhals 2.0.0 upgrade

A jump from 1.48.0 → 2.0.0 may introduce breaking changes (e.g. scan_* / collect_* renames, removed aliases), and the package now requires pandas>=2.2,<3 (we ship 2.2.3, so the dependency is met). Please:

• Run all affected notebooks and workflows that import narwhals to confirm nothing breaks.
• If any imports or helper scripts fail, temporarily pin to narwhals<2 until you’ve migrated to the new API.

Let me know if you’d like a quick compatibility test script.
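
For reference, a minimal smoke test along those lines (a sketch, assuming only that the image ships python3 with the locked narwhals and pandas; the frame contents are arbitrary):

python3 - <<'EOF'
# Wrap a pandas frame with narwhals and push a trivial query through the 2.x API.
import pandas as pd
import narwhals as nw

df = nw.from_native(pd.DataFrame({"a": [1, 2, 3]}))
out = df.select(nw.col("a") * 2).to_native()
assert list(out["a"]) == [2, 4, 6], "narwhals basic API check failed"
print("narwhals", nw.__version__, "smoke test passed")
EOF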

runtimes/datascience/ubi9-python-3.11/requirements.txt (2)

1654-1696: Verify that pyarrow==21.0.0 actually exists on PyPI

If no 21.x release is published, the pin will resolve to nothing and break non-s390x builds at install time.

Please confirm the intended version and align it with the wheel you build for s390x in the Dockerfile (apache-arrow-*-cp311-…whl).
If an older series (e.g. 16.x) is the target, update here (and the builder stage) accordingly.
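
A quick way to confirm from any machine with PyPI access, sketched with plain pip (pip index is still experimental; the platform tag below is an assumption for the x86_64 build):

pip index versions pyarrow        # lists the releases PyPI actually has
pip download pyarrow==21.0.0 --only-binary=:all: \
    --platform manylinux_2_28_x86_64 --python-version 3.11 -d /tmp/wheels
# A "no matching distribution" error means the ~=21.0.0 pin cannot resolve.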


2351-2353: Excluding six on s390x may cause hidden reinstalls

Several runtime deps (e.g. boto3/urllib3 stacks) still list six>=1.16.
Excluding it here (six==1.17.0; platform_machine != 's390x') means:

  1. Pip will silently install six anyway as a transitive dependency, defeating the marker.
  2. A different version may be pulled, diverging from the locked hash set.

Either keep six un-gated or add an s390x wheel to the hash list so the lockfile remains deterministic across arches.
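
For what it's worth, six ships a universal py2.py3-none-any wheel, so keeping it un-gated should cost nothing on s390x. A sketch of the check, assuming network access:

pip download six==1.17.0 --only-binary=:all: \
    --platform manylinux2014_s390x --python-version 3.11 -d /tmp/wheels
# Succeeds via six-1.17.0-py2.py3-none-any.whl, which is platform-independent,
# so the same locked hash stays valid on every architecture.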

jupyter/datascience/ubi9-python-3.11/requirements.txt (5)

427-429: comm bump looks fine

comm==0.2.3 is compatible with the Jupyter stack (ipykernel ≥ 6.25) and has no breaking API changes.
Nothing else to flag.


430-436: Verify the tighter Python-version gate on contourpy

The marker changed from >=3.10 to >=3.11.
That is fine for this 3.11 image, but it will block re-use of the same requirements file for 3.10 environments (e.g. downstream testing or older N-1 tags).

Please double-check that no 3.10 consumers still ingest this file.


800-802: GitPython patch upgrade – no concerns

3.1.45 is a maintenance release; no API changes affecting notebooks were introduced.
Looks good.


1440-1442: Major‐version jump to narwhals 2.0.0 – run functional tests

narwhals 2.x introduces a rewritten execution engine and type-checking changes. Projects relying on the legacy 1.x API (polars, pandas-backend adapters, etc.) may break.

Ensure the datascience image’s notebook extensions that import narwhals (e.g. polars profiling helpers) still pass their end-to-end tests.


856-865: Monitor s390x build time for grpcio==1.74.0

grpcio v1.74.0 no longer publishes s390x wheels on PyPI, so installing it in the final image will fall back to building the C extension:

  • Adds ~5–6 minutes per layer.
  • Requires a C toolchain (gcc, make, libstdc++-devel) which isn’t present in the final stage by default.

If you keep the toolchain only in the builder stage, ensure you:

  • Install gcc, make, and libstdc++-devel in the builder stage.
  • COPY the resulting .whl from the builder into the final stage before running pip install -r requirements.txt.

Alternatively, pin to grpcio==1.73.* for s390x or vendor the built wheel.
File: jupyter/datascience/ubi9-python-3.11/requirements.txt (lines 856–865)
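
A rough sketch of that flow as the shell that would run in each stage (package names and paths are illustrative assumptions, not the PR's actual Dockerfile):

# builder stage (s390x): compile the wheel once, with the toolchain present
dnf install -y gcc gcc-c++ make libstdc++-devel
pip wheel grpcio==1.74.0 --no-deps --wheel-dir /tmp/wheels

# final stage: after COPYing /tmp/wheels from the builder, install it first;
# the later pip install -r requirements.txt then sees grpcio already satisfied.
pip install --no-index --find-links /tmp/wheels grpcio==1.74.0
pip install -r requirements.txt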

jupyter/pytorch/ubi9-python-3.11/requirements.txt (4)

431-433: Minor bump to comm looks safe
Only a patch-level change (0.2.2 → 0.2.3). No downstream packages in this image are known to constrain the comm API, so no action needed.


434-468: Confirm the tighter Python constraint for contourpy (>=3.11)
The new wheel requires Python ≥ 3.11. The image is indeed Python 3.11, so we’re fine, but be aware this breaks any optional reuse of the file for 3.10 images.


804-806: gitpython 3.1.45 – security-fix release, good update
Includes CVE-2024-32032 fix; 👍 to upgrade.


860-911: grpcio 1.74.0 – verify binary wheel availability on s390x
grpcio often lags on non-x86 arches. Please ensure the CI build for s390x (TARGETARCH=s390x) still downloads a wheel; otherwise a lengthy from-source build will occur.

runtimes/tensorflow/ubi9-python-3.11/requirements.txt (5)

383-385: comm upgrade LGTM

Patch-level bump from 0.2.2 → 0.2.3 is safe and keeps the Jupyter stack in sync with upstream.


386-390: New contourpy==1.3.3 now requires Python ≥ 3.11 – double-check other images

1.3.3 drops support for Python 3.10.
This requirements file targets Python 3.11 so it’s fine here, but please ensure no shared-lock is reused by 3.10-based images (e.g. minimal / datascience 3.10) or CI lock-generation jobs that still emulate 3.10.


714-718: Verify grpcio==1.74.0 wheel availability for all build platforms

The new version fixes several CVEs, but wheels for some secondary arches (ppc64le, s390x) were historically lagging.
Please confirm BuildKit won’t fall back to source builds (adds +5 min & compiler deps).


1287-1289: narwhals major upgrade (1.x → 2.0.0) – check API breaking changes

Narwhals 2 removes the legacy pandas interface and renames SeriesManager.
If notebooks / pipelines import these symbols, they will break at runtime.


1450-1455: optree minor bump looks safe

0.17.0 only extends typing support; no breaking changes noted.

runtimes/rocm-tensorflow/ubi9-python-3.11/requirements.txt (4)

383-385: Minor version bump looks good

comm 0.2.2 → 0.2.3 is a pure bug-fix release; no compatibility concerns.


386-459: New python_version >= '3.11' gate – double-check consumer images

contourpy 1.3.3 now blocks Python < 3.11.
The Dockerfile for this image is ubi9-python-3.11, so we’re safe, but any future 3.10 build that re-uses this file will fail dependency resolution. Just keep that in mind.


1280-1282: narwhals 2.0.0 upgrade is safe—no downstream references found

A full repo-wide search for any import or usage of narwhals (e.g. import narwhals, from narwhals, narwhals.*) returned no hits outside of the requirements file. There are therefore no custom wrappers, notebooks, or other modules depending on the old API that need attention. Feel free to proceed with this bump.


717-768: Confirm grpcio==1.74.0 s390x wheel availability
I attempted to query the PyPI JSON API in our sandbox but hit SSL certificate errors. Please manually verify that grpcio==1.74.0 ships a manylinux2014_s390x wheel on PyPI (for example via pip index versions grpcio or by inspecting https://pypi.org/project/grpcio/1.74.0/#files).
Without a pre-built wheel, the build will fall back to compiling grpcio from source on s390x, adding ~12–15 min to image build time and often requiring GRPC_PYTHON_BUILD_WITH_CYTHON=1 plus custom LDFLAGS.
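
For any host with working certificates, the JSON-API check spelled out (a sketch; the inline python filter just greps the published filenames):

curl -s https://pypi.org/pypi/grpcio/1.74.0/json -o /tmp/grpcio.json
python3 -c "import json; files = json.load(open('/tmp/grpcio.json'))['urls']; \
print([f['filename'] for f in files if 's390x' in f['filename']] or 'no s390x wheel published')"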

jupyter/tensorflow/ubi9-python-3.11/requirements.txt (1)

1703-1788: optree 0.17 wheels not published for some alt-arches – confirm build still succeeds

optree distributes only x86_64/arm64 wheels. Other arches (e.g. s390x, ppc64le, aarch64-musl) build from source and require a Rust tool-chain.
The TensorFlow Jupyter 3.11 image currently targets x86_64/arm64, so you’re probably safe, but:

  1. If you later extend this image to s390x (like the datascience runtimes in this PR), add a pip install guard or provide a Rust tool-chain (see the sketch after this list).
  2. Monitor build time; source compilation of optree adds ~2-3 min.
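
A sketch of such a guard as the shell inside a Dockerfile RUN (assumes ARG TARGETARCH is declared in the stage; the toolchain list follows the note above and the PR's s390x pattern rather than a tested recipe):

if [ "$TARGETARCH" = "s390x" ]; then
    # No prebuilt optree wheel on this arch: provide a toolchain before
    # pip falls back to a from-source build.
    dnf install -y gcc gcc-c++ cmake rust cargo
fi
pip install optree==0.17.0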

Comment on lines 1450 to 1452
narwhals==2.0.0; python_version >= '3.9' \
--hash=sha256:9c9fe8a969b090d783edbcb3b58e1d0d15f5100fdf85b53f5e76d38f4ce7f19a \
--hash=sha256:d967bea54dfb6cd787abf3865ab4d72b8259d8f798c1c12c4eb693d5e9cebb24

🛠️ Refactor suggestion

Major‐version bump of narwhals – check for breaking API changes
narwhals 2.x drops several deprecated symbols. Nothing in the notebooks repository directly imports it, but transitive users (e.g., Polars) might. If runtime tests aren’t exercising those paths, add at least one smoke test or pin back to 1.x until validated.

🤖 Prompt for AI Agents
In jupyter/pytorch/ubi9-python-3.11/requirements.txt at lines 1450 to 1452, the
narwhals package was upgraded to major version 2.0.0, which may introduce
breaking API changes. Verify if any transitive dependencies like Polars rely on
narwhals and ensure runtime tests cover those code paths. If no tests exist, add
a smoke test to validate compatibility or revert the narwhals version to 1.x
until confirmed safe.

Comment on lines 1525 to 1527
narwhals==2.0.0; python_version >= '3.9' \
--hash=sha256:9c9fe8a969b090d783edbcb3b58e1d0d15f5100fdf85b53f5e76d38f4ce7f19a \
--hash=sha256:d967bea54dfb6cd787abf3865ab4d72b8259d8f798c1c12c4eb693d5e9cebb24

🛠️ Refactor suggestion

Major‐version jump for narwhals introduces API-breaking changes – verify downstream usage

narwhals 2.x is a hard break from the 1.x API (e.g. new Frame/Series naming, dropped pandas interop helpers).
Search any local notebooks or libraries that import narwhals and run their unit tests; silent runtime errors are easy to miss in data-science images.
If you only need the polars backend, consider pinning to narwhals~=1.48 until you refactor.

🤖 Prompt for AI Agents
In jupyter/tensorflow/ubi9-python-3.11/requirements.txt around lines 1525 to
1527, the narwhals package was upgraded to version 2.0.0, which introduces
breaking API changes. Review all code, notebooks, and libraries that use
narwhals to ensure compatibility with the new API, running their unit tests to
catch any silent errors. If immediate refactoring is not feasible, revert the
narwhals version to approximately 1.48 by pinning narwhals~=1.48 in the
requirements.txt to maintain compatibility.

@openshift-ci openshift-ci bot added size/xxl and removed size/xxl labels Jul 28, 2025
--hash=sha256:b7301b89040075c30e5768810bc96a8e8d78085b47d8be6e4c3f5a0b4ed478a0 \
--hash=sha256:b7448cb5a725bb1e35ce88771b86fba35ef418952474492cf7c764059933ff8b \
--hash=sha256:ca0fdcd73925568ca027e0b17ab07aad764be4706d0a925b89227e447d9737b7 \
--hash=sha256:ca658cd1a680a5c9ea96dc61cdbae1e85c8f25849843aa799dfd3cb370ad4fbe \

Check notice

Code scanning / Trivy

openssl: RFC7250 handshakes with unauthenticated servers don't abort as expected Low library

Package: cryptography
Installed Version: 43.0.3
Vulnerability CVE-2024-12797
Severity: LOW
Fixed Version: 44.0.1
Link: CVE-2024-12797
@openshift-ci openshift-ci bot added size/l and removed size/l labels Aug 6, 2025

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

♻️ Duplicate comments (8)
runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (4)

139-146: Image ends with USER 0; switch back to 1001
Best practice (Hadolint DL3002) and OpenShift expectations require the final layer to run as a non-root UID.

     fix-permissions /opt/app-root -P
+
+# Drop privileges
+USER 1001

Also applies to: 160-163


14-24: Remove --nogpgcheck; keep RPM signature verification intact
The blanket --nogpgcheck disables GPG validation for every package, weakening the supply-chain guarantees Checkov flags (CKV2_DOCKER_9). All packages come from signed Red Hat repos, so this flag is unnecessary.

-        dnf install -y --nogpgcheck --allowerasing --nobest $PACKAGES && \
+        dnf install -y --allowerasing --nobest $PACKAGES && \

80-83: Pin Apache Arrow revision for reproducible builds
git clone --depth 1 fetches whatever HEAD currently points to, so rebuilds aren’t deterministic.

-        git clone --depth 1 https://github.com/apache/arrow.git && \
+        ARROW_TAG=apache-arrow-16.1.0 && \
+        git clone --branch ${ARROW_TAG} --depth 1 https://github.com/apache/arrow.git && \

73-78: git missing from build deps – clone will fail
git clone is invoked two lines below, but git isn’t installed. The s390x wheel build will break immediately.

-        dnf install -y cmake make gcc-c++ pybind11-devel wget && \
+        dnf install -y cmake make gcc-c++ git pybind11-devel wget && \
runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu (4)

14-23: Duplicate of earlier feedback: --nogpgcheck should be removed to keep RPM signature verification.


74-78: Duplicate of earlier feedback: add git to build dependencies to avoid clone failure.


76-83: Duplicate of earlier feedback: pin Apache Arrow checkout to a tag/commit for reproducibility.


128-143: Duplicate of earlier feedback: final image still runs as root—switch back to USER 1001.

Also applies to: 158-160

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between b319841 and be3f4b3.

📒 Files selected for processing (5)
  • ci/cached-builds/gen_gha_matrix_jobs.py (1 hunks)
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu (3 hunks)
  • runtimes/datascience/ubi9-python-3.11/Pipfile (1 hunks)
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (3 hunks)
  • runtimes/datascience/ubi9-python-3.12/Pipfile (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (3)
  • ci/cached-builds/gen_gha_matrix_jobs.py
  • runtimes/datascience/ubi9-python-3.12/Pipfile
  • runtimes/datascience/ubi9-python-3.11/Pipfile
🧰 Additional context used
🧠 Learnings (50)
📓 Common learnings
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-08-05T17:24:08.616Z
Learning: jiridanek requested PR review for #1521 covering s390x architecture support improvements, demonstrating continued focus on systematic multi-architecture compatibility enhancements in the opendatahub-io/notebooks repository through clean implementation with centralized configuration, proper CI integration, and architecture-aware testing patterns.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:50-50
Timestamp: 2025-07-08T19:30:01.738Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in ROCm TensorFlow image during PR #1333 review. Issue #1346 was created with comprehensive problem description covering hardcoded x86_64 architecture breaking multi-arch support, detailed impact analysis, three solution options (runtime detection, BuildKit TARGETARCH integration, hybrid approach) with pros/cons analysis, comprehensive acceptance criteria covering core requirements and testing, phased implementation guidance, related files identification, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T12:31:02.033Z
Learning: jiridanek requested GitHub issue creation for MSSQL repo file hardcoding problem during PR #1320 review. Issue #1363 was created and updated with comprehensive problem description covering hardcoded x86_64 MSSQL repo files breaking multi-architecture builds across 10 affected Dockerfiles (including datascience, CUDA, ROCm, and TrustyAI variants), detailed root cause analysis, three solution options with code examples, clear acceptance criteria for all image types, implementation guidance following established multi-architecture patterns, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:38-38
Timestamp: 2025-07-08T19:30:20.513Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in TensorFlow CUDA runtime image during PR #1333 review. Issue was created with comprehensive problem description covering hardcoded NVARCH limitation, multiple solution options using TARGETARCH build argument with architecture mapping, acceptance criteria for multi-architecture builds, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T12:29:56.162Z
Learning: jiridanek requested GitHub issue creation for OpenShift client architecture mapping problem affecting 29 Dockerfiles during PR #1320 review. Issue was created with comprehensive analysis covering all affected files using $(uname -m) returning 'aarch64' but OpenShift mirror expecting 'arm64', systematic solution using BuildKit TARGETARCH mapping with proper amd64→x86_64 and arm64→arm64 conversion, detailed acceptance criteria, and implementation guidance, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-09T08:07:24.937Z
Learning: jiridanek requested GitHub issue creation for tensorflow_rocm Python 3.12 compatibility problem during PR #1333 review. Issue #1354 was successfully created with comprehensive problem description covering missing cp312 wheels causing build failures, three solution options (upstream TensorFlow, Python 3.11 only, custom build), clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. The automated issue creation failed despite appearing successful, so comprehensive issue content was provided for manual creation covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:39:23.433Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. Issue #1373 was successfully created on the second attempt with comprehensive problem description covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: When checking PyPI wheel availability for aarch64 architecture, the actual wheel filenames use "manylinux2014_aarch64" pattern, not "linux_aarch64". This caused incorrect analysis in PR #1396 where CodeRabbit falsely claimed CUDA companion packages lacked aarch64 wheels when searching with the wrong pattern. The correct approach is to search for "manylinux2014_aarch64" or use a more generic "aarch64" pattern when verifying wheel availability for ARM64 compatibility.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
📚 Learning: in the opendatahub-io/notebooks repository, when using multi-architecture dockerfiles with buildkit,...
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: atheo89 requested github issue creation for multi-architecture dockerfile improvements during pr #12...
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: in the opendatahub-io/notebooks repository, modern docker with buildkit automatically provides build...
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: jupyter/minimal/ubi9-python-3.11/Dockerfile.cuda:29-38
Timestamp: 2025-07-04T17:07:52.656Z
Learning: In the opendatahub-io/notebooks repository, modern Docker with BuildKit automatically provides build arguments like TARGETARCH in the global scope for FROM instructions, but these arguments must be explicitly declared with ARG statements inside build stages where they will be used. The ARG declaration should be placed within the stage that uses it, not moved to the global scope, as this is the correct pattern for modern Docker/Podman/Buildah environments.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for openshift client architecture mapping problem affectin...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T12:29:56.162Z
Learning: jiridanek requested GitHub issue creation for OpenShift client architecture mapping problem affecting 29 Dockerfiles during PR #1320 review. Issue was created with comprehensive analysis covering all affected files using $(uname -m) returning 'aarch64' but OpenShift mirror expecting 'arm64', systematic solution using BuildKit TARGETARCH mapping with proper amd64→x86_64 and arm64→arm64 conversion, detailed acceptance criteria, and implementation guidance, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested pr review for #1521 covering s390x architecture support improvements, demonstrat...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-08-05T17:24:08.616Z
Learning: jiridanek requested PR review for #1521 covering s390x architecture support improvements, demonstrating continued focus on systematic multi-architecture compatibility enhancements in the opendatahub-io/notebooks repository through clean implementation with centralized configuration, proper CI integration, and architecture-aware testing patterns.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for multi-architecture support in tensorflow cuda runtime ...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:38-38
Timestamp: 2025-07-08T19:30:20.513Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in TensorFlow CUDA runtime image during PR #1333 review. Issue was created with comprehensive problem description covering hardcoded NVARCH limitation, multiple solution options using TARGETARCH build argument with architecture mapping, acceptance criteria for multi-architecture builds, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for improving architecture detection in buildinputs tool d...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. The automated issue creation failed despite appearing successful, so comprehensive issue content was provided for manual creation covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for multi-architecture support in rocm tensorflow image du...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:50-50
Timestamp: 2025-07-08T19:30:01.738Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in ROCm TensorFlow image during PR #1333 review. Issue #1346 was created with comprehensive problem description covering hardcoded x86_64 architecture breaking multi-arch support, detailed impact analysis, three solution options (runtime detection, BuildKit TARGETARCH integration, hybrid approach) with pros/cons analysis, comprehensive acceptance criteria covering core requirements and testing, phased implementation guidance, related files identification, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for improving architecture detection in buildinputs tool d...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. Issue #1364 was created with comprehensive problem description covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for improving architecture detection in buildinputs tool d...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. Issue was created with comprehensive problem description covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: in the opendatahub-io/notebooks repository, when adapting nvidia cuda dockerfiles, the project inten...
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatah...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
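That locking workflow can be sketched roughly as follows; the image tag, mount path, and SELinux :z flag are illustrative assumptions, not values taken from this PR:

# Rough sketch: resolve Pipfile.lock inside a UBI9 container so the host OS
# cannot influence wheel selection (image tag and paths are examples)
podman run --rm --platform=linux/amd64 \
  -v "$PWD/runtimes/datascience/ubi9-python-3.12:/work:z" -w /work \
  registry.access.redhat.com/ubi9/python-312:latest \
  bash -c "pip install pipenv && pipenv lock"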
📚 Learning: in the opendatahub-io/notebooks repository, mixing centos packages with ubi base images is bad pract...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1396
File: jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda:192-195
Timestamp: 2025-07-18T19:01:39.811Z
Learning: In the opendatahub-io/notebooks repository, mixing CentOS packages with UBI base images is bad practice that removes supportability and creates "Frankenstein" images according to Red Hat guidance. However, using EPEL packages is acceptable, though it may require extra work with AIPCC for internal Red Hat builds. The official reference is at developers.redhat.com/articles/ubi-faq.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for pythonpath configuration investigation in runtime imag...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/utils/pip.conf:3-4
Timestamp: 2025-07-08T19:21:11.512Z
Learning: jiridanek requested GitHub issue creation for PYTHONPATH configuration investigation in runtime images during PR #1333 review. Issue #1339 was created with comprehensive problem description covering pip.conf target directory configuration, missing PYTHONPATH exports, potential runtime import failures, multiple investigation areas (PYTHONPATH auditing, pip configuration consistency, runtime testing), solution options with implementation guidance, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: the python 3.11 infrastructure for rocm tensorflow images in opendatahub-io/notebooks is already pro...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek prefers architectural solutions that eliminate problems entirely rather than just fixing i...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:138-140
Timestamp: 2025-08-01T16:07:58.701Z
Learning: jiridanek prefers architectural solutions that eliminate problems entirely rather than just fixing immediate technical issues. When presented with a pipeline safety concern about micropipenv requirements generation, he suggested removing micropipenv from the build process altogether by using pre-committed requirements.txt files, demonstrating preference for simplification and deterministic builds over complex workarounds.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: pipenv automatically detects non-interactive terminal environments (like github actions ci) and disa...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-28T13:57:48.243Z
Learning: Pipenv automatically detects non-interactive terminal environments (like GitHub Actions CI) and disables spinner animations without requiring PIPENV_NOSPIN=1. It checks for TTY connection and common CI environment variables to determine this behavior.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: in the opendatahub-io/notebooks repository, issue #1241 "security: add checksum verification for dow...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
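A hedged sketch of that download-and-verify approach (the mirror URL layout and the sha256sum.txt listing are assumptions about mirror.openshift.com; a checksum pinned and reviewed in the repo is stronger than one fetched from the same origin as the binary):

# Sketch: fetch oc and check it against the mirror's published sha256sum.txt;
# the build fails if the checksum does not match
RUN curl -fsSLO "https://mirror.openshift.com/pub/openshift-v4/$(uname -m)/clients/ocp/stable/openshift-client-linux.tar.gz" && \
    curl -fsSL "https://mirror.openshift.com/pub/openshift-v4/$(uname -m)/clients/ocp/stable/sha256sum.txt" | \
    grep ' openshift-client-linux.tar.gz$' | sha256sum -c - && \
    tar -xzf openshift-client-linux.tar.gz -C /usr/local/bin oc && \
    rm -f openshift-client-linux.tar.gz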
📚 Learning: jiridanek requested github issue creation for shell script variable quoting security concern in code...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-nginx.sh:18-23
Timestamp: 2025-07-03T16:17:23.065Z
Learning: jiridanek requested GitHub issue creation for shell script variable quoting security concern in codeserver/ubi9-python-3.12/run-nginx.sh during PR #1269 review. The issue covers unquoted variables NB_PREFIX, NOTEBOOK_ARGS, and BASE_URL that pose security risks including command injection, word-splitting vulnerabilities, and globbing issues. A comprehensive issue was created with detailed problem description, security concerns, solution with code examples, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek directs base image pinning security concerns to existing comprehensive issue #1242 "improv...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:4-4
Timestamp: 2025-07-04T05:49:10.314Z
Learning: jiridanek directs base image pinning security concerns to existing comprehensive issue #1242 "Improve Docker FROM image versioning by avoiding :latest tags" rather than addressing them in individual PRs, continuing the established pattern of systematic security and quality tracking in opendatahub-io/notebooks.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for docker build robustness problem with rm glob patterns ...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:132-134
Timestamp: 2025-07-08T19:07:58.135Z
Learning: jiridanek requested GitHub issue creation for Docker build robustness problem with rm glob patterns during PR #1306 review. Issue was created covering 11 affected Dockerfiles with fragile rm commands that fail on empty globs, comprehensive problem description, multiple solution options, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: grdryn corrected coderabbit's false assessment about openshift mirror server architecture support du...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T14:02:39.880Z
Learning: grdryn corrected CodeRabbit's false assessment about OpenShift mirror server architecture support during PR #1320 review. The OpenShift mirror server (https://mirror.openshift.com/pub/openshift-v4/) actually supports both aarch64 and arm64 directory structures, making the current code using $(uname -m) work correctly across architectures without modification. This demonstrates the importance of verifying external service capabilities before flagging issues.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: runtime deployment tests in opendatahub-io/notebooks may show podsecurity warnings about allowprivil...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for mssql repo file hardcoding problem during pr #1320 rev...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T12:31:02.033Z
Learning: jiridanek requested GitHub issue creation for MSSQL repo file hardcoding problem during PR #1320 review. Issue #1363 was created and updated with comprehensive problem description covering hardcoded x86_64 MSSQL repo files breaking multi-architecture builds across 10 affected Dockerfiles (including datascience, CUDA, ROCm, and TrustyAI variants), detailed root cause analysis, three solution options with code examples, clear acceptance criteria for all image types, implementation guidance following established multi-architecture patterns, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for improving architecture detection in buildinputs tool d...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:39:23.433Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. Issue #1373 was successfully created on the second attempt with comprehensive problem description covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for oc client installation permission problem in pytorch c...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda:17-25
Timestamp: 2025-07-09T08:07:30.628Z
Learning: jiridanek requested GitHub issue creation for oc client installation permission problem in PyTorch CUDA runtime Dockerfile during PR #1333 review. Issue #1356 was created with comprehensive problem description covering USER 1001 permission conflicts with root-owned /opt/app-root/bin directory, detailed impact analysis of build failures and non-executable binaries, current problematic code snippet, complete solution with user switching approach, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for passwd template validation in codeserver/ubi9-python-3...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/passwd.template:14-14
Timestamp: 2025-07-03T16:28:55.757Z
Learning: jiridanek requested GitHub issue creation for passwd template validation in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/passwd.template during PR #1269 review. Issue #1318 was created with comprehensive analysis of nss_wrapper approach advantages over OpenShift's native user management, including consistent user identity, application compatibility, and debugging benefits. The current approach works with OpenShift's random UID assignment by dynamically setting USER_ID and GROUP_ID to actual runtime values while providing enhanced user attributes through nss_wrapper.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for shell script error handling improvements in codeserver...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user:4-9
Timestamp: 2025-07-03T16:05:35.448Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user during PR #1269 review. A comprehensive issue was created covering silent failures, unquoted variable expansions, missing template validation, and strict mode implementation with detailed problem descriptions, phased acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: issue #1212 in opendatahub-io/notebooks demonstrates that missing securitycontext configuration (all...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for version pinning improvement of micropipenv and uv pack...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:17-18
Timestamp: 2025-08-01T16:10:21.049Z
Learning: jiridanek requested GitHub issue creation for version pinning improvement of micropipenv and uv packages during PR #968 review. Issue #1595 was created with comprehensive problem description covering build reproducibility risks from unpinned package versions across 18 affected Dockerfiles, two solution options (direct pinning vs build arguments), implementation approach with testing requirements, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for misleading cuda prefix in trustyai image tags during p...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for misleading cuda prefix in trustyai image tags during p...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review. Issue was created with comprehensive problem description covering both Python 3.11 and 3.12 versions, repository pattern analysis showing correct vs incorrect naming, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for mssql repo file hardcoding problem during pr #1320 rev...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T12:31:02.033Z
Learning: jiridanek requested GitHub issue creation for MSSQL repo file hardcoding problem during PR #1320 review. Issue #1357 was created with comprehensive problem description covering hardcoded x86_64 MSSQL repo files breaking multi-architecture builds across 6 affected Dockerfiles, detailed root cause analysis, three solution options with code examples, clear acceptance criteria, implementation guidance following established multi-architecture patterns, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: in opendatahub-io/notebooks, python 3.12-based images (e.g., runtime-cuda-pytorch-ubi9-python-3.12) ...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:54:28.202Z
Learning: In opendatahub-io/notebooks, Python 3.12-based images (e.g., runtime-cuda-pytorch-ubi9-python-3.12) may fail container runtime tests with "libcrypt.so.1 => not found" for MySQL SASL2 plugin libraries if `libxcrypt-compat` is missing. The solution is to install `libxcrypt-compat` in the Dockerfile.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
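The fix itself is a single package; a minimal sketch for a UBI9 Dockerfile layer:

# libxcrypt-compat provides libcrypt.so.1, which the MySQL SASL2 plugin
# libraries link against on UBI9
RUN dnf install -y libxcrypt-compat && \
    dnf clean all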
📚 Learning: when analyzing openshift ci/prow job failures in opendatahub-io/notebooks, jobs that appear to "comp...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-09T11:22:11.653Z
Learning: When analyzing OpenShift CI/Prow job failures in opendatahub-io/notebooks, jobs that appear to "complete successfully" may actually have failed at the image build stage and never executed any e2e tests. Proper analysis requires checking if image builds succeeded before concluding that tests passed or the job was successful.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for runtime detection improvement of python site-packages ...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek directs security concerns raised during pr reviews to existing comprehensive security issu...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:55-56
Timestamp: 2025-07-03T08:22:25.348Z
Learning: jiridanek directs security concerns raised during PR reviews to existing comprehensive security issues rather than addressing them in individual PRs. Issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" serves as the central tracking issue for all binary download security concerns across the opendatahub-io/notebooks repository.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for improving fragile sed-based jupyter kernel display_nam...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:135-136
Timestamp: 2025-07-04T05:52:49.464Z
Learning: jiridanek requested GitHub issue creation for improving fragile sed-based Jupyter kernel display_name modification in jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu during PR #1306 review. Issue #1321 was created with comprehensive problem description covering JSON corruption risks, greedy regex patterns, maintenance burden, and proposed Python-based JSON parsing solution with detailed acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for shell script quality improvements in codeserver/ubi9-p...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T16:04:22.695Z
Learning: jiridanek requested GitHub issue creation for shell script quality improvements in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh during PR #1269 review. Issue #1307 was created with comprehensive problem description covering variable scoping issues, POSIX compliance concerns, multiple solution options, acceptance criteria, implementation guidance with code examples, testing approaches, and proper context linking, continuing the established pattern of systematic code quality improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for container startup robustness and lifecycle management ...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requests github issue creation for shell script quality improvements identified during pr ...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T12:08:47.691Z
Learning: jiridanek requests GitHub issue creation for shell script quality improvements identified during PR #1269 review, specifically for POSIX compliance and security issues in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh. Issue #1275 was created with comprehensive problem descriptions, acceptance criteria, implementation guidance, and proper context linking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for container startup robustness and lifecycle management ...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. Issue #1298 was successfully created with comprehensive problem description covering race conditions, failure detection, orphaned processes, and signal handling, along with multiple solution options, phased acceptance criteria, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requests github issue creation for shell script quality improvements identified during pr ...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:17-17
Timestamp: 2025-07-03T12:26:24.084Z
Learning: jiridanek requests GitHub issue creation for shell script quality improvements identified during PR #1269 review, specifically for unquoted command substitution in codeserver/ubi9-python-3.12/run-code-server.sh. Issue #1283 was created with comprehensive problem descriptions, acceptance criteria, implementation guidance, and proper context linking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for shell script safety improvements identified during pr ...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:5-5
Timestamp: 2025-07-03T12:25:26.453Z
Learning: jiridanek requested GitHub issue creation for shell script safety improvements identified during PR #1269 review, specifically for unsafe globbing patterns in codeserver/ubi9-python-3.12/run-code-server.sh. Issue #1281 was created with comprehensive problem descriptions, solution options, acceptance criteria, and proper context linking, following the established pattern for systematic tracking of technical improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for shell script safety improvement in codeserver/ubi9-pyt...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-nginx.sh:23-23
Timestamp: 2025-07-03T12:29:24.067Z
Learning: jiridanek requested GitHub issue creation for shell script safety improvement in codeserver/ubi9-python-3.12/run-nginx.sh during PR #1269 review, specifically for replacing unsafe in-place file modification with tee. Issue #1285 was created with comprehensive problem descriptions, risk assessment, recommended atomic file operations solution, acceptance criteria, and proper context linking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for shell script strict-mode improvement in codeserver/ubi...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/utils/process.sh:1-2
Timestamp: 2025-07-03T16:00:46.191Z
Learning: jiridanek requested GitHub issue creation for shell script strict-mode improvement in codeserver/ubi9-python-3.12/utils/process.sh during PR #1269 review. Issue #1303 was created with comprehensive problem description covering silent failures and production risks, phased acceptance criteria for basic strict-mode implementation and enhanced error handling, implementation guidance with code examples and flag explanations, benefits section, and proper context linking, continuing the established pattern of systematic code quality improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for bootstrapper code duplication problem in runtimes/rocm...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-08T19:35:49.482Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. After an initial failed attempt, issue #1349 was successfully created with comprehensive problem description covering maintenance overhead and consistency risks from duplicate implementations across 5 Python 3.12 runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for jupyter-client dependency pinning inconsistency during...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: jiridanek requested github issue creation for rstudio py311 tekton push pipelines during pr #1379 re...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:16:05.131Z
Learning: jiridanek requested GitHub issue creation for RStudio py311 Tekton push pipelines during PR #1379 review. Issue #1384 was successfully created covering two RStudio variants (CPU and CUDA) found in manifests/base/params-latest.env, with comprehensive problem description, implementation requirements following the same pattern as other workbench pipelines, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
🪛 Checkov (3.2.334)
runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu

[MEDIUM] 14-24: Ensure that packages with untrusted or missing GPG signatures are not used by dnf, tdnf, or yum via the '--nogpgcheck' option

(CKV2_DOCKER_9)

runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu

[MEDIUM] 14-23: Ensure that packages with untrusted or missing GPG signatures are not used by dnf, tdnf, or yum via the '--nogpgcheck' option

(CKV2_DOCKER_9)

🪛 GitHub Actions: Code static analysis
runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu

[warning] 27-27: Hadolint info SC2016: Expressions don't expand in single quotes, use double quotes for that.


[warning] 69-69: Hadolint warning DL3002: Last USER should not be root.


[warning] 139-139: Hadolint warning DL3002: Last USER should not be root.

runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu

[warning] 26-26: Hadolint info SC2016: Expressions don't expand in single quotes, use double quotes for that.


[warning] 64-64: Hadolint warning DL3002: Last USER should not be root.


[warning] 128-128: Hadolint warning DL3002: Last USER should not be root.

⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
  • GitHub Check: Red Hat Konflux / odh-pipeline-runtime-datascience-cpu-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-pipeline-runtime-datascience-cpu-py311-ubi9-on-pull-request
  • GitHub Check: build (runtime-datascience-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.11, 3.11, linux/s390x, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build

Comment on lines +26 to +40
# For s390x only, set ENV vars and install Rust
RUN if [ "$TARGETARCH" = "s390x" ]; then \
# Install Rust and set up environment
mkdir -p /opt/.cargo && \
export HOME=/root && \
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs -o rustup-init.sh && \
chmod +x rustup-init.sh && \
CARGO_HOME=/opt/.cargo HOME=/root ./rustup-init.sh -y --no-modify-path && \
rm -f rustup-init.sh && \
chown -R 1001:0 /opt/.cargo && \
# Set environment variables
echo 'export PATH=/opt/.cargo/bin:$PATH' >> /etc/profile.d/cargo.sh && \
echo 'export CARGO_HOME=/opt/.cargo' >> /etc/profile.d/cargo.sh && \
echo 'export GRPC_PYTHON_BUILD_SYSTEM_OPENSSL=1' >> /etc/profile.d/cargo.sh; \
fi

🛠️ Refactor suggestion

Download-and-execute of rustup-init.sh lacks integrity checks
Executing a remote script without verifying a checksum is a classic supply-chain risk and contradicts the repo policy tracked in issue #1241. At minimum, verify the SHA256 checksum or import the official GPG key.

-    curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs -o rustup-init.sh && \
+    curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs -o rustup-init.sh && \
+    echo "<EXPECTED_SHA256>  rustup-init.sh" | sha256sum -c - && \
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
# For s390x only, set ENV vars and install Rust
RUN if [ "$TARGETARCH" = "s390x" ]; then \
# Install Rust and set up environment
mkdir -p /opt/.cargo && \
export HOME=/root && \
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs -o rustup-init.sh && \
echo "<EXPECTED_SHA256>  rustup-init.sh" | sha256sum -c - && \
chmod +x rustup-init.sh && \
CARGO_HOME=/opt/.cargo HOME=/root ./rustup-init.sh -y --no-modify-path && \
rm -f rustup-init.sh && \
chown -R 1001:0 /opt/.cargo && \
# Set environment variables
echo 'export PATH=/opt/.cargo/bin:$PATH' >> /etc/profile.d/cargo.sh && \
echo 'export CARGO_HOME=/opt/.cargo' >> /etc/profile.d/cargo.sh && \
echo 'export GRPC_PYTHON_BUILD_SYSTEM_OPENSSL=1' >> /etc/profile.d/cargo.sh; \
fi
🧰 Tools
🪛 GitHub Actions: Code static analysis

[warning] 27-27: Hadolint info SC2016: Expressions don't expand in single quotes, use double quotes for that.

🤖 Prompt for AI Agents
In runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu around lines 26 to 40,
the rustup-init.sh script is downloaded and executed without verifying its
integrity, which poses a security risk. To fix this, add a step to verify the
SHA256 checksum of the downloaded rustup-init.sh file against the official
checksum before executing it. Alternatively, import and verify the script using
the official GPG key if available. Only proceed with execution if the
verification passes, otherwise fail the build.
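An alternative to filling in <EXPECTED_SHA256> by hand is to pin a specific rustup release and fetch its published checksum from static.rust-lang.org; a rough sketch, assuming the archive layout and an example version (a checksum vendored into the repo and reviewed is still the stronger guarantee, since a same-origin checksum only catches corrupted downloads):

ARG RUSTUP_VERSION=1.27.1
RUN if [ "$TARGETARCH" = "s390x" ]; then \
    curl --proto '=https' --tlsv1.2 -sSfO \
    "https://static.rust-lang.org/rustup/archive/${RUSTUP_VERSION}/s390x-unknown-linux-gnu/rustup-init" && \
    curl --proto '=https' --tlsv1.2 -sSfO \
    "https://static.rust-lang.org/rustup/archive/${RUSTUP_VERSION}/s390x-unknown-linux-gnu/rustup-init.sha256" && \
    # compare only the hash field so the filename column of the .sha256 cannot break -c
    echo "$(cut -d' ' -f1 rustup-init.sha256)  rustup-init" | sha256sum -c - && \
    chmod +x rustup-init && \
    CARGO_HOME=/opt/.cargo HOME=/root ./rustup-init -y --no-modify-path && \
    rm -f rustup-init rustup-init.sha256; \
    fi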

@openshift-ci openshift-ci bot added size/l and removed size/l labels Aug 6, 2025
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🔭 Outside diff range comments (1)
runtimes/datascience/ubi9-python-3.12/requirements.txt (1)

458-465: Vulnerable cryptography pinned at 43.0.3

CVE-2024-12797 (TLS RFC7250) is still open for 43.x.
Upstream patched versions start with 44.0.1. Bump and re-hash to avoid shipping a known-vulnerable library.

-cryptography==43.0.3; platform_machine != 's390x' \
+cryptography==44.0.1; platform_machine != 's390x' \
♻️ Duplicate comments (2)
runtimes/datascience/ubi9-python-3.12/requirements.txt (2)

2362-2364: six==1.17.0 phantom release still present
Same issue was raised earlier; upstream latest is 1.16.0. 1.17.0 is yanked and breaks reproducibility.


206-213: Remove unnecessary platform_machine != 's390x' exclusions

Both cachetools==5.5.2 and certifi==2025.8.3 publish universal py3-none-any.whl on PyPI, and cffi==1.17.1 now ships s390x manylinux2014 wheels. The existing exclusion:

- cachetools==5.5.2; platform_machine != 's390x' \
+ cachetools==5.5.2 \

The same marker should likewise be dropped from the certifi and cffi entries. Keep platform_machine != 's390x' only on packages that truly lack s390x wheels (e.g. bcrypt, pyarrow).

• runtimes/datascience/ubi9-python-3.12/requirements.txt (lines 206–213):

  • Drop ; platform_machine != 's390x' from the entries for cachetools==5.5.2, certifi==2025.8.3, and cffi==1.17.1.
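One way to confirm which pins genuinely need the marker is to ask pip to resolve wheels for the target platform; a rough sketch using the pins from the hunk above (the command succeeds only when an s390x manylinux wheel or a universal py3-none-any wheel exists):

pip download cachetools==5.5.2 certifi==2025.8.3 cffi==1.17.1 \
    --only-binary=:all: --no-deps \
    --platform manylinux2014_s390x \
    --python-version 3.12 \
    --dest /tmp/s390x-wheel-check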
📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between be3f4b3 and c6b2b09.

⛔ Files ignored due to path filters (2)
  • jupyter/trustyai/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • runtimes/datascience/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
📒 Files selected for processing (2)
  • jupyter/trustyai/ubi9-python-3.12/requirements.txt (1 hunks)
  • runtimes/datascience/ubi9-python-3.12/requirements.txt (24 hunks)
✅ Files skipped from review due to trivial changes (1)
  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
🧰 Additional context used
🧠 Learnings (38)
📓 Common learnings
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-08-05T17:24:08.616Z
Learning: jiridanek requested PR review for #1521 covering s390x architecture support improvements, demonstrating continued focus on systematic multi-architecture compatibility enhancements in the opendatahub-io/notebooks repository through clean implementation with centralized configuration, proper CI integration, and architecture-aware testing patterns.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:50-50
Timestamp: 2025-07-08T19:30:01.738Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in ROCm TensorFlow image during PR #1333 review. Issue #1346 was created with comprehensive problem description covering hardcoded x86_64 architecture breaking multi-arch support, detailed impact analysis, three solution options (runtime detection, BuildKit TARGETARCH integration, hybrid approach) with pros/cons analysis, comprehensive acceptance criteria covering core requirements and testing, phased implementation guidance, related files identification, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T12:31:02.033Z
Learning: jiridanek requested GitHub issue creation for MSSQL repo file hardcoding problem during PR #1320 review. Issue #1363 was created and updated with comprehensive problem description covering hardcoded x86_64 MSSQL repo files breaking multi-architecture builds across 10 affected Dockerfiles (including datascience, CUDA, ROCm, and TrustyAI variants), detailed root cause analysis, three solution options with code examples, clear acceptance criteria for all image types, implementation guidance following established multi-architecture patterns, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:38-38
Timestamp: 2025-07-08T19:30:20.513Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in TensorFlow CUDA runtime image during PR #1333 review. Issue was created with comprehensive problem description covering hardcoded NVARCH limitation, multiple solution options using TARGETARCH build argument with architecture mapping, acceptance criteria for multi-architecture builds, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T12:29:56.162Z
Learning: jiridanek requested GitHub issue creation for OpenShift client architecture mapping problem affecting 29 Dockerfiles during PR #1320 review. Issue was created with comprehensive analysis covering all affected files using $(uname -m) returning 'aarch64' but OpenShift mirror expecting 'arm64', systematic solution using BuildKit TARGETARCH mapping with proper amd64→x86_64 and arm64→arm64 conversion, detailed acceptance criteria, and implementation guidance, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-09T08:07:24.937Z
Learning: jiridanek requested GitHub issue creation for tensorflow_rocm Python 3.12 compatibility problem during PR #1333 review. Issue #1354 was successfully created with comprehensive problem description covering missing cp312 wheels causing build failures, three solution options (upstream TensorFlow, Python 3.11 only, custom build), clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. The automated issue creation failed despite appearing successful, so comprehensive issue content was provided for manual creation covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:39:23.433Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. Issue #1373 was successfully created on the second attempt with comprehensive problem description covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: When checking PyPI wheel availability for aarch64 architecture, the actual wheel filenames use "manylinux2014_aarch64" pattern, not "linux_aarch64". This caused incorrect analysis in PR #1396 where CodeRabbit falsely claimed CUDA companion packages lacked aarch64 wheels when searching with the wrong pattern. The correct approach is to search for "manylinux2014_aarch64" or use a more generic "aarch64" pattern when verifying wheel availability for ARM64 compatibility.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
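The TARGETARCH scoping rule noted above reduces to a small pattern; a minimal sketch with an example base image:

# TARGETARCH is implicitly available to FROM instructions, but any stage
# that references it must re-declare it with ARG inside that stage
FROM registry.access.redhat.com/ubi9/python-312:latest AS s390x-builder
ARG TARGETARCH
RUN if [ "$TARGETARCH" = "s390x" ]; then \
    echo "s390x: building pyarrow from source"; \
    fi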
📚 Learning: grdryn corrected coderabbit's false assessment about cuda companion package wheel availability durin...
Learnt from: grdryn
PR: opendatahub-io/notebooks#1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: jiridanek identified that arm64 wheels for h5py 3.14.0 are available on pypi but being ignored due t...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: jiridanek requested github issue creation for runtime detection improvement of python site-packages ...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: jiridanek requested pr review for #1521 covering s390x architecture support improvements, demonstrat...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-08-05T17:24:08.616Z
Learning: jiridanek requested PR review for #1521 covering s390x architecture support improvements, demonstrating continued focus on systematic multi-architecture compatibility enhancements in the opendatahub-io/notebooks repository through clean implementation with centralized configuration, proper CI integration, and architecture-aware testing patterns.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatah...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
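As a rough sketch of that containerized locking flow (the image reference, mount flags, and pipenv invocation here are illustrative; the repository's actual tooling may differ):

    # resolve dependencies inside a UBI9 Python container pinned to linux/amd64,
    # so the lock is generated against the target platform rather than the host OS
    podman run --rm --platform=linux/amd64 \
        -v "$PWD:/work:Z" -w /work \
        registry.access.redhat.com/ubi9/python-312:latest \
        bash -c "pip install pipenv && pipenv lock"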
📚 Learning: jiridanek requested github issue creation for runtime detection improvement of python site-packages ...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1340 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, affected files analysis including de-vendor scripts and pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: in opendatahub-io/notebooks, python 3.12-based images (e.g., runtime-cuda-pytorch-ubi9-python-3.12) ...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:54:28.202Z
Learning: In opendatahub-io/notebooks, Python 3.12-based images (e.g., runtime-cuda-pytorch-ubi9-python-3.12) may fail container runtime tests with "libcrypt.so.1 => not found" for MySQL SASL2 plugin libraries if `libxcrypt-compat` is missing. The solution is to install `libxcrypt-compat` in the Dockerfile.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
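A minimal Dockerfile fragment for that fix (the package name comes from the learning above; the surrounding USER switches are illustrative):

    USER 0
    # provides libcrypt.so.1, needed by the MySQL SASL2 plugin libraries on UBI9
    RUN dnf install -y libxcrypt-compat && dnf clean all
    USER 1001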
📚 Learning: when checking pypi wheel availability for aarch64 architecture, the actual wheel filenames use "many...
Learnt from: grdryn
PR: opendatahub-io/notebooks#1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: When checking PyPI wheel availability for aarch64 architecture, the actual wheel filenames use "manylinux2014_aarch64" pattern, not "linux_aarch64". This caused incorrect analysis in PR #1396 where CodeRabbit falsely claimed CUDA companion packages lacked aarch64 wheels when searching with the wrong pattern. The correct approach is to search for "manylinux2014_aarch64" or use a more generic "aarch64" pattern when verifying wheel availability for ARM64 compatibility.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
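One local way to verify such wheel availability (standard pip flags; h5py is just the example package from these learnings):

    # succeeds only when a manylinux2014_aarch64-compatible wheel exists on PyPI;
    # --platform requires --only-binary, so no source build is attempted
    pip download h5py==3.14.0 \
        --platform manylinux2014_aarch64 \
        --only-binary=:all: \
        -d /tmp/wheels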
📚 Learning: the tensorflow rocm python 3.12 compatibility issue in opendatahub-io/notebooks pr #1259 was caused ...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm=~=2.18.1. The solution requires updating Pipfile sources to include https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: jiridanek requested github issue creation for jupyter-client dependency pinning inconsistency during...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: jiridanek requested github issue creation for version pinning improvement of micropipenv and uv pack...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:17-18
Timestamp: 2025-08-01T16:10:21.049Z
Learning: jiridanek requested GitHub issue creation for version pinning improvement of micropipenv and uv packages during PR #968 review. Issue #1595 was created with comprehensive problem description covering build reproducibility risks from unpinned package versions across 18 affected Dockerfiles, two solution options (direct pinning vs build arguments), implementation approach with testing requirements, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding pipfiles durin...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: the jupyter-bokeh pinning to 3.0.5 in trustyai notebook image was not due to trustyai code compatibi...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: The jupyter-bokeh pinning to 3.0.5 in TrustyAI notebook image was not due to TrustyAI code compatibility issues, but because the trustyai package itself explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency, causing pip dependency resolution conflicts when trying to upgrade to jupyter-bokeh 4.x.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: the jupyter-bokeh package was previously pinned to version 3.0.5 in the trustyai notebook image due ...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: The jupyter-bokeh package was previously pinned to version 3.0.5 in the TrustyAI notebook image due to compatibility requirements with TrustyAI components, as indicated by the comment "Should be pinned down to this version in order to be compatible with trustyai" that was removed in this update.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: trustyai 0.6.1 (latest version as of june 2025) has a hard dependency constraint on jupyter-bokeh~=3...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: trustyai explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.tx...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: trustyai's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with trustyai's visua...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: jiridanek requested github issue creation for misleading cuda prefix in trustyai image tags during p...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: jiridanek requested github issue creation for multi-platform dependency locking investigation during...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-21T15:06:04.114Z
Learning: jiridanek requested GitHub issue creation for multi-platform dependency locking investigation during PR #1396 review. Issue #1423 was successfully created with comprehensive problem description covering ARM64 wheel availability but being ignored due to AMD64-only dependency locking, root cause analysis of platform-specific pipenv limitations, immediate conditional installation solution, multi-platform locking ecosystem analysis, broader affected areas investigation, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: jiridanek requested github issue creation for mssql repo file hardcoding problem during pr #1320 rev...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T12:31:02.033Z
Learning: jiridanek requested GitHub issue creation for MSSQL repo file hardcoding problem during PR #1320 review. Issue #1363 was created and updated with comprehensive problem description covering hardcoded x86_64 MSSQL repo files breaking multi-architecture builds across 10 affected Dockerfiles (including datascience, CUDA, ROCm, and TrustyAI variants), detailed root cause analysis, three solution options with code examples, clear acceptance criteria for all image types, implementation guidance following established multi-architecture patterns, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: jiridanek requested github issue creation for misleading cuda prefix in trustyai image tags during p...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review. Issue was created with comprehensive problem description covering both Python 3.11 and 3.12 versions, repository pattern analysis showing correct vs incorrect naming, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: the python 3.11 infrastructure for rocm tensorflow images in opendatahub-io/notebooks is already pro...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide python 3.12 wheels (cp...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: in the opendatahub-io/notebooks repository, the test runner's python version (configured in github a...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: in the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pat...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/datascience/ubi9-python-3.11/Pipfile:34-36
Timestamp: 2025-06-28T14:13:27.890Z
Learning: In the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pattern: core `jupyterlab` package uses exact pinning (==) across all notebook images to ensure UI consistency, while JupyterLab extensions and all server components (jupyter-server, jupyter-server-proxy, jupyter-server-terminals) use compatible release (~=) pinning to allow automatic security updates and bug fixes while maintaining API compatibility.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
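In Pipfile terms that pattern looks roughly like this (version numbers below are placeholders, not the repository's actual pins):

    jupyterlab = "==4.2.5"                 # exact pin: identical UI across all images
    jupyter-server = "~=2.14.0"            # compatible release: picks up fixes automatically
    jupyter-server-terminals = "~=0.5.0"   # compatible release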
📚 Learning: in the opendatahub-io/notebooks repository, tensorflow packages with `extras = ["and-cuda"]` can cau...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with `extras = ["and-cuda"]` can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: in the opendatahub-io/notebooks repository, when adapting nvidia cuda dockerfiles, the project inten...
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: runtime deployment tests in opendatahub-io/notebooks may show podsecurity warnings about allowprivil...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: tensorflow_rocm package has no python 3.12 or 3.13 wheel support as of july 2025, with the latest ve...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: tensorflow_rocm package has no python 3.12 or 3.13 wheel support as of july 2025, with the latest ve...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: rocm is not supported by upstream tensorflow, unlike cuda. rocm support requires the specialized ten...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-14T15:36:29.815Z
Learning: ROCm is not supported by upstream TensorFlow, unlike CUDA. ROCm support requires the specialized tensorflow_rocm package, which creates a dependency challenge for Python 3.12 environments since tensorflow_rocm lacks Python 3.12 wheels while regular TensorFlow lacks ROCm support.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: in the opendatahub-io/notebooks repository, issue #1241 "security: add checksum verification for dow...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: the red hat openshift ai (rhoai) supported configurations officially require openshift 4.14+ for all...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1607
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/kustomization.yaml:2-2
Timestamp: 2025-08-02T15:19:34.383Z
Learning: The Red Hat OpenShift AI (RHOAI) supported configurations officially require OpenShift 4.14+ for all currently supported RHOAI versions (2.22, 2.19, 2.16), which corresponds to kubectl 1.27+. This means all supported RHOAI deployments will show Kustomize API deprecation warnings when using v1alpha1 API version, making the update to v1 essential rather than optional. The official support matrix is documented at https://access.redhat.com/articles/rhoai-supported-configs.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: psutil version 7.x is compatible with ubi9, centos stream 9, and rhel 9 platforms in the opendatahub...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1197
File: runtimes/minimal/ubi9-python-3.11/requirements.txt:395-405
Timestamp: 2025-06-26T15:28:35.416Z
Learning: psutil version 7.x is compatible with UBI9, CentOS Stream 9, and RHEL 9 platforms in the opendatahub-io/notebooks repository. The upgrade from psutil 5.x to 7.x has been validated for these environments.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: pytorch 2.6.0 wheels with cuda 12.4 support (torch-2.6.0+cu124) are available on the official pytorc...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1463
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:15-15
Timestamp: 2025-07-22T14:31:20.315Z
Learning: PyTorch 2.6.0 wheels with CUDA 12.4 support (torch-2.6.0+cu124) are available on the official PyTorch CUDA index at https://download.pytorch.org/whl/cu124/ for Python versions 3.9, 3.10, 3.11, 3.12, and 3.13 on both Linux x86_64 and Windows AMD64 platforms.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
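A matching install command against that index (standard pip flags; the version and index URL come from the learning above):

    pip install torch==2.6.0 --index-url https://download.pytorch.org/whl/cu124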
📚 Learning: odh-elyra uses universal wheels (py3-none-any) rather than platform-specific wheels, making it compa...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1325
File: jupyter/datascience/ubi9-python-3.11/Pipfile:32-32
Timestamp: 2025-07-07T14:54:57.907Z
Learning: odh-elyra uses universal wheels (py3-none-any) rather than platform-specific wheels, making it compatible with all supported Python versions through a single wheel file.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: current practice for runtime-images json files in opendatahub-io/notebooks: all runtime image config...
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1519
File: jupyter/pytorch+llmcompressor/ubi9-python-3.11/runtime-images/llmcompressor-pytorch-ubi9-py311.json:2-9
Timestamp: 2025-07-29T15:56:20.036Z
Learning: Current practice for runtime-images JSON files in opendatahub-io/notebooks: All runtime image configurations use SHA256 digest pinning (format: `sha256:<digest>`) instead of mutable tags for immutability, use `quay.io/opendatahub/workbench-images` registry, include `schema_name: "runtime-image"` field, and follow consistent structure as seen in existing files like datascience-ubi9-py311.json, pytorch-ubi9-py311.json, etc.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (8)
  • GitHub Check: Red Hat Konflux / odh-pipeline-runtime-datascience-cpu-py311-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-trustyai-cpu-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-pipeline-runtime-datascience-cpu-py312-ubi9-on-pull-request
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.11, 3.11, linux/s390x, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.11, 3.11, linux/amd64, false) / build
🔇 Additional comments (1)
runtimes/datascience/ubi9-python-3.12/requirements.txt (1)

1656-1663: 👍 Correct gate for py-spy

No s390x wheels are published; gating is appropriate.

Comment on lines 2124 to 2130
requests-oauthlib==2.0.0; platform_machine != 's390x' \
    --hash=sha256:7dd8a5c40426b779b0868c404bdef9768deccf22749cde15852df527e6269b36 \
    --hash=sha256:b3dffaebd884d8cd778494369603a9e7b58d29111bf6b41bdc2dcd87203af4e9
-rich==13.9.4; python_full_version >= '3.8.0' \
+rich==13.9.4; platform_machine != 's390x' \
    --hash=sha256:439594978a49a09530cff7ebc4b5c7103ef57baf48d5ea3184f21d9a2befa098 \
    --hash=sha256:6049d5e6ec054bf2779ab3358186963bac2ea89175919d699e378b99738c2a90
rpds-py==0.26.0; python_version >= '3.9' \

🛠️ Refactor suggestion

Over-gating of additional pure-Python packages

requests-oauthlib, rich, and other entries are now skipped on s390x even though they publish universal wheels.
This widens the diff against upstream and enlarges the future merge-conflict surface.

Please audit all newly added platform_machine != 's390x' markers and keep them only where wheels are genuinely missing or where the Dockerfile provides a source-build step (e.g. pyarrow).

🤖 Prompt for AI Agents
In runtimes/datascience/ubi9-python-3.12/requirements.txt around lines 2124 to
2130, the platform_machine != 's390x' marker is applied too broadly to packages
like requests-oauthlib and rich, which actually provide universal wheels for
s390x. Review all newly added platform_machine != 's390x' conditions and remove
them for packages that have compatible wheels on s390x, keeping these markers
only for packages that lack wheels or require specific build steps in the
Dockerfile, such as pyarrow.
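As an illustration of the requested end state (versions are copied from the hunk above where available, otherwise omitted; the comments are explanatory only, not part of the file):

    # kept: pyarrow publishes no s390x wheels and is built from source in the Dockerfile
    pyarrow; platform_machine != 's390x'
    # marker dropped: rich ships a universal py3-none-any wheel that installs fine on s390x
    rich==13.9.4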

@Nash-123 Nash-123 force-pushed the runtime-datascience-s390x branch from c6b2b09 to 12c97e9 on August 8, 2025 16:32
@openshift-ci openshift-ci bot added size/l and removed size/l labels Aug 8, 2025
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🔭 Outside diff range comments (1)
runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu (1)

4-4: Pin the base image – avoid :latest.

ubi9/python-311:latest should be replaced with a digest or fixed tag to ensure reproducible builds (see issue #1242).
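For example (the digest below is a placeholder, not a real value):

    # pin to an immutable digest instead of the floating :latest tag
    FROM registry.access.redhat.com/ubi9/python-311@sha256:<digest>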

♻️ Duplicate comments (10)
runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (5)

14-24: Same --nogpgcheck supply-chain risk previously flagged – still unresolved.


31-34: Running the remote rustup-init.sh without checksum verification was already called out and remains unaddressed.


78-82: git is still missing from build-deps, so the clone will fail on s390x – see prior feedback.


81-84: git clone remains unpinned; reproducibility concern previously reported.


140-147: Container still ends as USER 0; earlier comment about switching back to 1001 not addressed.

runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu (5)

22-23: Unverified --nogpgcheck usage – previously flagged, still present.


29-32: rustup-init.sh executed without checksum – prior security concern remains.


75-77: git is still absent from the build dependencies, so the clone will fail – duplicates earlier feedback.


77-80: Apache Arrow clone is still unpinned – reproducibility issue already reported.


129-144: Final image still runs as root; the earlier request to switch back to user 1001 is unresolved (a consolidated sketch addressing these recurring items follows below).
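A consolidated, minimal sketch of an s390x builder stage with the recurring items above addressed (the digest, the rustup checksum, the Arrow tag, and the package list are placeholders, not verified values):

    FROM registry.access.redhat.com/ubi9/python-311@sha256:<digest> AS s390x-builder
    USER 0

    # install build deps with GPG checking left enabled (no --nogpgcheck),
    # and include git so the Arrow clone below can succeed
    RUN dnf install -y gcc gcc-c++ cmake make git && dnf clean all

    # verify the rustup installer against a pinned checksum before executing it
    ARG RUSTUP_SHA256=<pinned-sha256>
    RUN curl -fsSL https://sh.rustup.rs -o rustup-init.sh \
        && echo "${RUSTUP_SHA256}  rustup-init.sh" | sha256sum -c - \
        && sh rustup-init.sh -y --profile minimal

    # pin the Apache Arrow sources to a release tag instead of the default branch
    RUN git clone --depth 1 --branch apache-arrow-21.0.0 \
            https://github.com/apache/arrow.git /tmp/arrow

    # ... build pyarrow from /tmp/arrow ...

    # drop privileges again so the final image does not run as root
    USER 1001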

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between c6b2b09 and 12c97e9.

📒 Files selected for processing (5)
  • ci/cached-builds/gen_gha_matrix_jobs.py (1 hunks)
  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu (3 hunks)
  • runtimes/datascience/ubi9-python-3.11/Pipfile (1 hunks)
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (3 hunks)
  • runtimes/datascience/ubi9-python-3.12/Pipfile (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (3)
  • ci/cached-builds/gen_gha_matrix_jobs.py
  • runtimes/datascience/ubi9-python-3.11/Pipfile
  • runtimes/datascience/ubi9-python-3.12/Pipfile
🧰 Additional context used
🧠 Learnings (56)
📓 Common learnings
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-08-05T17:24:08.616Z
Learning: jiridanek requested PR review for #1521 covering s390x architecture support improvements, demonstrating continued focus on systematic multi-architecture compatibility enhancements in the opendatahub-io/notebooks repository through clean implementation with centralized configuration, proper CI integration, and architecture-aware testing patterns.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:50-50
Timestamp: 2025-07-08T19:30:01.738Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in ROCm TensorFlow image during PR #1333 review. Issue #1346 was created with comprehensive problem description covering hardcoded x86_64 architecture breaking multi-arch support, detailed impact analysis, three solution options (runtime detection, BuildKit TARGETARCH integration, hybrid approach) with pros/cons analysis, comprehensive acceptance criteria covering core requirements and testing, phased implementation guidance, related files identification, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T12:31:02.033Z
Learning: jiridanek requested GitHub issue creation for MSSQL repo file hardcoding problem during PR #1320 review. Issue #1363 was created and updated with comprehensive problem description covering hardcoded x86_64 MSSQL repo files breaking multi-architecture builds across 10 affected Dockerfiles (including datascience, CUDA, ROCm, and TrustyAI variants), detailed root cause analysis, three solution options with code examples, clear acceptance criteria for all image types, implementation guidance following established multi-architecture patterns, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:38-38
Timestamp: 2025-07-08T19:30:20.513Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in TensorFlow CUDA runtime image during PR #1333 review. Issue was created with comprehensive problem description covering hardcoded NVARCH limitation, multiple solution options using TARGETARCH build argument with architecture mapping, acceptance criteria for multi-architecture builds, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T12:29:56.162Z
Learning: jiridanek requested GitHub issue creation for OpenShift client architecture mapping problem affecting 29 Dockerfiles during PR #1320 review. Issue was created with comprehensive analysis covering all affected files using $(uname -m) returning 'aarch64' but OpenShift mirror expecting 'arm64', systematic solution using BuildKit TARGETARCH mapping with proper amd64→x86_64 and arm64→arm64 conversion, detailed acceptance criteria, and implementation guidance, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-09T08:07:24.937Z
Learning: jiridanek requested GitHub issue creation for tensorflow_rocm Python 3.12 compatibility problem during PR #1333 review. Issue #1354 was successfully created with comprehensive problem description covering missing cp312 wheels causing build failures, three solution options (upstream TensorFlow, Python 3.11 only, custom build), clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. The automated issue creation failed despite appearing successful, so comprehensive issue content was provided for manual creation covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1720
File: jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt:2659-2680
Timestamp: 2025-08-07T12:41:48.997Z
Learning: For opendatahub-io/notebooks, rpds-py 0.27.0 provides manylinux wheels for Python 3.11 and 3.12 on x86_64 and aarch64, so no Rust build is needed for these platforms. For s390x and ppc64le, wheels are not available, so a Rust build stage or version pinning is required if those images are built.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: jupyter/minimal/ubi9-python-3.11/Dockerfile.cuda:29-38
Timestamp: 2025-07-04T17:07:52.656Z
Learning: In the opendatahub-io/notebooks repository, modern Docker with BuildKit automatically provides build arguments like TARGETARCH in the global scope for FROM instructions, but these arguments must be explicitly declared with ARG statements inside build stages where they will be used. The ARG declaration should be placed within the stage that uses it, not moved to the global scope, as this is the correct pattern for modern Docker/Podman/Buildah environments.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: When checking PyPI wheel availability for aarch64 architecture, the actual wheel filenames use "manylinux2014_aarch64" pattern, not "linux_aarch64". This caused incorrect analysis in PR #1396 where CodeRabbit falsely claimed CUDA companion packages lacked aarch64 wheels when searching with the wrong pattern. The correct approach is to search for "manylinux2014_aarch64" or use a more generic "aarch64" pattern when verifying wheel availability for ARM64 compatibility.
📚 Learning: 2025-07-04T17:08:02.399Z
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
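A minimal sketch of that placement pattern (the stage name and the gated command are illustrative):

    FROM registry.access.redhat.com/ubi9/python-311:latest AS builder
    # TARGETARCH is available implicitly for FROM lines, but a stage that
    # expands it in a RUN must re-declare it explicitly
    ARG TARGETARCH
    RUN if [ "$TARGETARCH" = "s390x" ]; then \
            echo "running s390x-specific build steps"; \
        fi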
📚 Learning: 2025-07-07T11:08:48.524Z
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-09T12:29:56.162Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T12:29:56.162Z
Learning: jiridanek requested GitHub issue creation for OpenShift client architecture mapping problem affecting 29 Dockerfiles during PR #1320 review. Issue was created with comprehensive analysis covering all affected files using $(uname -m) returning 'aarch64' but OpenShift mirror expecting 'arm64', systematic solution using BuildKit TARGETARCH mapping with proper amd64→x86_64 and arm64→arm64 conversion, detailed acceptance criteria, and implementation guidance, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-04T17:07:52.656Z
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: jupyter/minimal/ubi9-python-3.11/Dockerfile.cuda:29-38
Timestamp: 2025-07-04T17:07:52.656Z
Learning: In the opendatahub-io/notebooks repository, modern Docker with BuildKit automatically provides build arguments like TARGETARCH in the global scope for FROM instructions, but these arguments must be explicitly declared with ARG statements inside build stages where they will be used. The ARG declaration should be placed within the stage that uses it, not moved to the global scope, as this is the correct pattern for modern Docker/Podman/Buildah environments.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-08-05T17:24:08.616Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-08-05T17:24:08.616Z
Learning: jiridanek requested PR review for #1521 covering s390x architecture support improvements, demonstrating continued focus on systematic multi-architecture compatibility enhancements in the opendatahub-io/notebooks repository through clean implementation with centralized configuration, proper CI integration, and architecture-aware testing patterns.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-10T15:02:13.228Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. The automated issue creation failed despite appearing successful, so comprehensive issue content was provided for manual creation covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-09T12:31:02.033Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T12:31:02.033Z
Learning: jiridanek requested GitHub issue creation for MSSQL repo file hardcoding problem during PR #1320 review. Issue #1363 was created and updated with comprehensive problem description covering hardcoded x86_64 MSSQL repo files breaking multi-architecture builds across 10 affected Dockerfiles (including datascience, CUDA, ROCm, and TrustyAI variants), detailed root cause analysis, three solution options with code examples, clear acceptance criteria for all image types, implementation guidance following established multi-architecture patterns, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-08T19:30:20.513Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:38-38
Timestamp: 2025-07-08T19:30:20.513Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in TensorFlow CUDA runtime image during PR #1333 review. Issue was created with comprehensive problem description covering hardcoded NVARCH limitation, multiple solution options using TARGETARCH build argument with architecture mapping, acceptance criteria for multi-architecture builds, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-08-07T12:41:48.997Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1720
File: jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt:2659-2680
Timestamp: 2025-08-07T12:41:48.997Z
Learning: For opendatahub-io/notebooks, rpds-py 0.27.0 provides manylinux wheels for Python 3.11 and 3.12 on x86_64 and aarch64, so no Rust build is needed for these platforms. For s390x and ppc64le, wheels are not available, so a Rust build stage or version pinning is required if those images are built.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-08T19:30:01.738Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:50-50
Timestamp: 2025-07-08T19:30:01.738Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in ROCm TensorFlow image during PR #1333 review. Issue #1346 was created with comprehensive problem description covering hardcoded x86_64 architecture breaking multi-arch support, detailed impact analysis, three solution options (runtime detection, BuildKit TARGETARCH integration, hybrid approach) with pros/cons analysis, comprehensive acceptance criteria covering core requirements and testing, phased implementation guidance, related files identification, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-02T18:19:23.024Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-04T10:41:13.061Z
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-18T19:01:39.811Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1396
File: jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda:192-195
Timestamp: 2025-07-18T19:01:39.811Z
Learning: In the opendatahub-io/notebooks repository, mixing CentOS packages with UBI base images is bad practice that removes supportability and creates "Frankenstein" images according to Red Hat guidance. However, using EPEL packages is acceptable, though it may require extra work with AIPCC for internal Red Hat builds. The official reference is at developers.redhat.com/articles/ubi-faq.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-06-16T11:32:09.203Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-09T14:02:39.880Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T14:02:39.880Z
Learning: grdryn corrected CodeRabbit's false assessment about OpenShift mirror server architecture support during PR #1320 review. The OpenShift mirror server (https://mirror.openshift.com/pub/openshift-v4/) actually supports both aarch64 and arm64 directory structures, making the current code using $(uname -m) work correctly across architectures without modification. This demonstrates the importance of verifying external service capabilities before flagging issues.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-03T16:17:23.065Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-nginx.sh:18-23
Timestamp: 2025-07-03T16:17:23.065Z
Learning: jiridanek requested GitHub issue creation for shell script variable quoting security concern in codeserver/ubi9-python-3.12/run-nginx.sh during PR #1269 review. The issue covers unquoted variables NB_PREFIX, NOTEBOOK_ARGS, and BASE_URL that pose security risks including command injection, word-splitting vulnerabilities, and globbing issues. A comprehensive issue was created with detailed problem description, security concerns, solution with code examples, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-08-06T15:33:47.175Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:17-18
Timestamp: 2025-08-06T15:33:47.175Z
Learning: During PR #968 review, CodeRabbit initially incorrectly identified 1 legitimate micropipenv usage in jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda when analyzing Python 3.12 images for unused dependencies. Upon jiridanek's request for re-verification, comprehensive analysis revealed all 15 Python 3.12 Dockerfiles install micropipenv but none actually use it, making the cleanup scope 100% unnecessary installations with no exceptions to handle.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-03T08:22:25.348Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:55-56
Timestamp: 2025-07-03T08:22:25.348Z
Learning: jiridanek directs security concerns raised during PR reviews to existing comprehensive security issues rather than addressing them in individual PRs. Issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" serves as the central tracking issue for all binary download security concerns across the opendatahub-io/notebooks repository.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-09T08:07:30.628Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda:17-25
Timestamp: 2025-07-09T08:07:30.628Z
Learning: jiridanek requested GitHub issue creation for oc client installation permission problem in PyTorch CUDA runtime Dockerfile during PR #1333 review. Issue #1356 was created with comprehensive problem description covering USER 1001 permission conflicts with root-owned /opt/app-root/bin directory, detailed impact analysis of build failures and non-executable binaries, current problematic code snippet, complete solution with user switching approach, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-03T16:05:35.448Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user:4-9
Timestamp: 2025-07-03T16:05:35.448Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user during PR #1269 review. A comprehensive issue was created covering silent failures, unquoted variable expansions, missing template validation, and strict mode implementation with detailed problem descriptions, phased acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-03T16:28:55.757Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/passwd.template:14-14
Timestamp: 2025-07-03T16:28:55.757Z
Learning: jiridanek requested GitHub issue creation for passwd template validation in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/passwd.template during PR #1269 review. Issue #1318 was created with comprehensive analysis of nss_wrapper approach advantages over OpenShift's native user management, including consistent user identity, application compatibility, and debugging benefits. The current approach works with OpenShift's random UID assignment by dynamically setting USER_ID and GROUP_ID to actual runtime values while providing enhanced user attributes through nss_wrapper.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-04T05:52:49.464Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:135-136
Timestamp: 2025-07-04T05:52:49.464Z
Learning: jiridanek requested GitHub issue creation for improving fragile sed-based Jupyter kernel display_name modification in jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu during PR #1306 review. Issue #1321 was created with comprehensive problem description covering JSON corruption risks, greedy regex patterns, maintenance burden, and proposed Python-based JSON parsing solution with detailed acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: 2025-07-03T16:04:22.695Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T16:04:22.695Z
Learning: jiridanek requested GitHub issue creation for shell script quality improvements in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh during PR #1269 review. Issue #1307 was created with comprehensive problem description covering variable scoping issues, POSIX compliance concerns, multiple solution options, acceptance criteria, implementation guidance with code examples, testing approaches, and proper context linking, continuing the established pattern of systematic code quality improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: 2025-07-03T14:01:22.819Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: 2025-07-03T12:08:47.691Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T12:08:47.691Z
Learning: jiridanek requests GitHub issue creation for shell script quality improvements identified during PR #1269 review, specifically for POSIX compliance and security issues in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh. Issue #1275 was created with comprehensive problem descriptions, acceptance criteria, implementation guidance, and proper context linking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-03T14:01:22.819Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. Issue #1298 was successfully created with comprehensive problem description covering race conditions, failure detection, orphaned processes, and signal handling, along with multiple solution options, phased acceptance criteria, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: 2025-07-03T12:26:24.084Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:17-17
Timestamp: 2025-07-03T12:26:24.084Z
Learning: jiridanek requests GitHub issue creation for shell script quality improvements identified during PR #1269 review, specifically for unquoted command substitution in codeserver/ubi9-python-3.12/run-code-server.sh. Issue #1283 was created with comprehensive problem descriptions, acceptance criteria, implementation guidance, and proper context linking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-03T12:25:26.453Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:5-5
Timestamp: 2025-07-03T12:25:26.453Z
Learning: jiridanek requested GitHub issue creation for shell script safety improvements identified during PR #1269 review, specifically for unsafe globbing patterns in codeserver/ubi9-python-3.12/run-code-server.sh. Issue #1281 was created with comprehensive problem descriptions, solution options, acceptance criteria, and proper context linking, following the established pattern for systematic tracking of technical improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-03T12:29:24.067Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-nginx.sh:23-23
Timestamp: 2025-07-03T12:29:24.067Z
Learning: jiridanek requested GitHub issue creation for shell script safety improvement in codeserver/ubi9-python-3.12/run-nginx.sh during PR #1269 review, specifically for replacing unsafe in-place file modification with tee. Issue #1285 was created with comprehensive problem descriptions, risk assessment, recommended atomic file operations solution, acceptance criteria, and proper context linking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-03T16:00:46.191Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/utils/process.sh:1-2
Timestamp: 2025-07-03T16:00:46.191Z
Learning: jiridanek requested GitHub issue creation for shell script strict-mode improvement in codeserver/ubi9-python-3.12/utils/process.sh during PR #1269 review. Issue #1303 was created with comprehensive problem description covering silent failures and production risks, phased acceptance criteria for basic strict-mode implementation and enhanced error handling, implementation guidance with code examples and flag explanations, benefits section, and proper context linking, continuing the established pattern of systematic code quality improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: 2025-08-01T16:10:21.049Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:17-18
Timestamp: 2025-08-01T16:10:21.049Z
Learning: jiridanek requested GitHub issue creation for version pinning improvement of micropipenv and uv packages during PR #968 review. Issue #1595 was created with comprehensive problem description covering build reproducibility risks from unpinned package versions across 18 affected Dockerfiles, two solution options (direct pinning vs build arguments), implementation approach with testing requirements, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-04T05:49:10.314Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:4-4
Timestamp: 2025-07-04T05:49:10.314Z
Learning: jiridanek directs base image pinning security concerns to existing comprehensive issue #1242 "Improve Docker FROM image versioning by avoiding :latest tags" rather than addressing them in individual PRs, continuing the established pattern of systematic security and quality tracking in opendatahub-io/notebooks.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-08T19:09:48.746Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-08T19:09:48.746Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review. Issue was created with comprehensive problem description covering both Python 3.11 and 3.12 versions, repository pattern analysis showing correct vs incorrect naming, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-08T19:35:49.482Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-08T19:35:49.482Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. After an initial failed attempt, issue #1349 was successfully created with comprehensive problem description covering maintenance overhead and consistency risks from duplicate implementations across 5 Python 3.12 runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: 2025-07-08T19:26:17.140Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: 2025-07-11T11:54:28.202Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:54:28.202Z
Learning: In opendatahub-io/notebooks, Python 3.12-based images (e.g., runtime-cuda-pytorch-ubi9-python-3.12) may fail container runtime tests with "libcrypt.so.1 => not found" for MySQL SASL2 plugin libraries if `libxcrypt-compat` is missing. The solution is to install `libxcrypt-compat` in the Dockerfile.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
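For reference, a minimal sketch of the fix this learning describes (the dnf invocation and user IDs are illustrative, not copied from the affected Dockerfiles):

# Provide libcrypt.so.1 for the MySQL SASL2 plugin libraries on UBI9.
USER 0
RUN dnf install -y libxcrypt-compat && dnf clean all
USER 1001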
📚 Learning: 2025-08-07T12:39:01.997Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1720
File: jupyter/tensorflow/ubi9-python-3.12/requirements.txt:1531-1558
Timestamp: 2025-08-07T12:39:01.997Z
Learning: In opendatahub-io/notebooks, the ROCm TensorFlow Python 3.12 UBI9 image (runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm) was missing libxcrypt-compat, which is required for MySQL SASL2 plugin authentication with mysql-connector-python==9.3.0 on Python 3.12 UBI9. Issue #1722 was created to track this, following the established pattern for systematic dependency consistency and runtime compatibility across all Python 3.12 UBI9 images.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-11T11:16:05.131Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:16:05.131Z
Learning: jiridanek requested GitHub issue creation for RStudio py311 Tekton push pipelines during PR #1379 review. Issue #1384 was successfully created covering two RStudio variants (CPU and CUDA) found in manifests/base/params-latest.env, with comprehensive problem description, implementation requirements following the same pattern as other workbench pipelines, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: 2025-07-08T19:28:15.791Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
📚 Learning: 2025-07-09T11:22:11.653Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-09T11:22:11.653Z
Learning: When analyzing OpenShift CI/Prow job failures in opendatahub-io/notebooks, jobs that appear to "complete successfully" may actually have failed at the image build stage and never executed any e2e tests. Proper analysis requires checking if image builds succeeded before concluding that tests passed or the job was successful.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-03T07:03:45.020Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-01T10:41:56.419Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-08-08T08:38:32.988Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1760
File: jupyter/pytorch+llmcompressor/ubi9-python-3.11/Dockerfile.cuda:241-244
Timestamp: 2025-08-08T08:38:32.988Z
Learning: Policy: In opendatahub-io/notebooks, do not remove micropipenv from Python 3.11 images to avoid breaking existing users. Cleanups can deduplicate duplicate installs and remove unused uv, but micropipenv must remain present in 3.11 images.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-08-07T12:38:55.700Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1720
File: jupyter/rocm/pytorch/ubi9-python-3.11/requirements.txt:2466-2621
Timestamp: 2025-08-07T12:38:55.700Z
Learning: rpds-py version 0.27.0 provides manylinux wheels for Python 3.11 (cp311) and Python 3.12 (cp312) on PyPI, so it does not require a Rust build during image creation in opendatahub-io/notebooks.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu
  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-10T15:02:13.228Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. Issue #1364 was created with comprehensive problem description covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-10T15:02:13.228Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. Issue was created with comprehensive problem description covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-08-06T15:22:17.997Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:17-18
Timestamp: 2025-08-06T15:22:17.997Z
Learning: jiridanek requested GitHub issue creation for micropipenv removal from Python 3.12 images during PR #968 review. Analysis revealed 15 affected Dockerfiles with only 1 legitimate usage (jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda using micropipenv install --dev) while 14 images install micropipenv unnecessarily. Issue #1685 was created with comprehensive problem description covering ~238MB total waste across unused installations, detailed scope analysis, multiple solution options, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-08T19:21:11.512Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/utils/pip.conf:3-4
Timestamp: 2025-07-08T19:21:11.512Z
Learning: jiridanek requested GitHub issue creation for PYTHONPATH configuration investigation in runtime images during PR #1333 review. Issue #1339 was created with comprehensive problem description covering pip.conf target directory configuration, missing PYTHONPATH exports, potential runtime import failures, multiple investigation areas (PYTHONPATH auditing, pip configuration consistency, runtime testing), solution options with implementation guidance, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-08-01T16:07:58.701Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:138-140
Timestamp: 2025-08-01T16:07:58.701Z
Learning: jiridanek prefers architectural solutions that eliminate problems entirely rather than just fixing immediate technical issues. When presented with a pipeline safety concern about micropipenv requirements generation, he suggested removing micropipenv from the build process altogether by using pre-committed requirements.txt files, demonstrating preference for simplification and deterministic builds over complex workarounds.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-06-28T13:57:48.243Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-28T13:57:48.243Z
Learning: Pipenv automatically detects non-interactive terminal environments (like GitHub Actions CI) and disables spinner animations without requiring PIPENV_NOSPIN=1. It checks for TTY connection and common CI environment variables to determine this behavior.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-08T19:07:58.135Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:132-134
Timestamp: 2025-07-08T19:07:58.135Z
Learning: jiridanek requested GitHub issue creation for Docker build robustness problem with rm glob patterns during PR #1306 review. Issue was created covering 11 affected Dockerfiles with fragile rm commands that fail on empty globs, comprehensive problem description, multiple solution options, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-10T15:39:23.433Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:39:23.433Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. Issue #1373 was successfully created on the second attempt with comprehensive problem description covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-08-06T15:19:32.488Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: runtimes/rocm-pytorch/ubi9-python-3.12/Dockerfile.rocm:79-88
Timestamp: 2025-08-06T15:19:32.488Z
Learning: jiridanek corrected CodeRabbit's false assessment about uv pip install --index-strategy=unsafe-best-match flag during PR #968 review. The flag controls index selection strategy for choosing between multiple package registries (like PyPI and supplementary registries), not hash verification behavior as incorrectly suggested by CodeRabbit.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
📚 Learning: 2025-07-09T12:31:02.033Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T12:31:02.033Z
Learning: jiridanek requested GitHub issue creation for MSSQL repo file hardcoding problem during PR #1320 review. Issue #1357 was created with comprehensive problem description covering hardcoded x86_64 MSSQL repo files breaking multi-architecture builds across 6 affected Dockerfiles, detailed root cause analysis, three solution options with code examples, clear acceptance criteria, implementation guidance following established multi-architecture patterns, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
🪛 Checkov (3.2.334)
runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu

[MEDIUM] 14-23: Ensure that packages with untrusted or missing GPG signatures are not used by dnf, tdnf, or yum via the '--nogpgcheck' option

(CKV2_DOCKER_9)

runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu

[MEDIUM] 14-24: Ensure that packages with untrusted or missing GPG signatures are not used by dnf, tdnf, or yum via the '--nogpgcheck' option

(CKV2_DOCKER_9)
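For context, a hedged sketch of a dnf layer that satisfies CKV2_DOCKER_9 (package names and cache path are illustrative; the cache mount assumes BuildKit): the point is simply to avoid --nogpgcheck so repository GPG verification stays at its enabled default.

# GPG checks remain at the repo defaults; the cache mount only speeds up
# repeated builds and does not bypass signature verification.
RUN --mount=type=cache,target=/var/cache/dnf \
    dnf install -y --setopt=keepcache=1 gcc gcc-c++ cmake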

🪛 GitHub Actions: Code static analysis
runtimes/datascience/ubi9-python-3.11/Dockerfile.cpu

[warning] 26-26: Hadolint SC2016 info: Expressions don't expand in single quotes, use double quotes for that.


[warning] 65-65: Hadolint DL3002 warning: Last USER should not be root.


[warning] 129-129: Hadolint DL3002 warning: Last USER should not be root.

runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu

[warning] 27-27: Hadolint SC2016 info: Expressions don't expand in single quotes, use double quotes for that.


[warning] 70-70: Hadolint DL3002 warning: Last USER should not be root.


[warning] 140-140: Hadolint DL3002 warning: Last USER should not be root.
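A minimal sketch of the DL3002 remediation (the package set is illustrative; 1001 is the conventional non-root user in UBI9 Python images): perform privileged steps as root, then switch back before the stage ends.

USER 0
RUN dnf install -y rust cargo && dnf clean all   # privileged build step
# DL3002: the last USER in the stage must be non-root.
USER 1001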

⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
  • GitHub Check: Red Hat Konflux / odh-pipeline-runtime-datascience-cpu-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-pipeline-runtime-datascience-cpu-py311-ubi9-on-pull-request
  • GitHub Check: build (runtime-datascience-ubi9-python-3.11, 3.11, linux/s390x, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build

@@ -3,13 +3,48 @@
####################
FROM registry.access.redhat.com/ubi9/python-312:latest AS base

⚠️ Potential issue

Pin the base image – avoid :latest.

Using ubi9/python-312:latest makes rebuilds non-deterministic and may silently pull in incompatible minor releases. Please pin to a specific digest or tag (tracked in issue #1242).

🤖 Prompt for AI Agents
In runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu at line 4, the base
image is currently using the `:latest` tag, which can cause non-deterministic
builds and potential incompatibilities. Replace `ubi9/python-312:latest` with a
specific, fixed tag or digest version to ensure consistent and reproducible
builds. Check the relevant issue #1242 for the recommended pinned version to
use.
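A sketch of the suggested pinning (the digest below is a placeholder, not a real value — resolve the current one with skopeo inspect or podman pull before committing):

FROM registry.access.redhat.com/ubi9/python-312@sha256:<digest> AS base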

Nash-123 commented Aug 8, 2025

Hi @atheo89 / @jiridanek,
I’ve rebased this PR with the latest changes.
If everything looks good from your side, could you please approve it?
Thanks!

@openshift-ci openshift-ci bot added size/l and removed size/l labels Aug 8, 2025
@openshift-ci openshift-ci bot added size/xl and removed size/l labels Aug 8, 2025
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 3

🔭 Outside diff range comments (1)
runtimes/datascience/ubi9-python-3.12/requirements.txt (1)

458-485: Bump cryptography to 44.0.1 to address CVE-2024-12797; retain s390x guard

We’ve confirmed that version 44.0.1 lacks any s390x or “py3-none-any” wheels on PyPI, so the existing
platform_machine != 's390x' condition must stay in place.

Please update and re-lock as follows:

• In Pipfile (and requirements.txt if duplicated), change
  cryptography = "==43.0.3; platform_machine != 's390x'"
  to
  cryptography = "==44.0.1; platform_machine != 's390x'"

• Run pipenv lock (or equivalent) to regenerate your lockfile.

♻️ Duplicate comments (3)
runtimes/datascience/ubi9-python-3.12/requirements.txt (3)

2373-2375: Pin ‘six’ to 1.16.0 (1.17.0 is a phantom pin) and drop the s390x gate

Upstream's latest release is 1.16.0, so pinning a phantom 1.17.0 harms reproducibility. There is also no reason to gate six on s390x.

-six==1.17.0; platform_machine != 's390x' \
+six==1.16.0 \

2124-2126: OAuth libs and rich: remove s390x gating (universal wheels)

requests-oauthlib, oauthlib, and rich are pure-Python; drop the markers for parity.

Also applies to: 2127-2129


206-208: Cachetools/cffi: remove s390x gating; wheels exist

  • cachetools is py3-none-any (universal).
  • cffi 1.17.1 publishes manylinux s390x wheels.

Drop the platform gate to avoid needless feature loss on s390x. Update Pipfile and re-lock.
-cachetools==5.5.2; platform_machine != 's390x' \
+cachetools==5.5.2 \
-cffi==1.17.1; platform_machine != 's390x' \
+cffi==1.17.1 \

Also applies to: 212-279

🧹 Nitpick comments (1)
runtimes/datascience/ubi9-python-3.12/requirements.txt (1)

1197-1224: Confirm libxcrypt-compat is installed in Dockerfile for mysql-connector-python

mysql-connector-python on Py3.12 UBI9 needs libxcrypt-compat for MySQL SASL2 plugin auth. Ensure the Dockerfile installs it.

I can raise a follow-up PR to add:

  • dnf install -y libxcrypt-compat
📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 12c97e9 and c589816.

⛔ Files ignored due to path filters (2)
  • jupyter/trustyai/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
  • runtimes/datascience/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
📒 Files selected for processing (2)
  • jupyter/trustyai/ubi9-python-3.12/requirements.txt (2 hunks)
  • runtimes/datascience/ubi9-python-3.12/requirements.txt (24 hunks)
🧰 Additional context used
🧠 Learnings (48)
📓 Common learnings
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-08-05T17:24:08.616Z
Learning: jiridanek requested PR review for #1521 covering s390x architecture support improvements, demonstrating continued focus on systematic multi-architecture compatibility enhancements in the opendatahub-io/notebooks repository through clean implementation with centralized configuration, proper CI integration, and architecture-aware testing patterns.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:50-50
Timestamp: 2025-07-08T19:30:01.738Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in ROCm TensorFlow image during PR #1333 review. Issue #1346 was created with comprehensive problem description covering hardcoded x86_64 architecture breaking multi-arch support, detailed impact analysis, three solution options (runtime detection, BuildKit TARGETARCH integration, hybrid approach) with pros/cons analysis, comprehensive acceptance criteria covering core requirements and testing, phased implementation guidance, related files identification, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T12:31:02.033Z
Learning: jiridanek requested GitHub issue creation for MSSQL repo file hardcoding problem during PR #1320 review. Issue #1363 was created and updated with comprehensive problem description covering hardcoded x86_64 MSSQL repo files breaking multi-architecture builds across 10 affected Dockerfiles (including datascience, CUDA, ROCm, and TrustyAI variants), detailed root cause analysis, three solution options with code examples, clear acceptance criteria for all image types, implementation guidance following established multi-architecture patterns, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:38-38
Timestamp: 2025-07-08T19:30:20.513Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in TensorFlow CUDA runtime image during PR #1333 review. Issue was created with comprehensive problem description covering hardcoded NVARCH limitation, multiple solution options using TARGETARCH build argument with architecture mapping, acceptance criteria for multi-architecture builds, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T12:29:56.162Z
Learning: jiridanek requested GitHub issue creation for OpenShift client architecture mapping problem affecting 29 Dockerfiles during PR #1320 review. Issue was created with comprehensive analysis covering all affected files using $(uname -m) returning 'aarch64' but OpenShift mirror expecting 'arm64', systematic solution using BuildKit TARGETARCH mapping with proper amd64→x86_64 and arm64→arm64 conversion, detailed acceptance criteria, and implementation guidance, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
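A hedged sketch of the TARGETARCH mapping this learning refers to (the mirror URL shape is assumed from the public OpenShift mirror layout, not taken from the PR):

ARG TARGETARCH
RUN case "${TARGETARCH}" in \
      amd64) OC_ARCH="x86_64" ;; \
      arm64) OC_ARCH="arm64" ;; \
      *)     echo "unsupported arch: ${TARGETARCH}" >&2; exit 1 ;; \
    esac && \
    curl -fsSL -o /tmp/oc.tgz \
      "https://mirror.openshift.com/pub/openshift-v4/${OC_ARCH}/clients/ocp/stable/openshift-client-linux.tar.gz" && \
    tar -xzf /tmp/oc.tgz -C /usr/local/bin oc && \
    rm /tmp/oc.tgz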
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-09T08:07:24.937Z
Learning: jiridanek requested GitHub issue creation for tensorflow_rocm Python 3.12 compatibility problem during PR #1333 review. Issue #1354 was successfully created with comprehensive problem description covering missing cp312 wheels causing build failures, three solution options (upstream TensorFlow, Python 3.11 only, custom build), clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py:619-626
Timestamp: 2025-07-08T19:33:14.340Z
Learning: jiridanek requested GitHub issue creation for Python 3.12 version check bug in bootstrapper.py during PR #1333 review. Issue #1348 was created with comprehensive problem description covering version check exclusion affecting all Python 3.12 runtime images, detailed impact analysis of bootstrapper execution failures, clear solution with code examples, affected files list including all 6 runtime bootstrapper copies, acceptance criteria for testing and verification, implementation notes about code duplication and upstream reporting, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:39:23.433Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. Issue #1373 was successfully created on the second attempt with comprehensive problem description covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.
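A minimal sketch of that declaration pattern (image tag is illustrative): TARGETARCH is available implicitly to FROM instructions, but any stage that consumes it must re-declare it.

FROM registry.access.redhat.com/ubi9/python-312:latest AS builder
ARG TARGETARCH
RUN echo "building for ${TARGETARCH}"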
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1720
File: jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt:2659-2680
Timestamp: 2025-08-07T12:41:48.997Z
Learning: For opendatahub-io/notebooks, rpds-py 0.27.0 provides manylinux wheels for Python 3.11 and 3.12 on x86_64 and aarch64, so no Rust build is needed for these platforms. For s390x and ppc64le, wheels are not available, so a Rust build stage or version pinning is required if those images are built.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: jupyter/minimal/ubi9-python-3.11/Dockerfile.cuda:29-38
Timestamp: 2025-07-04T17:07:52.656Z
Learning: In the opendatahub-io/notebooks repository, modern Docker with BuildKit automatically provides build arguments like TARGETARCH in the global scope for FROM instructions, but these arguments must be explicitly declared with ARG statements inside build stages where they will be used. The ARG declaration should be placed within the stage that uses it, not moved to the global scope, as this is the correct pattern for modern Docker/Podman/Buildah environments.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1720
File: jupyter/rocm/pytorch/ubi9-python-3.11/requirements.txt:2466-2621
Timestamp: 2025-08-07T12:38:55.700Z
Learning: rpds-py version 0.27.0 provides manylinux wheels for Python 3.11 (cp311) and Python 3.12 (cp312) on PyPI, so it does not require a Rust build during image creation in opendatahub-io/notebooks.
📚 Learning: 2025-06-28T14:15:41.168Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-06-28T14:21:09.429Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-06-28T14:21:09.429Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-06-28T14:15:41.168Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: The jupyter-bokeh package was previously pinned to version 3.0.5 in the TrustyAI notebook image due to compatibility requirements with TrustyAI components, as indicated by the comment "Should be pinned down to this version in order to be compatible with trustyai" that was removed in this update.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-06-28T14:21:09.429Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: The jupyter-bokeh pinning to 3.0.5 in TrustyAI notebook image was not due to TrustyAI code compatibility issues, but because the trustyai package itself explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency, causing pip dependency resolution conflicts when trying to upgrade to jupyter-bokeh 4.x.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-23T16:18:42.922Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm=~=2.18.1. The solution requires updating Pipfile sources to include https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-08T19:09:48.746Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-24T12:01:45.188Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-16T00:17:10.313Z
Learnt from: grdryn
PR: opendatahub-io/notebooks#1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-08T19:09:48.746Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review. Issue was created with comprehensive problem description covering both Python 3.11 and 3.12 versions, repository pattern analysis showing correct vs incorrect naming, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-08T19:26:17.140Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-09T08:12:05.822Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/minimal/ubi9-python-3.12/Pipfile:23-27
Timestamp: 2025-07-09T08:12:05.822Z
Learning: jiridanek corrected CodeRabbit's false positive about urllib3/requests version incompatibility in PR #1333 review. Investigation revealed that requests 2.32.3 actually requires `urllib3<3,>=1.21.1` (not `urllib3<2` as incorrectly stated), making urllib3 2.5.0 perfectly compatible. This demonstrates the importance of verifying technical assessments against actual build results and PyPI metadata before creating issues.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-04T06:05:30.580Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-88
Timestamp: 2025-07-04T06:05:30.580Z
Learning: jiridanek requested GitHub issue creation for TrustyAI test notebook URL configurability and network error handling improvements during PR #1306 review. Issue #1323 was created with ⚠️ emoji in title for visibility, comprehensive problem description covering incorrect hardcoded URLs (pointing to Python 3.11 instead of 3.12), missing network error handling, maintenance burden, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-03T13:59:55.040Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/requirements.txt:435-444
Timestamp: 2025-07-03T13:59:55.040Z
Learning: jiridanek requested GitHub issue creation for numpy/scipy compatibility investigation in PR #1269, specifically for cases where theoretical version conflicts don't manifest as actual build failures. Issue #1297 was created with comprehensive investigation framework, acceptance criteria, runtime testing approach, and proper context linking, demonstrating the pattern of creating investigation-type issues for apparent but non-blocking technical concerns that require deeper understanding.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-08-07T12:37:47.314Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1720
File: jupyter/trustyai/ubi9-python-3.12/requirements.txt:546-572
Timestamp: 2025-08-07T12:37:47.314Z
Learning: In the opendatahub-io/notebooks repository, strict alignment of debugpy versions across all images is not enforced as a policy. Version drift is tolerated and dependency versions are primarily managed by Pipenv locking. Alignment is only considered when inconsistencies are surfaced by tooling or cause issues.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-08-08T08:38:32.988Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1760
File: jupyter/pytorch+llmcompressor/ubi9-python-3.11/Dockerfile.cuda:241-244
Timestamp: 2025-08-08T08:38:32.988Z
Learning: Policy: In opendatahub-io/notebooks, do not remove micropipenv from Python 3.11 images to avoid breaking existing users. Cleanups can deduplicate duplicate installs and remove unused uv, but micropipenv must remain present in 3.11 images.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-03T07:03:45.020Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-01T10:41:56.419Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-08-08T14:57:20.636Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1782
File: manifests/overlays/additional/jupyter-trustyai-cpu-py312-ubi9-imagestream.yaml:1-1
Timestamp: 2025-08-08T14:57:20.636Z
Learning: In opendatahub-io/notebooks, when validating imagestream manifest dependency versions, compare manifest major.minor values against exact pinned versions in requirements.txt/Pipfile.lock for the same image; manifests intentionally omit patch numbers (project convention).

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-08-07T12:38:55.700Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1720
File: jupyter/rocm/pytorch/ubi9-python-3.11/requirements.txt:2466-2621
Timestamp: 2025-08-07T12:38:55.700Z
Learning: rpds-py version 0.27.0 provides manylinux wheels for Python 3.11 (cp311) and Python 3.12 (cp312) on PyPI, so it does not require a Rust build during image creation in opendatahub-io/notebooks.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-20T20:47:36.509Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-03T07:03:45.020Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-03T07:05:33.329Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-03T07:05:33.329Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-22T14:31:20.315Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1463
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:15-15
Timestamp: 2025-07-22T14:31:20.315Z
Learning: PyTorch 2.6.0 wheels with CUDA 12.4 support (torch-2.6.0+cu124) are available on the official PyTorch CUDA index at https://download.pytorch.org/whl/cu124/ for Python versions 3.9, 3.10, 3.11, 3.12, and 3.13 on both Linux x86_64 and Windows AMD64 platforms.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.12/requirements.txt
  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-08-07T12:39:01.997Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1720
File: jupyter/tensorflow/ubi9-python-3.12/requirements.txt:1531-1558
Timestamp: 2025-08-07T12:39:01.997Z
Learning: In opendatahub-io/notebooks, the ROCm TensorFlow Python 3.12 UBI9 image (runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm) was missing libxcrypt-compat, which is required for MySQL SASL2 plugin authentication with mysql-connector-python==9.3.0 on Python 3.12 UBI9. Issue #1722 was created to track this, following the established pattern for systematic dependency consistency and runtime compatibility across all Python 3.12 UBI9 images.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-08T19:28:15.791Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-08-06T15:33:47.175Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:17-18
Timestamp: 2025-08-06T15:33:47.175Z
Learning: During PR #968 review, CodeRabbit initially incorrectly identified 1 legitimate micropipenv usage in jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda when analyzing Python 3.12 images for unused dependencies. Upon jiridanek's request for re-verification, comprehensive analysis revealed all 15 Python 3.12 Dockerfiles install micropipenv but none actually use it, making the cleanup scope 100% unnecessary installations with no exceptions to handle.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-08T19:28:15.791Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1340 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, affected files analysis including de-vendor scripts and pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-16T00:17:10.313Z
Learnt from: grdryn
PR: opendatahub-io/notebooks#1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: When checking PyPI wheel availability for aarch64 architecture, the actual wheel filenames use "manylinux2014_aarch64" pattern, not "linux_aarch64". This caused incorrect analysis in PR #1396 where CodeRabbit falsely claimed CUDA companion packages lacked aarch64 wheels when searching with the wrong pattern. The correct approach is to search for "manylinux2014_aarch64" or use a more generic "aarch64" pattern when verifying wheel availability for ARM64 compatibility.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-11T11:54:28.202Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:54:28.202Z
Learning: In opendatahub-io/notebooks, Python 3.12-based images (e.g., runtime-cuda-pytorch-ubi9-python-3.12) may fail container runtime tests with "libcrypt.so.1 => not found" for MySQL SASL2 plugin libraries if `libxcrypt-compat` is missing. The solution is to install `libxcrypt-compat` in the Dockerfile.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-08-01T16:10:21.049Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:17-18
Timestamp: 2025-08-01T16:10:21.049Z
Learning: jiridanek requested GitHub issue creation for version pinning improvement of micropipenv and uv packages during PR #968 review. Issue #1595 was created with comprehensive problem description covering build reproducibility risks from unpinned package versions across 18 affected Dockerfiles, two solution options (direct pinning vs build arguments), implementation approach with testing requirements, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-08-06T15:22:17.997Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:17-18
Timestamp: 2025-08-06T15:22:17.997Z
Learning: jiridanek requested GitHub issue creation for micropipenv removal from Python 3.12 images during PR #968 review. Analysis revealed 15 affected Dockerfiles with only 1 legitimate usage (jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda using micropipenv install --dev) while 14 images install micropipenv unnecessarily. Issue #1685 was created with comprehensive problem description covering ~238MB total waste across unused installations, detailed scope analysis, multiple solution options, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-08-05T17:24:08.616Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-08-05T17:24:08.616Z
Learning: jiridanek requested PR review for #1521 covering s390x architecture support improvements, demonstrating continued focus on systematic multi-architecture compatibility enhancements in the opendatahub-io/notebooks repository through clean implementation with centralized configuration, proper CI integration, and architecture-aware testing patterns.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-21T15:06:04.114Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-21T15:06:04.114Z
Learning: jiridanek requested GitHub issue creation for multi-platform dependency locking investigation during PR #1396 review. Issue #1423 was successfully created with comprehensive problem description covering ARM64 wheel availability but being ignored due to AMD64-only dependency locking, root cause analysis of platform-specific pipenv limitations, immediate conditional installation solution, multi-platform locking ecosystem analysis, broader affected areas investigation, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-08T19:35:49.482Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-08T19:35:49.482Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. After an initial failed attempt, issue #1349 was successfully created with comprehensive problem description covering maintenance overhead and consistency risks from duplicate implementations across 5 Python 3.12 runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-08-07T12:41:48.997Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1720
File: jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt:2659-2680
Timestamp: 2025-08-07T12:41:48.997Z
Learning: For opendatahub-io/notebooks, rpds-py 0.27.0 provides manylinux wheels for Python 3.11 and 3.12 on x86_64 and aarch64, so no Rust build is needed for these platforms. For s390x and ppc64le, wheels are not available, so a Rust build stage or version pinning is required if those images are built.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-01T14:36:52.852Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-06-28T14:13:27.890Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/datascience/ubi9-python-3.11/Pipfile:34-36
Timestamp: 2025-06-28T14:13:27.890Z
Learning: In the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pattern: core `jupyterlab` package uses exact pinning (==) across all notebook images to ensure UI consistency, while JupyterLab extensions and all server components (jupyter-server, jupyter-server-proxy, jupyter-server-terminals) use compatible release (~=) pinning to allow automatic security updates and bug fixes while maintaining API compatibility.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-08-08T14:48:51.662Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1782
File: manifests/overlays/additional/jupyter-trustyai-cpu-py312-ubi9-imagestream.yaml:28-28
Timestamp: 2025-08-08T14:48:51.662Z
Learning: In opendatahub-io/notebooks, imagestream manifests (e.g., manifests/*-imagestream.yaml) intentionally list dependency versions in opendatahub.io/notebook-python-dependencies as major.minor only (omit patch). Review suggestions should not ask to change these to exact patch versions; keep major.minor to match project convention.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-01T10:41:56.419Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with `extras = ["and-cuda"]` can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-14T15:36:29.815Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-14T15:36:29.815Z
Learning: ROCm is not supported by upstream TensorFlow, unlike CUDA. ROCm support requires the specialized tensorflow_rocm package, which creates a dependency challenge for Python 3.12 environments since tensorflow_rocm lacks Python 3.12 wheels while regular TensorFlow lacks ROCm support.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-02T18:19:23.024Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-08-02T15:19:34.383Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1607
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/kustomization.yaml:2-2
Timestamp: 2025-08-02T15:19:34.383Z
Learning: The Red Hat OpenShift AI (RHOAI) supported configurations officially require OpenShift 4.14+ for all currently supported RHOAI versions (2.22, 2.19, 2.16), which corresponds to kubectl 1.27+. This means all supported RHOAI deployments will show Kustomize API deprecation warnings when using v1alpha1 API version, making the update to v1 essential rather than optional. The official support matrix is documented at https://access.redhat.com/articles/rhoai-supported-configs.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-04T10:41:13.061Z
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-06-26T15:28:35.416Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1197
File: runtimes/minimal/ubi9-python-3.11/requirements.txt:395-405
Timestamp: 2025-06-26T15:28:35.416Z
Learning: psutil version 7.x is compatible with UBI9, CentOS Stream 9, and RHEL 9 platforms in the opendatahub-io/notebooks repository. The upgrade from psutil 5.x to 7.x has been validated for these environments.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
📚 Learning: 2025-07-29T15:56:20.036Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1519
File: jupyter/pytorch+llmcompressor/ubi9-python-3.11/runtime-images/llmcompressor-pytorch-ubi9-py311.json:2-9
Timestamp: 2025-07-29T15:56:20.036Z
Learning: Current practice for runtime-images JSON files in opendatahub-io/notebooks: All runtime image configurations use SHA256 digest pinning (format: `sha256:<digest>`) instead of mutable tags for immutability, use `quay.io/opendatahub/workbench-images` registry, include `schema_name: "runtime-image"` field, and follow consistent structure as seen in existing files like datascience-ubi9-py311.json, pytorch-ubi9-py311.json, etc.

Applied to files:

  • runtimes/datascience/ubi9-python-3.12/requirements.txt
🔇 Additional comments (6)
jupyter/trustyai/ubi9-python-3.12/requirements.txt (2)

2835-2850: safetensors 0.6.2 lacks prebuilt cp312 wheels for s390x – pip will fall back to a Rust build and fail without a Rust/cargo toolchain

Verified (piwheels.org, wheelodex.org): version 0.6.2 does not publish a manylinux_2_17_s390x wheel. If TrustyAI images ever target s390x, you must either skip installing safetensors on that arch or install Rust/cargo in the Dockerfile.

Possible fixes:

  • In jupyter/trustyai/ubi9-python-3.12/requirements.txt (~line 2835):
    -safetensors==0.6.2; python_version >= '3.9'
    +safetensors==0.6.2; python_version >= '3.9' and platform_machine != 's390x'
  • Or in the TrustyAI Dockerfile, install Rust/cargo only on s390x before pip install (pattern matching the pyarrow s390x builder):
RUN if [ "$(uname -m)" = "s390x" ]; then \
      yum install -y rust cargo; \
    fi

Please confirm whether your CI/build matrix for TrustyAI includes s390x targets.
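If helpful, the same PyPI-JSON check used for paramiko later in this review can confirm the wheel gap (a sketch assuming curl and jq are available; version pinned to the one in this file):

#!/bin/bash
set -e
# List published wheels for safetensors 0.6.2, then filter for s390x or
# universal ("py3-none-any") builds; an empty filtered list means pip must
# compile from source (requiring Rust/cargo) on s390x.
curl -s "https://pypi.org/pypi/safetensors/0.6.2/json" \
 | jq -r '.urls[].filename'
curl -s "https://pypi.org/pypi/safetensors/0.6.2/json" \
 | jq -r '.urls[].filename | select(test("s390x") or test("py3-none-any"))'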


935-937: Confirm huggingface-hub 0.34.4 bump and verify HF ecosystem compatibility

The pin for huggingface-hub==0.34.4 (lines 935–937) in jupyter/trustyai/ubi9-python-3.12/requirements.txt aligns with our N-1 version-freeze policy, but downstream HF libraries can be sensitive to hub API changes. Please verify:

• Pipfile alignment
Check that jupyter/trustyai/ubi9-python-3.12/Pipfile is updated to huggingface-hub = "==0.34.4".
• Consistency across HF packages in this image
Ensure transformers, datasets, accelerate and diffusers remain compatible with 0.34.4.

Run:

#!/bin/bash
set -euo pipefail

echo "→ huggingface-hub in Python 3.12 TrustyAI Pipfile:"
rg -n 'huggingface-hub' -S jupyter/trustyai/ubi9-python-3.12/Pipfile

echo -e "\n→ HF ecosystem pins in Python 3.12 TrustyAI requirements:"
rg -n --no-heading -S '^(transformers|datasets|accelerate|diffusers)' \
  jupyter/trustyai/ubi9-python-3.12/requirements.txt
runtimes/datascience/ubi9-python-3.12/requirements.txt (4)

376-378: codeflare-sdk gating is appropriate

This package isn’t supported on s390x; keeping the marker aligns with PR goals.


1655-1663: py-spy gating is appropriate

py-spy ships platform-specific wheels/binaries and does not support s390x; gating is correct.


1664-1707: pyarrow gating aligns with s390x-builder workflow

Excluding here while building/installing via the dedicated s390x stage in the Dockerfile is the right pattern.
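For context, a simplified bash sketch of what that dedicated stage amounts to (the actual Dockerfile stage is authoritative; the package set, flags, and --no-binary approach here are assumptions, and a real pyarrow source build also needs the Arrow C++ toolchain):

# Run only on s390x: install a build toolchain, then force pyarrow to build
# from source instead of looking for a (nonexistent) s390x wheel.
if [ "$(uname -m)" = "s390x" ]; then
    dnf install -y gcc gcc-c++ cmake rust cargo   # assumed build dependencies
    export CFLAGS="-O3"                           # assumed optimization flag
    pip install --no-binary pyarrow "pyarrow~=21.0.0"
fi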


2093-2117: ray gating is appropriate

ray doesn’t support s390x; marker is correct.

Comment on lines +382 to 384
comm==0.2.3; platform_machine != 's390x' \
--hash=sha256:2dc8048c10962d55d7ad693be1e7045d891b7ce8d999c97963a5e3e99c055971 \
--hash=sha256:c615d91d75f7f04f095b30d1c1711babd43bdc6419c1be9886a85f2f4e489417

🛠️ Refactor suggestion

Remove unnecessary s390x gating for pure-Python packages

These are universal wheels or pure-Python; gating them creates needless drift and breaks expected DS UX on s390x (e.g., ipywidgets). Drop the platform_machine != 's390x' markers and keep parity. Edit Pipfile accordingly and re-lock so this file regenerates cleanly.

Example diffs (apply at source in Pipfile, then re-lock; shown here for illustration):

-comm==0.2.3; platform_machine != 's390x' \
+comm==0.2.3 \
-google-auth==2.40.3; platform_machine != 's390x' \
+google-auth==2.40.3 \
-ipywidgets==8.1.2; platform_machine != 's390x' \
+ipywidgets==8.1.2 \
-jupyterlab-widgets==3.0.15; platform_machine != 's390x' \
+jupyterlab-widgets==3.0.15 \
-kubernetes==33.1.0; platform_machine != 's390x' \
+kubernetes==33.1.0 \
-markdown-it-py==3.0.0; platform_machine != 's390x' \
+markdown-it-py==3.0.0 \
-mdurl==0.1.2; platform_machine != 's390x' \
+mdurl==0.1.2 \
-oauthlib==3.3.1; platform_machine != 's390x' \
+oauthlib==3.3.1 \
-openshift-client==1.0.18; platform_machine != 's390x' \
+openshift-client==1.0.18 \
-pyasn1==0.6.1; platform_machine != 's390x' \
+pyasn1==0.6.1 \
-pyasn1-modules==0.4.2; platform_machine != 's390x' \
+pyasn1-modules==0.4.2 \
-pycparser==2.22; platform_machine != 's390x' \
+pycparser==2.22 \
-pydantic==1.10.22; platform_machine != 's390x' \
+pydantic==1.10.22 \
-requests-oauthlib==2.0.0; platform_machine != 's390x' \
+requests-oauthlib==2.0.0 \
-rich==13.9.4; platform_machine != 's390x' \
+rich==13.9.4 \
-rsa==4.9.1; platform_machine != 's390x' \
+rsa==4.9.1 \
-websocket-client==1.8.0; platform_machine != 's390x' \
+websocket-client==1.8.0 \
-widgetsnbextension==4.0.14; platform_machine != 's390x' \
+widgetsnbextension==4.0.14 \
-durationpy==0.10; platform_machine != 's390x' \
+durationpy==0.10 \
-executing==1.2.0; platform_machine != 's390x' \
+executing==1.2.0 \
-invoke==2.2.0; platform_machine != 's390x' \
+invoke==2.2.0 \

Also applies to: 697-699, 773-775, 803-805, 890-892, 893-895, 1017-1019, 1296-1298, 1340-1342, 1708-1710, 1711-1713, 1714-1716, 1760-1760, 2124-2126, 2127-2129, 2286-2288, 2434-2436, 2440-2442, 528-530, 534-536, 758-760

🤖 Prompt for AI Agents
In runtimes/datascience/ubi9-python-3.12/requirements.txt around lines 382 to
384, remove the platform_machine != 's390x' marker from the comm package line
since it is a pure-Python package and does not require architecture gating.
Then, update the Pipfile to remove the same gating for this and other similar
packages, and re-lock dependencies to regenerate this requirements.txt file
cleanly without unnecessary platform restrictions.
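After editing the Pipfile, the lock should be regenerated in a container so host-OS resolution doesn't leak into the lock file; a sketch of that workflow (the image, mount path, and flags are assumptions):

# Re-lock inside a UBI9 Python 3.12 container for cross-platform consistency.
podman run --rm --platform=linux/amd64 \
  -v "$PWD/runtimes/datascience/ubi9-python-3.12:/work:Z" -w /work \
  registry.access.redhat.com/ubi9/python-312 \
  bash -c "pip install pipenv && pipenv lock"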

Comment on lines +773 to 775
ipywidgets==8.1.2; platform_machine != 's390x' \
--hash=sha256:bbe43850d79fb5e906b14801d6c01402857996864d1e5b6fa62dd2ee35559f60 \
--hash=sha256:d0b9b41e49bae926a866e613a39b0f0097745d2b9f1f3dd406641b4a57ec42c9

⚠️ Potential issue

Don’t gate ipywidgets/jupyterlab-widgets — breaks notebooks UX on s390x

These are pure-Python + front-end assets and expected in DS images. Gating makes s390x images materially different. Remove the markers.

Also applies to: 803-805

🤖 Prompt for AI Agents
In runtimes/datascience/ubi9-python-3.12/requirements.txt around lines 773 to
775 and also lines 803 to 805, remove the platform_machine != 's390x' markers
gating ipywidgets and jupyterlab-widgets. These packages should be included
unconditionally to ensure consistent notebook UX across all architectures,
including s390x.

Comment on lines +1395 to 1397
paramiko==4.0.0; platform_machine != 's390x' \
--hash=sha256:0e20e00ac666503bf0b4eda3b6d833465a2b7aff2e2b3d79a8bba5ef144ee3b9 \
--hash=sha256:6a25f07b380cc9c9a88d2b920ad37167ac4667f8d9886ccebd8f90f654b5d69f

💡 Verification agent

🧩 Analysis chain

Paramiko and its crypto deps: verify s390x wheel availability before gating

Paramiko depends on cryptography, bcrypt, and pynacl. If s390x wheels exist for these versions, remove the gates; otherwise keep. Let’s verify before deciding.

Also applies to: 142-142, 1871-1871, 458-458


🏁 Script executed:

#!/bin/bash
set -e
for spec in \
  "paramiko==4.0.0" \
  "cryptography==43.0.3" \
  "bcrypt==4.3.0" \
  "pynacl==1.5.0"
do
  name="${spec%%==*}"; ver="${spec##*==}"
  echo "== $name $ver =="
  curl -s "https://pypi.org/pypi/${name}/${ver}/json" \
   | jq -r '.urls[].filename' | sed 's/^/  /'
  echo "  -- s390x/universal --"
  curl -s "https://pypi.org/pypi/${name}/${ver}/json" \
   | jq -r '.urls[].filename | select(test("s390x") or test("py3-none-any"))' | sed 's/^/    /'
  echo
done

Length of output: 6995


Remove s390x gating for Paramiko only

Our verification shows that Paramiko 4.0.0 ships a pure‐Python “py3-none-any” wheel (which works on s390x), but its crypto deps do not:

  • Paramiko 4.0.0 → has py3-none-any.whl ⇒ safe on s390x → remove the platform_machine != 's390x' marker.
  • Cryptography 43.0.3 → no s390x/universal wheel → continue gating.
  • bcrypt 4.3.0 → no s390x/universal wheel → continue gating.
  • PyNaCl 1.5.0 → no s390x/universal wheel → continue gating.

Locations to update (and in the other two files with identical blocks):

In runtimes/datascience/ubi9-python-3.12/requirements.txt (lines 1395–1397):

- paramiko==4.0.0; platform_machine != 's390x' \
+ paramiko==4.0.0 \
     --hash=sha256:0e20e00ac666503bf0b4eda3b6d833465a2b7aff2e2b3d79a8bba5ef144ee3b9 \
     --hash=sha256:6a25f07b380cc9c9a88d2b920ad37167ac4667f8d9886ccebd8f90f654b5d69f

Keep the existing ; platform_machine != 's390x' on the cryptography==43.0.3, bcrypt==4.3.0, and PyNaCl==1.5.0 entries until dedicated wheels for s390x are available.

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
paramiko==4.0.0; platform_machine != 's390x' \
--hash=sha256:0e20e00ac666503bf0b4eda3b6d833465a2b7aff2e2b3d79a8bba5ef144ee3b9 \
--hash=sha256:6a25f07b380cc9c9a88d2b920ad37167ac4667f8d9886ccebd8f90f654b5d69f
paramiko==4.0.0 \
--hash=sha256:0e20e00ac666503bf0b4eda3b6d833465a2b7aff2e2b3d79a8bba5ef144ee3b9 \
--hash=sha256:6a25f07b380cc9c9a88d2b920ad37167ac4667f8d9886ccebd8f90f654b5d69f
🤖 Prompt for AI Agents
In runtimes/datascience/ubi9-python-3.12/requirements.txt at lines 1395 to 1397,
remove the platform_machine != 's390x' condition from the paramiko==4.0.0 entry
since its pure-Python wheel supports s390x. Keep the platform gating on
cryptography, bcrypt, and PyNaCl dependencies unchanged until s390x wheels are
available. Make the same update in the other two files with identical blocks as
noted.
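For illustration, the top-level Pipfile shape this implies (whether each of these packages is pinned top-level here is an assumption, and pipenv will still walk paramiko's own dependency edges when locking):

paramiko     = "==4.0.0"
cryptography = { version = "==43.0.3", markers = "platform_machine != 's390x'" }
bcrypt       = { version = "==4.3.0", markers = "platform_machine != 's390x'" }
pynacl       = { version = "==1.5.0", markers = "platform_machine != 's390x'" }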

@openshift-ci openshift-ci bot added size/xl and removed size/xl labels Aug 8, 2025

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🔭 Outside diff range comments (1)
runtimes/datascience/ubi9-python-3.11/requirements.txt (1)

458-485: Upgrade cryptography – CVE-2024-12797 still open

cryptography==43.0.3 is flagged vulnerable (CVE-2024-12797). 44.0.1 is the first fixed release; please bump the pin (and regenerate hashes) instead of just excluding it for s390x.
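A minimal sketch of the resulting pin change (the marker shown mirrors the current file; the --hash lines beneath it must be regenerated by re-locking, not edited by hand):

-cryptography==43.0.3; platform_machine != 's390x' \
+cryptography==44.0.1; platform_machine != 's390x' \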

♻️ Duplicate comments (3)
runtimes/datascience/ubi9-python-3.11/requirements.txt (3)

373-375: Click downgrade previously flagged – same concern applies.

See earlier review; restore click==8.2.2 unless a hard incompatibility exists.


896-902: Markdown rendering libs removed for s390x

Removing markdown-it-py/mdurl breaks JupyterLab markdown preview on s390x. Either keep the packages (build from source) or disable the feature.

Also applies to: 1020-1022


2391-2393: six still excluded on s390x

Many libs (e.g., google-api-core pre-3.x, some AWS SDK helpers) still import six. Keeping this exclusion risks runtime import errors.
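A quick smoke test that could run inside the built s390x image to catch such gaps early (the module list is an assumed, representative set):

# Report which six-dependent modules resolve; MISSING entries signal the
# runtime import errors described above.
python3 - <<'EOF'
import importlib.util
for mod in ("six", "google.api_core"):
    try:
        found = importlib.util.find_spec(mod) is not None
    except ModuleNotFoundError:
        found = False
    print(mod, "found" if found else "MISSING")
EOF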

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between c589816 and 5108080.

⛔ Files ignored due to path filters (3)
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • jupyter/trustyai/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
  • runtimes/datascience/ubi9-python-3.11/Pipfile.lock is excluded by !**/*.lock
📒 Files selected for processing (3)
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt (3 hunks)
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt (2 hunks)
  • runtimes/datascience/ubi9-python-3.11/requirements.txt (24 hunks)
🧰 Additional context used
🧠 Learnings (50)
📓 Common learnings
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-08-05T17:24:08.616Z
Learning: jiridanek requested PR review for #1521 covering s390x architecture support improvements, demonstrating continued focus on systematic multi-architecture compatibility enhancements in the opendatahub-io/notebooks repository through clean implementation with centralized configuration, proper CI integration, and architecture-aware testing patterns.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:50-50
Timestamp: 2025-07-08T19:30:01.738Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in ROCm TensorFlow image during PR #1333 review. Issue #1346 was created with comprehensive problem description covering hardcoded x86_64 architecture breaking multi-arch support, detailed impact analysis, three solution options (runtime detection, BuildKit TARGETARCH integration, hybrid approach) with pros/cons analysis, comprehensive acceptance criteria covering core requirements and testing, phased implementation guidance, related files identification, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T12:31:02.033Z
Learning: jiridanek requested GitHub issue creation for MSSQL repo file hardcoding problem during PR #1320 review. Issue #1363 was created and updated with comprehensive problem description covering hardcoded x86_64 MSSQL repo files breaking multi-architecture builds across 10 affected Dockerfiles (including datascience, CUDA, ROCm, and TrustyAI variants), detailed root cause analysis, three solution options with code examples, clear acceptance criteria for all image types, implementation guidance following established multi-architecture patterns, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:38-38
Timestamp: 2025-07-08T19:30:20.513Z
Learning: jiridanek requested GitHub issue creation for multi-architecture support in TensorFlow CUDA runtime image during PR #1333 review. Issue was created with comprehensive problem description covering hardcoded NVARCH limitation, multiple solution options using TARGETARCH build argument with architecture mapping, acceptance criteria for multi-architecture builds, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T12:29:56.162Z
Learning: jiridanek requested GitHub issue creation for OpenShift client architecture mapping problem affecting 29 Dockerfiles during PR #1320 review. Issue was created with comprehensive analysis covering all affected files using $(uname -m) returning 'aarch64' but OpenShift mirror expecting 'arm64', systematic solution using BuildKit TARGETARCH mapping with proper amd64→x86_64 and arm64→arm64 conversion, detailed acceptance criteria, and implementation guidance, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-09T08:07:24.937Z
Learning: jiridanek requested GitHub issue creation for tensorflow_rocm Python 3.12 compatibility problem during PR #1333 review. Issue #1354 was successfully created with comprehensive problem description covering missing cp312 wheels causing build failures, three solution options (upstream TensorFlow, Python 3.11 only, custom build), clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py:619-626
Timestamp: 2025-07-08T19:33:14.340Z
Learning: jiridanek requested GitHub issue creation for Python 3.12 version check bug in bootstrapper.py during PR #1333 review. Issue #1348 was created with comprehensive problem description covering version check exclusion affecting all Python 3.12 runtime images, detailed impact analysis of bootstrapper execution failures, clear solution with code examples, affected files list including all 6 runtime bootstrapper copies, acceptance criteria for testing and verification, implementation notes about code duplication and upstream reporting, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:39:23.433Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. Issue #1373 was successfully created on the second attempt with comprehensive problem description covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1720
File: jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt:2659-2680
Timestamp: 2025-08-07T12:41:48.997Z
Learning: For opendatahub-io/notebooks, rpds-py 0.27.0 provides manylinux wheels for Python 3.11 and 3.12 on x86_64 and aarch64, so no Rust build is needed for these platforms. For s390x and ppc64le, wheels are not available, so a Rust build stage or version pinning is required if those images are built.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: jupyter/minimal/ubi9-python-3.11/Dockerfile.cuda:29-38
Timestamp: 2025-07-04T17:07:52.656Z
Learning: In the opendatahub-io/notebooks repository, modern Docker with BuildKit automatically provides build arguments like TARGETARCH in the global scope for FROM instructions, but these arguments must be explicitly declared with ARG statements inside build stages where they will be used. The ARG declaration should be placed within the stage that uses it, not moved to the global scope, as this is the correct pattern for modern Docker/Podman/Buildah environments.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1720
File: jupyter/rocm/pytorch/ubi9-python-3.11/requirements.txt:2466-2621
Timestamp: 2025-08-07T12:38:55.700Z
Learning: rpds-py version 0.27.0 provides manylinux wheels for Python 3.11 (cp311) and Python 3.12 (cp312) on PyPI, so it does not require a Rust build during image creation in opendatahub-io/notebooks.
📚 Learning: 2025-07-20T20:47:36.509Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-16T00:17:10.313Z
Learnt from: grdryn
PR: opendatahub-io/notebooks#1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: grdryn corrected CodeRabbit's false assessment about CUDA companion package wheel availability during PR #1396 review. The original analysis incorrectly checked all package releases instead of the specific versions that would be installed with tensorflow[and-cuda]~=2.19.0. The actual versions (nvidia-cudnn-cu12/9.3.0.75, nvidia-cuda-runtime-cu12/12.5.82, nvidia-cublas-cu12/12.5.3.2) do have aarch64 wheels available on PyPI, making the and-cuda extra compatible with arm64 builds.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-08-07T12:39:01.997Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1720
File: jupyter/tensorflow/ubi9-python-3.12/requirements.txt:1531-1558
Timestamp: 2025-08-07T12:39:01.997Z
Learning: In opendatahub-io/notebooks, the ROCm TensorFlow Python 3.12 UBI9 image (runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm) was missing libxcrypt-compat, which is required for MySQL SASL2 plugin authentication with mysql-connector-python==9.3.0 on Python 3.12 UBI9. Issue #1722 was created to track this, following the established pattern for systematic dependency consistency and runtime compatibility across all Python 3.12 UBI9 images.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-08-07T12:41:48.997Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1720
File: jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt:2659-2680
Timestamp: 2025-08-07T12:41:48.997Z
Learning: For opendatahub-io/notebooks, rpds-py 0.27.0 provides manylinux wheels for Python 3.11 and 3.12 on x86_64 and aarch64, so no Rust build is needed for these platforms. For s390x and ppc64le, wheels are not available, so a Rust build stage or version pinning is required if those images are built.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-08T19:28:15.791Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-11T11:54:28.202Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:54:28.202Z
Learning: In opendatahub-io/notebooks, Python 3.12-based images (e.g., runtime-cuda-pytorch-ubi9-python-3.12) may fail container runtime tests with "libcrypt.so.1 => not found" for MySQL SASL2 plugin libraries if `libxcrypt-compat` is missing. The solution is to install `libxcrypt-compat` in the Dockerfile.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-16T00:17:10.313Z
Learnt from: grdryn
PR: opendatahub-io/notebooks#1396
File: jupyter/tensorflow/ubi9-python-3.12/Pipfile:13-14
Timestamp: 2025-07-16T00:17:10.313Z
Learning: When checking PyPI wheel availability for aarch64 architecture, the actual wheel filenames use "manylinux2014_aarch64" pattern, not "linux_aarch64". This caused incorrect analysis in PR #1396 where CodeRabbit falsely claimed CUDA companion packages lacked aarch64 wheels when searching with the wrong pattern. The correct approach is to search for "manylinux2014_aarch64" or use a more generic "aarch64" pattern when verifying wheel availability for ARM64 compatibility.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-08-06T15:33:47.175Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:17-18
Timestamp: 2025-08-06T15:33:47.175Z
Learning: During PR #968 review, CodeRabbit initially incorrectly identified 1 legitimate micropipenv usage in jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda when analyzing Python 3.12 images for unused dependencies. Upon jiridanek's request for re-verification, comprehensive analysis revealed all 15 Python 3.12 Dockerfiles install micropipenv but none actually use it, making the cleanup scope 100% unnecessary installations with no exceptions to handle.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-08-05T17:24:08.616Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-08-05T17:24:08.616Z
Learning: jiridanek requested PR review for #1521 covering s390x architecture support improvements, demonstrating continued focus on systematic multi-architecture compatibility enhancements in the opendatahub-io/notebooks repository through clean implementation with centralized configuration, proper CI integration, and architecture-aware testing patterns.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-08T19:28:15.791Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1340 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, affected files analysis including de-vendor scripts and pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-09T08:12:05.822Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/minimal/ubi9-python-3.12/Pipfile:23-27
Timestamp: 2025-07-09T08:12:05.822Z
Learning: jiridanek corrected CodeRabbit's false positive about urllib3/requests version incompatibility in PR #1333 review. Investigation revealed that requests 2.32.3 actually requires `urllib3<3,>=1.21.1` (not `urllib3<2` as incorrectly stated), making urllib3 2.5.0 perfectly compatible. This demonstrates the importance of verifying technical assessments against actual build results and PyPI metadata before creating issues.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-08T19:26:17.140Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-24T12:01:45.188Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-06-28T14:15:41.168Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-23T16:18:42.922Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm=~=2.18.1. The solution requires updating Pipfile sources to include https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-08T19:34:51.825Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/utils/bootstrapper.py:135-135
Timestamp: 2025-07-08T19:34:51.825Z
Learning: jiridanek requested GitHub issue creation for subprocess.call to subprocess.run migration during PR #1333 review. Investigation revealed broader scope affecting 12 bootstrapper.py files across both Python 3.11 and 3.12 runtime images plus CI test file. A comprehensive issue was created with detailed affected files analysis covering all runtime types (tensorflow, pytorch, rocm-tensorflow, rocm-pytorch, minimal, datascience), security and error handling concerns, multiple solution options, acceptance criteria, implementation guidance, and proper context linking, demonstrating the systematic code duplication pattern from Elyra project across the entire repository.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-08T19:09:48.746Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-08T19:35:49.482Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-08T19:35:49.482Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. After an initial failed attempt, issue #1349 was successfully created with comprehensive problem description covering maintenance overhead and consistency risks from duplicate implementations across 5 Python 3.12 runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-04T05:52:49.464Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:135-136
Timestamp: 2025-07-04T05:52:49.464Z
Learning: jiridanek requested GitHub issue creation for improving fragile sed-based Jupyter kernel display_name modification in jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu during PR #1306 review. Issue #1321 was created with comprehensive problem description covering JSON corruption risks, greedy regex patterns, maintenance burden, and proposed Python-based JSON parsing solution with detailed acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
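
As a rough sketch of the Python-based alternative that issue proposes (the kernelspec path and display name below are hypothetical):

```python
import json
from pathlib import Path

# Hypothetical kernelspec location; the real path depends on the image layout.
kernel_json = Path("/opt/app-root/share/jupyter/kernels/python3/kernel.json")

# Parse, modify, and rewrite the JSON instead of patching it with sed,
# which avoids corrupting the file or over-matching with a greedy regex.
spec = json.loads(kernel_json.read_text())
spec["display_name"] = "Python 3.12 (example)"
kernel_json.write_text(json.dumps(spec, indent=1) + "\n")
```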
📚 Learning: 2025-08-01T09:33:23.033Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1521
File: tests/containers/workbenches/jupyterlab/jupyterlab_test.py:61-68
Timestamp: 2025-08-01T09:33:23.033Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection efficiency and centralization in JupyterLab tests during PR #1521 review. Issue #1576 was successfully created covering performance inefficiency of current container start/stop approach for architecture detection, need for centralized configuration of architecture-specific limitations, comprehensive solution options including session caching and helper functions, detailed acceptance criteria covering performance and maintainability requirements, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-08-08T14:48:51.662Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1782
File: manifests/overlays/additional/jupyter-trustyai-cpu-py312-ubi9-imagestream.yaml:28-28
Timestamp: 2025-08-08T14:48:51.662Z
Learning: In opendatahub-io/notebooks, imagestream manifests (e.g., manifests/*-imagestream.yaml) intentionally list dependency versions in opendatahub.io/notebook-python-dependencies as major.minor only (omit patch). Review suggestions should not ask to change these to exact patch versions; keep major.minor to match project convention.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
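
To illustrate the convention, a shortened, hypothetical imagestream excerpt (package names and versions are placeholders):

```yaml
metadata:
  annotations:
    # major.minor only, per project convention; no patch numbers here.
    opendatahub.io/notebook-python-dependencies: |
      [{"name": "JupyterLab", "version": "4.2"}, {"name": "Pandas", "version": "2.2"}]
```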
📚 Learning: 2025-07-09T12:31:02.033Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T12:31:02.033Z
Learning: jiridanek requested GitHub issue creation for MSSQL repo file hardcoding problem during PR #1320 review. Issue #1363 was created and updated with comprehensive problem description covering hardcoded x86_64 MSSQL repo files breaking multi-architecture builds across 10 affected Dockerfiles (including datascience, CUDA, ROCm, and TrustyAI variants), detailed root cause analysis, three solution options with code examples, clear acceptance criteria for all image types, implementation guidance following established multi-architecture patterns, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
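
One of the solution shapes under discussion, sketched here under the assumption that the repo should simply be skipped on architectures Microsoft does not publish packages for (verify actual per-architecture availability before adopting this):

```sh
# Gate the MSSQL repo on the build architecture instead of hardcoding x86_64.
ARCH="$(uname -m)"
case "$ARCH" in
    x86_64|aarch64)
        curl -fsSL https://packages.microsoft.com/config/rhel/9/prod.repo \
            -o /etc/yum.repos.d/mssql-release.repo
        ;;
    *)
        echo "Skipping MSSQL repo: no packages published for $ARCH"
        ;;
esac
```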
📚 Learning: 2025-07-10T15:02:13.228Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: scripts/buildinputs/dockerfile.go:35-35
Timestamp: 2025-07-10T15:02:13.228Z
Learning: jiridanek requested GitHub issue creation for improving architecture detection in buildinputs tool during PR #1320 review. The automated issue creation failed despite appearing successful, so comprehensive issue content was provided for manual creation covering qemu-user s390x build failures where runtime.GOARCH returns host architecture instead of target architecture, detailed impact analysis, four solution options (environment variable detection, configuration parameter, platform detection logic, hybrid approach), clear acceptance criteria for multi-architecture support, phased implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-08-06T15:22:17.997Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm:17-18
Timestamp: 2025-08-06T15:22:17.997Z
Learning: jiridanek requested GitHub issue creation for micropipenv removal from Python 3.12 images during PR #968 review. Analysis revealed 15 affected Dockerfiles with only 1 legitimate usage (jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda using micropipenv install --dev) while 14 images install micropipenv unnecessarily. Issue #1685 was created with comprehensive problem description covering ~238MB total waste across unused installations, detailed scope analysis, multiple solution options, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-08T19:21:11.512Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/utils/pip.conf:3-4
Timestamp: 2025-07-08T19:21:11.512Z
Learning: jiridanek requested GitHub issue creation for PYTHONPATH configuration investigation in runtime images during PR #1333 review. Issue #1339 was created with comprehensive problem description covering pip.conf target directory configuration, missing PYTHONPATH exports, potential runtime import failures, multiple investigation areas (PYTHONPATH auditing, pip configuration consistency, runtime testing), solution options with implementation guidance, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-06-28T14:21:09.429Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-03T07:03:45.020Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-06-28T14:21:09.429Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-01T10:41:56.419Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
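
A rough sketch of that workflow (the image tag, mount flags, and plain `pipenv lock` invocation are assumptions, not the team's exact command):

```sh
# Resolve Pipfile.lock inside a UBI9 container so dependencies are locked
# against linux/amd64 wheels regardless of the host OS.
podman run --rm --platform=linux/amd64 \
    -v "$PWD:/work:z" -w /work \
    registry.access.redhat.com/ubi9/python-312:latest \
    bash -c "pip install pipenv && pipenv lock"
```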
📚 Learning: 2025-07-03T07:03:45.020Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-08-07T12:38:55.700Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1720
File: jupyter/rocm/pytorch/ubi9-python-3.11/requirements.txt:2466-2621
Timestamp: 2025-08-07T12:38:55.700Z
Learning: rpds-py version 0.27.0 provides manylinux wheels for Python 3.11 (cp311) and Python 3.12 (cp312) on PyPI, so it does not require a Rust build during image creation in opendatahub-io/notebooks.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-08-07T12:37:47.314Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1720
File: jupyter/trustyai/ubi9-python-3.12/requirements.txt:546-572
Timestamp: 2025-08-07T12:37:47.314Z
Learning: In the opendatahub-io/notebooks repository, strict alignment of debugpy versions across all images is not enforced as a policy. Version drift is tolerated and dependency versions are primarily managed by Pipenv locking. Alignment is only considered when inconsistencies are surfaced by tooling or cause issues.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-06-28T14:13:27.890Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/datascience/ubi9-python-3.11/Pipfile:34-36
Timestamp: 2025-06-28T14:13:27.890Z
Learning: In the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pattern: core `jupyterlab` package uses exact pinning (==) across all notebook images to ensure UI consistency, while JupyterLab extensions and all server components (jupyter-server, jupyter-server-proxy, jupyter-server-terminals) use compatible release (~=) pinning to allow automatic security updates and bug fixes while maintaining API compatibility.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
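
Expressed as a Pipfile fragment (versions are placeholders):

```toml
jupyterlab = "==4.2.7"              # exact pin: UI consistency across images
jupyter-server = "~=2.15.0"         # compatible release: security/bugfix updates
jupyter-server-proxy = "~=4.4.0"
jupyter-server-terminals = "~=0.5.3"
```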
📚 Learning: 2025-06-28T14:15:41.168Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: The jupyter-bokeh package was previously pinned to version 3.0.5 in the TrustyAI notebook image due to compatibility requirements with TrustyAI components, as indicated by the comment "Should be pinned down to this version in order to be compatible with trustyai" that was removed in this update.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-06-28T14:21:09.429Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: The jupyter-bokeh pinning to 3.0.5 in TrustyAI notebook image was not due to TrustyAI code compatibility issues, but because the trustyai package itself explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency, causing pip dependency resolution conflicts when trying to upgrade to jupyter-bokeh 4.x.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-08-08T08:38:32.988Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1760
File: jupyter/pytorch+llmcompressor/ubi9-python-3.11/Dockerfile.cuda:241-244
Timestamp: 2025-08-08T08:38:32.988Z
Learning: Policy: In opendatahub-io/notebooks, do not remove micropipenv from Python 3.11 images to avoid breaking existing users. Cleanups can deduplicate duplicate installs and remove unused uv, but micropipenv must remain present in 3.11 images.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-01T10:41:56.419Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with `extras = ["and-cuda"]` can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-08-08T14:57:20.636Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1782
File: manifests/overlays/additional/jupyter-trustyai-cpu-py312-ubi9-imagestream.yaml:1-1
Timestamp: 2025-08-08T14:57:20.636Z
Learning: In opendatahub-io/notebooks, when validating imagestream manifest dependency versions, compare manifest major.minor values against exact pinned versions in requirements.txt/Pipfile.lock for the same image; manifests intentionally omit patch numbers (project convention).

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-03T07:05:33.329Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-03T07:05:33.329Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-14T15:36:29.815Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-14T15:36:29.815Z
Learning: ROCm is not supported by upstream TensorFlow, unlike CUDA. ROCm support requires the specialized tensorflow_rocm package, which creates a dependency challenge for Python 3.12 environments since tensorflow_rocm lacks Python 3.12 wheels while regular TensorFlow lacks ROCm support.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-02T18:19:23.024Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
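
A hedged sketch of that approach, following the public openshift-v4 mirror layout (a production build should pin an exact release rather than `stable`):

```sh
# Download the oc client and verify it against the published checksum list.
OC_URL="https://mirror.openshift.com/pub/openshift-v4/$(uname -m)/clients/ocp/stable"
curl -fsSLO "${OC_URL}/openshift-client-linux.tar.gz"
curl -fsSLO "${OC_URL}/sha256sum.txt"
grep ' openshift-client-linux.tar.gz$' sha256sum.txt | sha256sum -c -
tar -xzf openshift-client-linux.tar.gz -C /usr/local/bin oc
```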
📚 Learning: 2025-08-02T15:19:34.383Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1607
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/kustomization.yaml:2-2
Timestamp: 2025-08-02T15:19:34.383Z
Learning: The Red Hat OpenShift AI (RHOAI) supported configurations officially require OpenShift 4.14+ for all currently supported RHOAI versions (2.22, 2.19, 2.16), which corresponds to kubectl 1.27+. This means all supported RHOAI deployments will show Kustomize API deprecation warnings when using v1alpha1 API version, making the update to v1 essential rather than optional. The official support matrix is documented at https://access.redhat.com/articles/rhoai-supported-configs.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-06-16T11:32:09.203Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-06-26T15:28:35.416Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1197
File: runtimes/minimal/ubi9-python-3.11/requirements.txt:395-405
Timestamp: 2025-06-26T15:28:35.416Z
Learning: psutil version 7.x is compatible with UBI9, CentOS Stream 9, and RHEL 9 platforms in the opendatahub-io/notebooks repository. The upgrade from psutil 5.x to 7.x has been validated for these environments.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-22T14:31:20.315Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1463
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:15-15
Timestamp: 2025-07-22T14:31:20.315Z
Learning: PyTorch 2.6.0 wheels with CUDA 12.4 support (torch-2.6.0+cu124) are available on the official PyTorch CUDA index at https://download.pytorch.org/whl/cu124/ for Python versions 3.9, 3.10, 3.11, 3.12, and 3.13 on both Linux x86_64 and Windows AMD64 platforms.

Applied to files:

  • runtimes/datascience/ubi9-python-3.11/requirements.txt
  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
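
For example, the cu124 build can be pulled straight from that index (a one-liner for reference; adjust the version to your image):

```sh
pip install torch==2.6.0 --index-url https://download.pytorch.org/whl/cu124
```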
📚 Learning: 2025-08-01T17:35:29.175Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-08-01T17:35:29.175Z
Learning: jiridanek requested GitHub issue creation for adding pytorch+llmcompressor images to Makefile build targets during PR #1519 review. Issue #1598 was successfully created with comprehensive problem description covering missing build targets for both jupyter workbench and runtime images, detailed solution with specific Makefile code examples following established patterns, thorough acceptance criteria covering individual targets, BASE_DIRS variable inclusion, and all-images target integration, implementation notes about handling '+' characters in paths, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-03T16:18:59.234Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".

Applied to files:

  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt
📚 Learning: 2025-07-08T19:09:48.746Z
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review. Issue was created with comprehensive problem description covering both Python 3.11 and 3.12 versions, repository pattern analysis showing correct vs incorrect naming, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Applied to files:

  • jupyter/trustyai/ubi9-python-3.11/requirements.txt
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (7)
  • GitHub Check: Red Hat Konflux / odh-pipeline-runtime-datascience-cpu-py312-ubi9-on-pull-request
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.11, 3.11, linux/s390x, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (jupyter-trustyai-ubi9-python-3.11, 3.11, linux/amd64, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/s390x, false) / build
🔇 Additional comments (4)
jupyter/trustyai/ubi9-python-3.11/requirements.txt (2)

920-922: Patch bump to huggingface-hub (0.34.4) looks good

Pure-Python, compatible with py311; hashes and marker are fine.


2794-2809: No s390x TrustyAI builds – safetensors bump is safe
We scanned our CI/manifests and found no jupyter-trustyai or TrustyAI image targeting the s390x platform. Since TrustyAI isn’t built for s390x today, the safetensors 0.6.2 update poses no immediate risk and requires no per-arch gating or Rust/Cargo provisioning.

jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt (2)

899-901: Patch bump to huggingface-hub (0.34.4) looks good

Patch-level upgrade with matching hashes; no platform-specific implications here.


4121-4123: No legacy OpenAI API usage detected

A search for openai.ChatCompletion, openai.Completion, openai.Image, from openai import OpenAI, and client = OpenAI( returned no matches in the repository. It appears the codebase is already using the v1 client API exclusively—no refactoring needed.

Comment on lines +2874 to +2889
safetensors==0.6.2; python_version >= '3.9' \
--hash=sha256:1d2d2b3ce1e2509c68932ca03ab8f20570920cd9754b05063d4368ee52833ecd \
--hash=sha256:43ff2aa0e6fa2dc3ea5524ac7ad93a9839256b8703761e76e2d0b2a3fa4f15d9 \
--hash=sha256:8045db2c872db8f4cbe3faa0495932d89c38c899c603f21e9b6486951a5ecb8f \
--hash=sha256:81e67e8bab9878bb568cffbc5f5e655adb38d2418351dc0859ccac158f753e19 \
--hash=sha256:89a89b505f335640f9120fac65ddeb83e40f1fd081cb8ed88b505bdccec8d0a1 \
--hash=sha256:93de35a18f46b0f5a6a1f9e26d91b442094f2df02e9fd7acf224cfec4238821a \
--hash=sha256:9c85ede8ec58f120bad982ec47746981e210492a6db876882aa021446af8ffba \
--hash=sha256:b0e4d029ab0a0e0e4fdf142b194514695b1d7d3735503ba700cf36d0fc7136ce \
--hash=sha256:c7b214870df923cbc1593c3faee16bec59ea462758699bd3fee399d00aac072c \
--hash=sha256:cab75ca7c064d3911411461151cb69380c9225798a20e712b102edda2542ddb1 \
--hash=sha256:d6675cf4b39c98dbd7d940598028f3742e0375a6b4d4277e76beb0c35f4b843b \
--hash=sha256:d83c20c12c2d2f465997c51b7ecb00e407e5f94d7dec3ea0cc11d86f60d3fde5 \
--hash=sha256:d944cea65fad0ead848b6ec2c37cc0b197194bec228f8020054742190e9312ac \
--hash=sha256:fa48268185c52bfe8771e46325a1e21d317207bcabcb72e65c6e28e9ffeb29c7 \
--hash=sha256:fc4d0d0b937e04bdf2ae6f70cd3ad51328635fe0e6214aa1fc811f3b576b3bda

💡 Verification agent

🧩 Analysis chain

safetensors 0.6.2: confirm s390x wheel availability or gate for s390x

If this image is ever built for s390x, safetensors may require Rust if wheels aren’t available. Consider gating on s390x to avoid unexpected source builds, or plan a Rust toolchain stage.

Would you verify s390x wheel availability for safetensors 0.6.2?
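
One quick way to check is the public PyPI JSON API (a sketch; assumes curl and jq are available):

```sh
# List published wheel filenames for safetensors 0.6.2 and look for s390x.
curl -s https://pypi.org/pypi/safetensors/0.6.2/json \
  | jq -r '.urls[].filename' \
  | grep s390x || echo "no s390x wheel published"
```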

If gating is desired, adjust the marker:

-safetensors==0.6.2; python_version >= '3.9' \
+safetensors==0.6.2; python_version >= '3.9' and platform_machine != 's390x' \

🌐 Web query:

Do PyPI wheels for safetensors==0.6.2 include linux s390x for Python 3.11?

💡 Result:

As of August 8, 2025, the PyPI wheels for safetensors version 0.6.2 do not include a pre-built wheel for the linux s390x architecture with Python 3.11. The available wheels for this version are:

  • cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  • cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
  • cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl
  • cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl
  • cp311-cp311-manylinux_2_17_i686.manylinux1_i686.whl
  • cp311-cp311-macosx_11_0_arm64.whl
  • cp311-cp311-macosx_10_12_x86_64.whl

These wheels are compatible with various architectures, including x86_64, aarch64, armv7l, ppc64le, i686, and macOS platforms.

For the linux s390x architecture, the most recent available wheel is for safetensors version 0.5.3, which includes:

  • cp38-abi3-manylinux_2_17_s390x.manylinux2014_s390x.whl

As an abi3 wheel built against CPython 3.8, it is usable on Python 3.8 and later. (pypi.org)

Given the absence of a pre-built wheel for safetensors 0.6.2 on linux s390x with Python 3.11, you may need to build the package from source. Ensure that you have the necessary build tools and dependencies installed. Detailed instructions for building safetensors from source can be found in the project's documentation.

Citations:


safetensors 0.6.2 lacks a linux/s390x wheel: gate the install or add a Rust toolchain

safetensors 0.6.2 publishes wheels for x86_64, aarch64, armv7l, ppc64le, i686, and macOS, but no Python 3.11 manylinux wheel for s390x. On s390x, pip will fall back to a source build, which requires Rust. To prevent unexpected Rust builds, either gate out s390x installs or ensure a Rust/Cargo stage is present.

Locations to address:

  • jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt (lines 2874–2889)

Suggested gating diff:

-safetensors==0.6.2; python_version >= '3.9' \
+safetensors==0.6.2; python_version >= '3.9' and platform_machine != 's390x' \

Alternatively, update your s390x image builds to install the Rust toolchain before pip.
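
If the toolchain route is preferred, a minimal sketch of the TARGETARCH-gated pattern this PR already uses elsewhere (assuming UBI9's repositories provide rust and cargo):

```dockerfile
ARG TARGETARCH
# Install Rust/Cargo only on s390x, where safetensors would build from source.
RUN if [ "$TARGETARCH" = "s390x" ]; then \
        dnf install -y rust cargo && \
        dnf clean all; \
    fi
```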

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change (shown as a diff; the fifteen --hash lines are identical in both versions and are omitted here):

```diff
-safetensors==0.6.2; python_version >= '3.9' \
+safetensors==0.6.2; python_version >= '3.9' and platform_machine != 's390x' \
```
🤖 Prompt for AI Agents
In jupyter/pytorch+llmcompressor/ubi9-python-3.11/requirements.txt around lines 2874 to 2889, safetensors 0.6.2 does not provide a prebuilt manylinux wheel for the s390x architecture, so pip attempts a source build that requires Rust. To fix this, add a platform-specific marker that skips safetensors on s390x, or ensure the Rust toolchain is installed in the s390x environment before pip runs. This gating prevents unexpected build failures on s390x systems.


openshift-ci bot commented Aug 8, 2025

@Nash-123: The following tests failed, say /retest to rerun all failed tests or /retest-required to rerun all mandatory failed tests:

| Test name | Commit | Details | Required | Rerun command |
| --- | --- | --- | --- | --- |
| ci/prow/rocm-runtimes-ubi9-e2e-tests | 1cdee32 | link | true | /test rocm-runtimes-ubi9-e2e-tests |
| ci/prow/images | 5108080 | link | true | /test images |
| ci/prow/runtimes-ubi9-e2e-tests | 5108080 | link | true | /test runtimes-ubi9-e2e-tests |
| ci/prow/notebooks-py312-ubi9-e2e-tests | 5108080 | link | true | /test notebooks-py312-ubi9-e2e-tests |
Full PR test history. Your PR dashboard.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository. I understand the commands that are listed here.
