
Conversation

@jiridanek (Member) commented Dec 8, 2025

https://issues.redhat.com/browse/RHAIENG-2189

Description

This PR configures the PIP_INDEX_URL, UV_INDEX_URL, and UV_DEFAULT_INDEX environment variables across notebook images to use PyPI (https://pypi.org/simple) as the primary package index, overriding the AIPCC Red Hat Python index. This change is AIPCC migration phase 1.5: it ensures that both the pip and uv package managers resolve dependencies directly against the public PyPI repository as their primary source.

The configuration is applied consistently across all Python 3.12 UBI9-based images:

  • Jupyter minimal notebooks
  • Code Server notebooks
  • Runtime images (datascience and minimal)

The environment variables are set using managed Dockerfile fragments with ### BEGIN/END markers for maintainability and consistency.

Side note: This PR includes preparatory infrastructure work in the first three commits:

  1. Introduction of a shared Python code directory (ntb/) for common utilities
  2. Implementation of doctest testing framework with initial test coverage
  3. Addition of subscription manager Dockerfile fragment for RHEL subscription handling

These commits establish the scripts/dockerfile_fragments.py infrastructure (inspired by AIPCC's replace-markers.sh) that manages reusable Dockerfile fragments across the repository, making the PIP_INDEX_URL configuration changes more maintainable and consistent.
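The marker-based fragment mechanism can be sketched roughly as follows. This is a minimal illustration only; the actual blockinfile() helper and scripts/dockerfile_fragments.py in this PR may differ in signature and behavior, and the marker text here is simplified:

```python
def blockinfile(text: str, begin: str, end: str, block: str) -> str:
    """Replace the region between the `begin` and `end` marker lines
    with `block`, or append a new marked region if no markers exist."""
    lines = text.splitlines()
    new_block = [begin, *block.splitlines(), end]
    if begin in lines and end in lines:
        i, j = lines.index(begin), lines.index(end)
        lines[i : j + 1] = new_block
    else:
        lines.extend(new_block)
    return "\n".join(lines) + "\n"


dockerfile = "FROM ubi9\nUSER 1001\n"
fragment = (
    "ENV PIP_INDEX_URL=https://pypi.org/simple\n"
    "ENV UV_DEFAULT_INDEX=https://pypi.org/simple"
)
once = blockinfile(dockerfile, "### BEGIN RHAIENG-2189", "### END RHAIENG-2189", fragment)
twice = blockinfile(once, "### BEGIN RHAIENG-2189", "### END RHAIENG-2189", fragment)
# applying the same fragment again replaces the existing block, so the
# result is unchanged (idempotent)
assert once == twice
```

Idempotence is the point of the markers: a code-generation script that is re-run repeatedly must update the managed block in place rather than appending a duplicate each time.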

How Has This Been Tested?

  • Verify that pip can install packages from PyPI using the configured primary index
  • Verify that uv can install packages from PyPI using the configured primary index
  • Confirm that the environment variables (PIP_INDEX_URL, UV_INDEX_URL, UV_DEFAULT_INDEX) are properly set in the built images
  • Run make test to ensure all tests pass

Looks reasonable, except some PRs need to land first to fix CI around this.

Self checklist (all need to be checked):

  • Ensure that you have run make test (gmake on macOS) before asking for review
  • Changes to everything except Dockerfile.konflux files should be done in odh/notebooks and automatically synced to rhds/notebooks. For Konflux-specific changes, modify Dockerfile.konflux files directly in rhds/notebooks as these require special attention in the downstream repository and flow to the upcoming RHOAI release.

Merge criteria:

  • The commits are squashed in a cohesive manner and have meaningful messages.
  • Testing instructions have been added in the PR body (for PRs involving changes that are not immediately obvious).
  • The developer has manually tested the changes and verified that they work.

Summary by CodeRabbit

  • New Features

    • Standardized Python package index environment variables added across images.
    • Exposed small utilities for template processing and inserting/updating file blocks.
    • Added an extra compatibility OS package to some images.
  • Tests

    • Added tests that verify package index environment variables are present in container images.
  • Chores

    • Updated CI/tooling to a newer UV setup and centralized Dockerfile fragment workflow.
    • Added migration-phase markers and deprecation notes.


openshift-ci bot (Contributor) commented Dec 8, 2025

Skipping CI for Draft Pull Request.
If you want CI signal for your change, please convert it to an actual PR.
You can still manually trigger a test run with /test all

@github-actions github-actions bot added the review-requested GitHub Bot creates notification on #pr-review-ai-ide-team slack channel label Dec 8, 2025
@openshift-ci openshift-ci bot added the size/xl label Dec 8, 2025
coderabbitai bot (Contributor) commented Dec 8, 2025

Walkthrough

Adds RHAIENG-2189 BEGIN/END ENV blocks to many Dockerfiles introducing PIP_INDEX_URL, UV_INDEX_URL (with a deprecation comment), and UV_DEFAULT_INDEX; adds an ntb package (constants plus string utilities); refactors scripts/dockerfile_fragments.py into a data-driven workflow with sanity_check; updates CI workflows to setup-uv@v7 and uv invocations; and adjusts tests to assert the new env vars.

Changes

Cohort / File(s) Summary
Dockerfile ENV additions (codeserver)
codeserver/ubi9-python-3.12/Dockerfile.cpu
Added BEGIN/END RHAIENG-2189 migration markers and ENV declarations: PIP_INDEX_URL, UV_INDEX_URL (deprecation note), UV_DEFAULT_INDEX.
Dockerfile ENV additions (jupyter)
jupyter/.../ubi9-python-3.12/Dockerfile.*
jupyter/datascience/ubi9-python-3.12/Dockerfile.cpu, jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu, jupyter/minimal/ubi9-python-3.12/Dockerfile.cuda, jupyter/minimal/ubi9-python-3.12/Dockerfile.rocm, jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda, jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda, jupyter/rocm/pytorch/ubi9-python-3.12/Dockerfile.rocm, jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm, jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda, jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu
Inserted BEGIN/END RHAIENG-2189 blocks in various stages, adding PIP_INDEX_URL, UV_INDEX_URL (with deprecation comment), and UV_DEFAULT_INDEX. Some files also added libxcrypt-compat to OS installs.
Dockerfile ENV additions (rstudio)
rstudio/.../ubi9-python-3.12/Dockerfile.*
rstudio/c9s-python-3.12/Dockerfile.cpu, rstudio/c9s-python-3.12/Dockerfile.cuda, rstudio/rhel9-python-3.12/Dockerfile.cpu, rstudio/rhel9-python-3.12/Dockerfile.cuda
Added BEGIN/END migration blocks defining PIP_INDEX_URL, UV_INDEX_URL (deprecation noted), and UV_DEFAULT_INDEX prior to relevant install steps.
Dockerfile ENV additions (runtimes)
runtimes/.../ubi9-python-3.12/Dockerfile.*
runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu, runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu, runtimes/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda, runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda, runtimes/rocm-pytorch/ubi9-python-3.12/Dockerfile.rocm, runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm, runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda
Inserted BEGIN/END RHAIENG-2189 blocks in runtime stages adding PIP_INDEX_URL, UV_INDEX_URL (deprecated comment), and UV_DEFAULT_INDEX.
New package: ntb
ntb/__init__.py, ntb/constants.py, ntb/strings.py
Added ntb package. constants.py defines ROOT_DIR. strings.py adds process_template_with_indents() and blockinfile() utilities with tests/examples. __init__ exposes selected symbols via __all__.
Script refactor: dockerfile_fragments.py
scripts/dockerfile_fragments.py
Refactored to a data-driven workflow: added docker_directories, centralized replacements mapping, introduced sanity_check() and a main loop applying block insertions; removed an internal blockinfile() implementation and updated tests to TestMain scaffolding.
Container tests & lifecycle
tests/containers/base_image_test.py
Added _test_container() context manager, updated _run_test() signature, and added test_python_package_index() which starts containers and asserts environment contains PIP_INDEX_URL, UV_INDEX_URL, and UV_DEFAULT_INDEX.
Build / tooling / CI
pyproject.toml, .github/workflows/*, ci/generate_code.sh
Enabled [tool.uv] package flag and added [tool.uv.build-backend]; extended ruff include to ntb/**/*.py; updated GitHub Actions to use astral-sh/setup-uv@v7 and removed pyproject-file inputs; replaced python scripts/dockerfile_fragments.py with uv run scripts/dockerfile_fragments.py.
Test runner config
pytest.ini
Converted addopts to multi-line, added --ignore=ci, --ignore=codeserver, --ignore=jupyter, --ignore=rstudio, added --doctest-modules, and set norecursedirs = .git.
Manifests update
manifests/base/commit-latest.env
Replaced multiple image commit-hash entries with updated hash values across base/workbench/runtime entries.
Workflows changes
.github/workflows/*.yaml
Replaced setup steps to use astral-sh/setup-uv@v7, removed pyproject-file inputs, and adjusted UV setup inputs/caching where applicable.
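The ntb/strings.py helpers summarized above are not shown in this view. An indent-preserving template substitution in the spirit of process_template_with_indents() might look roughly like this; the name and the assumed semantics (a placeholder alone on a line, with its indentation re-applied to every line of a multi-line value) are taken from the summary, not from the actual implementation:

```python
import re


def process_template_with_indents(template: str, values: dict[str, str]) -> str:
    """Substitute {name} placeholders standing alone on a line,
    re-applying that line's leading indentation to every line of a
    multi-line replacement value."""
    out: list[str] = []
    for line in template.splitlines():
        m = re.fullmatch(r"(\s*)\{(\w+)\}", line)
        if m and m.group(2) in values:
            indent = m.group(1)
            out.extend(indent + part for part in values[m.group(2)].splitlines())
        else:
            out.append(line)
    return "\n".join(out) + "\n"


rendered = process_template_with_indents(
    "steps:\n    {body}\n",
    {"body": "echo one\necho two"},
)
assert rendered == "steps:\n    echo one\n    echo two\n"
```

Preserving indentation matters when fragments are spliced into whitespace-sensitive files such as YAML workflows or nested Dockerfile heredocs.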

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

  • Pay extra attention to:
    • ntb/strings.py — marker handling, prefix logic, newline preservation, and included tests/doctest behavior.
    • scripts/dockerfile_fragments.py — correctness of replacements mapping, sanity_check() coverage, idempotence of block insertion, and directory scanning/exclusion rules.
    • tests/containers/base_image_test.py — container lifecycle/context manager and robustness of environment parsing assertions.
    • Dockerfiles that added libxcrypt-compat — verify package list correctness and installation ordering.
    • Any Dockerfile lines where BEGIN/END markers were added without comment prefixes (may affect Dockerfile validity).
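For the environment-parsing robustness called out above, a defensive parser for `env`-style container output could look like the sketch below. The helper names are hypothetical; the actual assertions live in tests/containers/base_image_test.py and may differ:

```python
REQUIRED_INDEX_VARS = ("PIP_INDEX_URL", "UV_INDEX_URL", "UV_DEFAULT_INDEX")


def parse_env_output(output: str) -> dict[str, str]:
    """Parse NAME=value lines, skipping malformed lines (banners,
    wrapped multi-line values) instead of raising."""
    env: dict[str, str] = {}
    for line in output.splitlines():
        name, sep, value = line.partition("=")
        if sep and name and " " not in name:
            env[name] = value
    return env


# sample output as `env` inside a container might print it
sample = (
    "PATH=/usr/bin\n"
    "PIP_INDEX_URL=https://pypi.org/simple\n"
    "not a var line\n"
    "UV_INDEX_URL=https://pypi.org/simple\n"
    "UV_DEFAULT_INDEX=https://pypi.org/simple\n"
)
env = parse_env_output(sample)
missing = [v for v in REQUIRED_INDEX_VARS if v not in env]
assert not missing, f"missing index vars: {missing}"
assert env["PIP_INDEX_URL"] == "https://pypi.org/simple"
```

Skipping unparseable lines rather than failing keeps the test focused on the three index variables instead of on incidental container output.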

Pre-merge checks and finishing touches

❌ Failed checks (2 warnings)

  • Docstring Coverage (⚠️ Warning): docstring coverage is 50.00%, below the required threshold of 80.00%. Resolution: run @coderabbitai generate docstrings to improve docstring coverage.
  • Description check (⚠️ Warning): the pull request description is incomplete and does not adequately follow the required template structure. Resolution: complete all required sections from the template (add a meaningful commit messages summary, provide comprehensive testing instructions covering all verification points, and clearly mark all merge criteria checkboxes as checked or unchecked).

✅ Passed checks (1 passed)

  • Title check (✅ Passed): the title 'RHAIENG-2189: configure PIP_INDEX_URL' directly references the ticket and the primary configuration change, accurately summarizing the main objective of the PR.


(An earlier comment from @jiridanek and two from @coderabbitai were marked as outdated.)
@openshift-ci openshift-ci bot added size/xl and removed size/xl labels Dec 8, 2025
@jiridanek jiridanek added the tide/merge-method-rebase Denotes a PR that should be rebased by tide when it merges. label Dec 8, 2025
@jiridanek jiridanek marked this pull request as ready for review December 10, 2025 09:06
@coderabbitai coderabbitai bot (Contributor) left a comment

Actionable comments posted: 1

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 9553586 and 0e99dea.

📒 Files selected for processing (3)
  • .github/workflows/build-notebooks-TEMPLATE.yaml (2 hunks)
  • jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda (2 hunks)
  • jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (47)
  • GitHub Check: Red Hat Konflux / odh-pipeline-runtime-datascience-cpu-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-trustyai-cpu-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-rstudio-minimal-cpu-py312-rhel9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-rocm-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cpu-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-cuda-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-codeserver-datascience-cpu-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-rstudio-minimal-cuda-py312-c9s-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-pipeline-runtime-tensorflow-rocm-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-cuda-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-datascience-cpu-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-cuda-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-pipeline-runtime-minimal-cpu-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-rstudio-minimal-cpu-py312-c9s-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-pipeline-runtime-tensorflow-cuda-py312-ubi9-on-pull-request
  • GitHub Check: Red Hat Konflux / odh-workbench-rstudio-minimal-cuda-py312-rhel9-on-pull-request
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-rstudio-c9s-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (rocm-runtime-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
  • GitHub Check: build (rstudio-c9s-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-runtime-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (cuda-jupyter-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
🔇 Additional comments (3)
.github/workflows/build-notebooks-TEMPLATE.yaml (2)

284-287: Clarify the intent of setting and then unsetting CFLAGS/CXXFLAGS.

Line 285 sets --env=CFLAGS=-Dundefined=64 --env=CXXFLAGS=-Dundefined=64, but both variables are then immediately unset with --unsetenv=CFLAGS --unsetenv=CXXFLAGS. This appears contradictory: the variables are set only to be removed.

Can you clarify whether this is intentional (e.g., setting them for a specific build phase and then clearing them)? If so, consider adding a comment explaining the intended behavior. If unintentional, the unsetenv flags should likely be removed.


329-334: Verify astral-sh/setup-uv@v7 self-hosted runner compatibility.

The v7 upgrade includes a runtime change from Node 20 to Node 24. If this workflow runs on self-hosted runners, ensure they support Node 24. The workflow's cache configuration using enable-cache: true and cache-dependency-glob: "uv.lock" is compatible with v7 and requires no changes.

Other v7 breaking changes (removal of deprecated server-url input) do not affect this configuration.

jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda (1)

62-63: Consider alignment: --extra-index-url (secondary) vs. PIP_INDEX_URL (primary override).

The pip install here uses --extra-index-url https://pypi.org/simple to add PyPI as a fallback index. In contrast, lines 144–150 set PIP_INDEX_URL to override the primary index. These are two different strategies. Confirm this is intentional and that the early pip install doesn't need the primary-index override to align with the PR's global migration objective.

Comment on lines +144 to +150
### BEGIN RHAIENG-2189: this is AIPCC migration phase 1.5
ENV PIP_INDEX_URL=https://pypi.org/simple
# UV_INDEX_URL is deprecated in favor of UV_DEFAULT_INDEX
ENV UV_INDEX_URL=https://pypi.org/simple
# https://docs.astral.sh/uv/reference/environment/#uv_default_index
ENV UV_DEFAULT_INDEX=https://pypi.org/simple
### END RHAIENG-2189: this is AIPCC migration phase 1.5

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash

# Get file size first
wc -l jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda

echo "=== Build stages (FROM ... AS statements) ==="
rg -n "^FROM.*AS" jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda

echo -e "\n=== All pip/uv invocations with context ==="
rg -n -B 2 "(pip install|uv pip)" jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda | head -100

echo -e "\n=== ENV var definitions (PIP_INDEX_URL, UV_INDEX_URL, UV_DEFAULT_INDEX) ==="
rg -n "ENV (PIP_INDEX_URL|UV_INDEX_URL|UV_DEFAULT_INDEX)" jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda

echo -e "\n=== Context around lines 62-63 (first pip install) ==="
sed -n '55,75p' jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda | cat -n

echo -e "\n=== Context around lines 144-150 (env vars) ==="
sed -n '135,155p' jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda | cat -n

Repository: opendatahub-io/notebooks

Length of output: 154


🏁 Script executed:

#!/bin/bash

# First, verify the file exists and check size
echo "=== Checking file ==="
if [ -f "jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda" ]; then
  wc -l "jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda"
  ls -lh "jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda"
else
  echo "File not found. Searching for similar Dockerfiles..."
  find . -name "*pytorch*" -type f 2>/dev/null | head -20
fi

echo -e "\n=== Looking for FROM statements (stages) ==="
grep -n "^FROM" "jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda" 2>/dev/null || echo "File not accessible"

echo -e "\n=== Looking for ENV statements with PIP_INDEX_URL or UV ==="
grep -n "ENV.*PIP_INDEX_URL\|ENV.*UV_INDEX_URL\|ENV.*UV_DEFAULT_INDEX" "jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda" 2>/dev/null || echo "No matches found"

echo -e "\n=== Lines 55-75 (around line 62-63) ==="
sed -n '55,75p' "jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda" 2>/dev/null | cat -n

echo -e "\n=== Lines 135-155 (around line 144-150) ==="
sed -n '135,155p' "jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda" 2>/dev/null | cat -n

Repository: opendatahub-io/notebooks

Length of output: 2979


🏁 Script executed:

#!/bin/bash

echo "=== All pip and uv invocations by line number ==="
grep -n "pip\|uv" jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda

echo -e "\n=== Check which stage each pip/uv invocation is in ==="
# Show stages and track which one we're in
awk '
/^FROM.*AS/ {
  current_stage = $NF
  stage_line = NR
  print "Line " NR ": Stage " current_stage " begins"
}
/pip|uv/ && !/^#/ {
  print "Line " NR ": " $0 " [in stage: " current_stage "]"
}
' jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda

Repository: opendatahub-io/notebooks

Length of output: 1880


🏁 Script executed:

#!/bin/bash

echo "=== Context around line 63 (pip install in cuda-base) ==="
sed -n '60,70p' jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda | cat -n

echo -e "\n=== Context around line 170 (uv pip install in cuda-jupyter-pytorch) ==="
sed -n '165,180p' jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda | cat -n

echo -e "\n=== Check ENV var order in cuda-jupyter-pytorch ==="
sed -n '137,175p' jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda | cat -n

Repository: opendatahub-io/notebooks

Length of output: 4696


ENV var placement is correct for the main package installation, but confirm whether line 63's explicit flag should be aligned.

The ENV variables are appropriately positioned before the actual package installation at line 170 (uv pip install), which will inherit UV_DEFAULT_INDEX. However, there is an inconsistency in approach: line 63 uses an explicit --extra-index-url flag to install the uv tool itself in the cuda-base stage, while lines 145–149 define ENV variables for the cuda-jupyter-pytorch stage.

Since line 63 installs only the tools (micropipenv and uv), not the application packages, the --extra-index-url fallback may be intentional. The actual package installation at line 170 will correctly use the UV_DEFAULT_INDEX environment variable. Confirm whether this two-method approach is acceptable or if consistency is preferred (either use ENV vars everywhere or explicit flags everywhere).

🤖 Prompt for AI Agents
In jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda around lines
63 and 144-150, the review notes an inconsistency: line 63 installs uv with an
explicit --extra-index-url flag while lines 144-150 set
UV_DEFAULT_INDEX/UV_INDEX_URL ENV vars for later installs; decide and apply one
consistent approach across stages — either replace the explicit flag at line 63
with usage of the same ENV (set earlier or exported into the cuda-base stage) so
all installs inherit UV_DEFAULT_INDEX, or keep the explicit flag and add a
clarifying comment explaining why tooling installation must use the flag; update
the Dockerfile accordingly (move/export ENV into the base stage if using ENV, or
add comment and keep flag if intentional).

openshift-ci bot (Contributor) commented Dec 11, 2025

@jiridanek: The following tests failed, say /retest to rerun all failed tests or /retest-required to rerun all mandatory failed tests:

| Test name | Commit | Details | Required | Rerun command |
| --- | --- | --- | --- | --- |
| ci/prow/notebook-rocm-jupyter-pt-ubi9-python-3-12-pr-image-mirror | 950ca5c | link | true | /test notebook-rocm-jupyter-pt-ubi9-python-3-12-pr-image-mirror |
| ci/prow/notebook-cuda-jupyter-tf-ubi9-python-3-12-pr-image-mirror | 950ca5c | link | true | /test notebook-cuda-jupyter-tf-ubi9-python-3-12-pr-image-mirror |
| ci/prow/runtime-cuda-tf-ubi9-python-3-12-pr-image-mirror | 950ca5c | link | true | /test runtime-cuda-tf-ubi9-python-3-12-pr-image-mirror |
| ci/prow/runtime-ubi9-python-3-12-pr-image-mirror | 950ca5c | link | true | /test runtime-ubi9-python-3-12-pr-image-mirror |
| ci/prow/notebook-jupyter-ds-ubi9-python-3-12-pr-image-mirror | 950ca5c | link | true | /test notebook-jupyter-ds-ubi9-python-3-12-pr-image-mirror |
| ci/prow/rocm-runtime-pt-ubi9-python-3-12-pr-image-mirror | 950ca5c | link | true | /test rocm-runtime-pt-ubi9-python-3-12-pr-image-mirror |
| ci/prow/notebook-jupyter-tai-ubi9-python-3-12-pr-image-mirror | 950ca5c | link | true | /test notebook-jupyter-tai-ubi9-python-3-12-pr-image-mirror |
| ci/prow/runtime-cuda-pt-ubi9-python-3-12-pr-image-mirror | 950ca5c | link | true | /test runtime-cuda-pt-ubi9-python-3-12-pr-image-mirror |
| ci/prow/notebook-cuda-jupyter-pt-ubi9-python-3-12-pr-image-mirror | 950ca5c | link | true | /test notebook-cuda-jupyter-pt-ubi9-python-3-12-pr-image-mirror |
| ci/prow/notebook-jupyter-ubi9-python-3-12-pr-image-mirror | 950ca5c | link | true | /test notebook-jupyter-ubi9-python-3-12-pr-image-mirror |
| ci/prow/codeserver-ubi9-python-3-12-pr-image-mirror | 950ca5c | link | true | /test codeserver-ubi9-python-3-12-pr-image-mirror |
| ci/prow/notebook-rocm-jupyter-ubi9-python-3-12-pr-image-mirror | 950ca5c | link | true | /test notebook-rocm-jupyter-ubi9-python-3-12-pr-image-mirror |
| ci/prow/notebooks-py312-ubi9-e2e-tests | 950ca5c | link | true | /test notebooks-py312-ubi9-e2e-tests |
| ci/prow/rocm-notebooks-py312-e2e-tests | 950ca5c | link | true | /test rocm-notebooks-py312-e2e-tests |
| ci/prow/notebook-cuda-jupyter-ubi9-python-3-12-pr-image-mirror | 950ca5c | link | true | /test notebook-cuda-jupyter-ubi9-python-3-12-pr-image-mirror |
| ci/prow/images | 950ca5c | link | true | /test images |
| ci/prow/runtime-ds-ubi9-python-3-12-pr-image-mirror | 950ca5c | link | true | /test runtime-ds-ubi9-python-3-12-pr-image-mirror |

Full PR test history. Your PR dashboard.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository. I understand the commands that are listed here.

@atheo89 (Member) left a comment

/lgtm

openshift-ci bot (Contributor) commented Dec 11, 2025

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: atheo89

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

jiridanek and others added 10 commits December 11, 2025 13:34
There is a Jira from the past about this

* https://issues.redhat.com/browse/RHOAIENG-28824
* https://issues.redhat.com/browse/RHAIENG-1284

```
SUBFAILED[dlib='/opt/app-root/lib64/python3.12/site-packages/mysql/vendor/private/sasl2/libplain.so.3.0.0'] tests/containers/base_image_test.py::TestBaseImage::test_elf_files_can_link_runtime_libs[ghcr.io/opendatahub-io/notebooks/workbench-images:rocm-jupyter-tensorflow-ubi9-python-3.12-_8ac1b04d1f22ec3c5bc4a50a95176a522bee132d] - Failed: dlib='/opt/app-root/lib64/python3.12/site-packages/mysql/vendor/private/sasl2/libplain.so.3.0.0' has unsatisfied dependencies deps='libcrypt.so.1 => not found'
```
…ilation failure

```
/dev/mapper/buildvg-buildlv   68G   14G   52G  21% /home/runner/.local/share/containers
  × Failed to build `pyzmq==27.1.0`
  ├─▶ The build backend returned an error
  ╰─▶ Call to `scikit_build_core.build.build_wheel` failed (exit status: 1)
...
      /tmp/tmpy8t962b7/build/_deps/bundled_libzmq-build/platform.hpp:22:28:
      error: ‘undefined’ was not declared in this scope
         22 | #define ZMQ_CACHELINE_SIZE undefined
            |                            ^~~~~~~~~
```
openshift-ci bot (Contributor) commented Dec 11, 2025

New changes are detected. LGTM label has been removed.


Labels

approved review-requested GitHub Bot creates notification on #pr-review-ai-ide-team slack channel size/xl tide/merge-method-rebase Denotes a PR that should be rebased by tide when it merges.
