71 changes: 71 additions & 0 deletions .github/workflows/uv-renewal.yaml
@@ -0,0 +1,71 @@
---
# This GitHub action is meant to update the pipfile.locks
name: uv.locks Renewal Action

on: # yamllint disable-line rule:truthy
  # Triggers the workflow every Wednesday at 1am UTC
  schedule:
    - cron: "0 1 * * 3"
  workflow_dispatch: # for manual trigger workflow from GH Web UI
    inputs:
      branch:
        description: 'Specify branch'
        required: false
        default: 'main'
      python_version:
        description: 'Select Python version to update uv.lock'
        required: false
        default: '3.11'
        type: choice
        options:
          - '3.12'
          - '3.11'
      update_optional_dirs:
        description: 'Include optional directories in update'
        required: false
        default: 'false'
        type: choice
        options:
          - 'true'
          - 'false'

jobs:
  build:
    runs-on: ubuntu-latest
    permissions:
      contents: write
    env:
      BRANCH: ${{ github.event.inputs.branch || 'main' }}
      PYTHON_VERSION: ${{ github.event.inputs.python_version || '3.11' }}
      INCLUDE_OPT_DIRS: ${{ github.event.inputs.update_optional_dirs || 'false' }}
    steps:
      # Checkout the specified branch from the specified organization
      - name: Checkout code from the specified branch
        uses: actions/checkout@v4
        with:
          ref: ${{ env.BRANCH }}
          token: ${{ secrets.GITHUB_TOKEN }}

      # Configure Git
      - name: Configure Git
        run: |
          git config --global user.email "github-actions[bot]@users.noreply.github.com"
          git config --global user.name "GitHub Actions"

      # Setup Python environment with the specified version (or default to '3.11')
      - name: Setup Python environment
        uses: actions/setup-python@v5
        with:
          python-version: ${{ env.PYTHON_VERSION }}

      # Install uv
      - name: Install uv
        run: pip install uv

      # Run makefile recipe to refresh uv.lock and push changes back to the branch
      - name: Run make refresh-pipfilelock-files and push the changes back to the branch
        run: |
          uv lock --python ${{ env.PYTHON_VERSION }}
          git add uv.lock
          git commit -m "Update uv.lock files by uvlock-renewal.yaml action"
          git push origin ${{ env.BRANCH }}
@coderabbitai (coderabbitai bot, Contributor) commented on Aug 26, 2025:

⚠️ Potential issue

Guard commit/push when there are no changes

As written, git commit exits non-zero when uv.lock is unchanged (nothing to commit), which fails the workflow run.

-      - name: Run make refresh-pipfilelock-files and push the changes back to the branch
+      - name: Refresh uv.lock and push changes (if any)
         run: |
           uv lock --python ${{ env.PYTHON_VERSION }}
           git add uv.lock
-          git commit -m "Update uv.lock files by uvlock-renewal.yaml action"
-          git push origin ${{ env.BRANCH }}
+          if git diff --staged --quiet; then
+            echo "uv.lock up to date; nothing to commit."
+          else
+            git commit -m "Update uv.lock by uv-renewal.yaml action (Python ${{ env.PYTHON_VERSION }})"
+            git push origin ${{ env.BRANCH }}
+          fi
🤖 Prompt for AI Agents
.github/workflows/uv-renewal.yaml around lines 66-71: after running the uv lock
step, guard the git add/commit/push so the workflow does not fail when uv.lock
is unchanged; detect whether uv.lock was modified (e.g., check git
status/porcelain or run a git diff check) and only perform git add, git commit
and git push when a change is detected, otherwise skip those steps and exit
successfully.
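
For reference, the porcelain-based check mentioned in the prompt could be written as the following run-step body (a sketch only, not part of the PR; it assumes the job-level BRANCH and PYTHON_VERSION env vars defined in the workflow):

    uv lock --python "$PYTHON_VERSION"
    # git status --porcelain prints nothing when uv.lock is unchanged, so commit/push is skipped cleanly
    if [ -n "$(git status --porcelain uv.lock)" ]; then
      git add uv.lock
      git commit -m "Update uv.lock (Python $PYTHON_VERSION)"
      git push origin "$BRANCH"
    else
      echo "uv.lock up to date; nothing to commit."
    fi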

jiridanek (Member, Author) replied:

we don't want to update uv.lock like this

coderabbitai bot replied:

@jiridanek, understood! I'll step back from that suggestion.

Could you clarify your preferred approach for handling uv.lock updates? Should this workflow be removed entirely, or do you have a different mechanism in mind for managing uv.lock files?


🧠 Learnings used
Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:17-18
Timestamp: 2025-08-01T16:10:21.049Z
Learning: jiridanek requested GitHub issue creation for version pinning improvement of micropipenv and uv packages during PR #968 review. Issue #1595 was created with comprehensive problem description covering build reproducibility risks from unpinned package versions across 18 affected Dockerfiles, two solution options (direct pinning vs build arguments), implementation approach with testing requirements, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:152-158
Timestamp: 2025-08-02T10:26:52.388Z
Learning: jiridanek corrected CodeRabbit's false assessment about uv pip install --strict flag support during PR #968 review. User demonstrated that uv 0.8.0 does support the --strict flag through command line verification, contradicting CodeRabbit's analysis using uv 0.8.4 that incorrectly suggested the flag was unsupported.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-08-05T17:24:08.616Z
Learning: jiridanek requested PR review for #1521 covering s390x architecture support improvements, demonstrating continued focus on systematic multi-architecture compatibility enhancements in the opendatahub-io/notebooks repository through clean implementation with centralized configuration, proper CI integration, and architecture-aware testing patterns.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1998
File: runtimes/pytorch/ubi9-python-3.11/Dockerfile.cuda:23-24
Timestamp: 2025-08-19T11:40:33.157Z
Learning: jiridanek corrected CodeRabbit's false assessment about uv pip install --build-constraints flag during PR #1998 review. User demonstrated that uv 0.8.12 uses the plural form --build-constraints (not --build-constraint as incorrectly suggested by CodeRabbit), verified through direct command line help output, showing the importance of accepting corrections when evidence contradicts initial analysis.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#968
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:152-158
Timestamp: 2025-08-02T10:26:52.388Z
Learning: CodeRabbit made an incorrect assessment about uv pip install --strict flag support during PR #968 review, suggesting its removal when the flag is actually supported in both uv 0.8.0 and 0.8.4. jiridanek correctly demonstrated the flag's availability through command line verification, showing the importance of thorough verification and accepting corrections when evidence contradicts initial analysis.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:16:05.131Z
Learning: jiridanek requested GitHub issue creation for RStudio py311 Tekton push pipelines during PR #1379 review. Issue #1384 was successfully created covering two RStudio variants (CPU and CUDA) found in manifests/base/params-latest.env, with comprehensive problem description, implementation requirements following the same pattern as other workbench pipelines, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

1 change: 1 addition & 0 deletions Makefile
@@ -477,6 +477,7 @@ refresh-pipfilelock-files:
	echo "Regenerating requirements.txt files"
	pushd $(ROOT_DIR)
	bash $(ROOT_DIR)/scripts/sync-requirements-txt.sh
	bash $(ROOT_DIR)/scripts/sync-pylock-toml.sh
	popd
coderabbitai bot commented on Makefile lines 477 to 481:

💡 Verification agent

🧩 Analysis chain

Good integration point; verify toolchain reproducibility

Hooking sync-pylock-toml.sh after requirements regeneration is sensible. Given the script auto-installs uv from PyPI, please confirm your CI runners permit networked tool bootstrapping and that the uv version is pinned to avoid churn in generated pylock.toml.

To verify determinism across lanes:


🏁 Script executed:

#!/bin/bash
# Expect identical pylock.toml across two uv versions on the same inputs (or document differences).
rg -n --glob 'jupyter/**/pylock.toml' || true

Length of output: 443


🏁 Script executed:

#!/bin/bash
# Locate and inspect the sync-pylock-toml.sh script for uv version pinning
rg -nF "sync-pylock-toml.sh" -l
sed -n '1,200p' scripts/sync-pylock-toml.sh

Length of output: 1190


Pin uv version in the bootstrap step and verify CI access

The sync-pylock-toml.sh script bootstraps uv with an unpinned pip install uv, which risks variability in the generated pylock.toml as new uv releases come out. To ensure reproducible builds:

• Confirm that your CI runners permit networked pip install uv during the sync step.
• In scripts/sync-pylock-toml.sh, pin the uv version in the install command. For example:

- uv --version || pip install uv
+ uv --version || pip install uv==<fixed-version>

Replace <fixed-version> with the vetted uv release used in your last successful run.
• (Optional) Document this pinned version in your Makefile or CI configuration, so future updates aren’t applied accidentally without review.

These changes will lock the toolchain and prevent unintended drift in your lockfiles.
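
For example, a pinned bootstrap in scripts/sync-pylock-toml.sh might read as follows (a sketch; 0.8.12 is only a placeholder for whichever uv release was last vetted):

    # Bootstrap a fixed uv release instead of whatever PyPI currently serves
    UV_VERSION="0.8.12"  # placeholder: substitute the vetted version
    uv --version || pip install "uv==${UV_VERSION}"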

Committable suggestion skipped: line range outside the PR's diff.

🤖 Prompt for AI Agents
In Makefile around lines 477 to 481, the bootstrap step calls
scripts/sync-pylock-toml.sh which installs uv without pinning, causing
non-reproducible pylock.toml; update scripts/sync-pylock-toml.sh to replace the
unpinned pip install uv with a pinned version (use the last-vetted release as
<fixed-version>), ensure CI runners are allowed to perform networked pip
installs during this sync step (or add a job credential/mirror if not), and add
a short note in the Makefile or CI config documenting the pinned uv version so
future updates require explicit review.


# This is only for the workflow action
293 changes: 287 additions & 6 deletions pyproject.toml
@@ -4,13 +4,18 @@ name = "notebooks"
version = "2025.1"
description = "Open Data Hub / OpenShift AI Notebook / Workbench images, and tests for the same in Python."
readme = "README.md"
package-mode = false
requires-python = ">=3.12,<3.13"
requires-python = ">=3.11,<3.13"

# https://docs.astral.sh/uv/concepts/projects/dependencies/#managing-dependencies
dependencies = []
# WARNING: Do NOT attempt `uv lock --universal` (the default) resolution on this pyproject.toml.
# It would work (had the `conflicts` section been defined), but it would run 10+ minutes and produce a huge `uv.lock` file (60MiB+).
# Instead, use the `scripts/sync-requirements-txt.sh` script which runs in seconds.

[dependency-groups]

############################
# Python Dependency Groups #
############################

dev = [
"pre-commit",
"pyright",
@@ -32,11 +37,287 @@ dev = [
"openshift-python-wrapper",
]

base = [
"wheel~=0.45.1",
"setuptools~=78.1.1",
]

jupyter-base = [
"jupyterlab==4.4.4",
"jupyter-server~=2.16.0",
"jupyter-server-proxy~=4.4.0",
"jupyter-server-terminals~=0.5.3",
"jupyterlab-git~=0.51.1",
"nbdime~=4.0.2",
"nbgitpuller~=1.2.2",
]

elyra-base = [
"odh-elyra==4.2.3",
"jupyterlab-lsp~=5.1.1",
"jupyterlab-widgets~=3.0.15",
"jupyter-resource-usage~=1.1.1",
]

elyra-preferred = [
"jupyter-bokeh~=4.0.5",
]

elyra-trustyai = [
"jupyter-bokeh~=3.0.5", # trustyai 0.6.1 depends on jupyter-bokeh~=3.0.5
]

db-connectors = [
"pymongo~=4.11.2",
"psycopg~=3.2.5",
"pyodbc~=5.2.0",
"mysql-connector-python~=9.3.0",
]

# onnxconverter-common ~=1.13.0 required for skl2onnx, as upgraded version is not compatible with protobuf
datascience-base = [
"boto3~=1.37.8",
"kafka-python-ng~=2.2.3",
"kfp~=2.12.1",
"plotly~=6.0.0",
"scipy~=1.15.2",
"skl2onnx~=1.18.0",
"onnxconverter-common~=1.13.0",
"kubeflow-training==1.9.0",
]

codeflare = [
"codeflare-sdk~=0.30.0",
]

datascience-preferred = [
"matplotlib~=3.10.1",
"numpy~=2.2.3",
"pandas~=2.2.3",
"scikit-learn~=1.6.1",
]

datascience-tensorflow = [
"matplotlib~=3.10.1",
"numpy~=1.26.4",
"pandas~=2.2.3",
"scikit-learn~=1.6.1",
]

datascience-trustyai = [
"matplotlib~=3.6.3",
"numpy~=1.24.1",
"pandas~=1.5.3",
"scikit-learn~=1.7.0"
]

tensorflowcuda = [
"tensorflow[and-cuda]~=2.18.0",
"tensorboard~=2.18.0",
"tf2onnx~=1.16.1",
]
tensorflowrocm = [
"tensorflow-rocm~=2.18.1",
"tensorboard~=2.18.0",
"tf2onnx~=1.16.1",
]
pytorchcuda = [
"tensorboard~=2.19.0",
"torch==2.6.0",
"torchvision==0.21.0",
]
pytorchrocm = [
"tensorboard~=2.18.0",
"torch==2.6.0",
"torchvision==0.21.0",
"pytorch-triton-rocm~=3.2.0",
]
llmcompressor = [
"vllm~=0.8.5",
"llmcompressor~=0.6.0",
"lm-eval~=0.4.8",
"loguru",
"pyyaml>=5.0.0",
"requests>=2.0.0",
"tqdm>=4.0.0",
"transformers>4.0,<5.0",
"datasets",
"accelerate>=0.20.3,!=1.1.0",
"pynvml",
"pillow",
"compressed-tensors",
]
trustyai = [
"torch==2.6.0",
"transformers~=4.53.0; python_version == '3.11'",
"transformers~=4.55.0; python_version == '3.12'",
"datasets~=3.4.1",
"accelerate~=1.5.2",
"trustyai~=0.6.1",
]

##########################
# Workbench Image Groups #
##########################

# https://docs.astral.sh/uv/concepts/projects/workspaces/#when-not-to-use-workspaces

jupyter-minimal-image = [
{ include-group = "base" },
{ include-group = "jupyter-base" },
]

jupyter-datascience-image = [
{ include-group = "base" },
{ include-group = "jupyter-base" },
{ include-group = "elyra-base" },
{ include-group = "elyra-preferred" },
{ include-group = "datascience-base" },
{ include-group = "datascience-preferred" },
{ include-group = "codeflare" },
{ include-group = "db-connectors" },
]

jupyter-tensorflow-image = [
{ include-group = "base" },
{ include-group = "jupyter-base" },
{ include-group = "elyra-base" },
{ include-group = "elyra-preferred" },
{ include-group = "datascience-base" },
{ include-group = "datascience-tensorflow" },
{ include-group = "codeflare" },
{ include-group = "db-connectors" },
{ include-group = "tensorflowcuda" },
]

jupyter-tensorflow-rocm-image = [
{ include-group = "base" },
{ include-group = "jupyter-base" },
{ include-group = "elyra-base" },
{ include-group = "elyra-preferred" },
{ include-group = "datascience-base" },
{ include-group = "datascience-tensorflow" },
{ include-group = "codeflare" },
{ include-group = "db-connectors" },
{ include-group = "tensorflowrocm" },
]

jupyter-pytorch-image = [
{ include-group = "base" },
{ include-group = "jupyter-base" },
{ include-group = "elyra-base" },
{ include-group = "elyra-preferred" },
{ include-group = "datascience-base" },
{ include-group = "datascience-preferred" },
{ include-group = "codeflare" },
{ include-group = "db-connectors" },
{ include-group = "pytorchcuda" },
]

jupyter-pytorch-rocm-image = [
{ include-group = "base" },
{ include-group = "jupyter-base" },
{ include-group = "elyra-base" },
{ include-group = "elyra-preferred" },
{ include-group = "datascience-base" },
{ include-group = "datascience-preferred" },
{ include-group = "codeflare" },
{ include-group = "db-connectors" },
{ include-group = "pytorchrocm" },
]

jupyter-pytorch-llmcompressor-image = [
{ include-group = "base" },
{ include-group = "jupyter-base" },
{ include-group = "elyra-base" },
{ include-group = "elyra-preferred" },
{ include-group = "datascience-base" },
{ include-group = "datascience-tensorflow" },
{ include-group = "db-connectors" },
{ include-group = "llmcompressor" },
{ include-group = "pytorchcuda" },
]

jupyter-trustyai-image = [
{ include-group = "base" },
{ include-group = "jupyter-base" },
{ include-group = "elyra-base" },
{ include-group = "elyra-trustyai" },
{ include-group = "datascience-base" },
{ include-group = "datascience-trustyai" },
{ include-group = "codeflare" },
{ include-group = "db-connectors" },
{ include-group = "trustyai" },
{ include-group = "pytorchcuda" },
]

# https://docs.astral.sh/uv/concepts/projects/dependencies/#dependency-sources
[tool.uv.sources]

# NOTE: it is important to specify the `index` for the top-level groups, the ones used in the final resolution.
# Index values do not inherit from a lower-level group to the one where it is included.

# https://docs.astral.sh/uv/guides/integration/pytorch/#using-uv-with-pytorch
torch = [
{ index = "pytorch-cuda", group = "jupyter-pytorch-image" },
{ index = "pytorch-cuda", group = "jupyter-pytorch-llmcompressor-image" },
{ index = "pytorch-cuda", group = "jupyter-trustyai-image" },

{ index = "pytorch-rocm", group = "jupyter-pytorch-rocm-image" },
]
torchvision = [
{ index = "pytorch-cuda", group = "jupyter-pytorch-image" },
{ index = "pytorch-cuda", group = "jupyter-pytorch-llmcompressor-image" },
{ index = "pytorch-cuda", group = "jupyter-trustyai-image" },

{ index = "pytorch-rocm", group = "jupyter-pytorch-rocm-image" },
]
pytorch-triton-rocm = [
{ index = "pytorch-rocm", group = "jupyter-pytorch-rocm-image" },
]
tensorflow-rocm = [
{ url = "https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl", group = "jupyter-tensorflow-rocm-image" },
]

# https://docs.astral.sh/uv/concepts/indexes/#package-indexes
# TODO(jdanek): explicit = false, otherwise `uv pip compile --emit-index-url` won't emit it
# also see https://github.com/astral-sh/uv/issues/10008, https://github.com/astral-sh/uv/issues/15534
[[tool.uv.index]]
name = "pytorch-cuda"
url = "https://download.pytorch.org/whl/cu126"
explicit = true

[[tool.uv.index]]
name = "pytorch-rocm"
url = "https://download.pytorch.org/whl/rocm6.2.4"
explicit = true

[[tool.uv.index]]
name = "pypi"
url = "https://pypi.org/simple/"
explicit = true

[[tool.uv.dependency-metadata]]
name = "tf2onnx"
version = "1.16.1"
requires-dist = ["protobuf"]

[[tool.uv.dependency-metadata]]
name = "vllm"
version = "0.8.5"
requires-dist = ["compressed-tensors"]

[[tool.uv.dependency-metadata]]
name = "tensorflow-rocm"
version = "2.18.1"
requires-dist = []

[tool.uv]
package = false

# https://docs.astral.sh/uv/concepts/resolution/#platform-specific-resolution
environments = [
"sys_platform == 'darwin'",
"sys_platform == 'linux'",
"sys_platform == 'linux' and implementation_name == 'cpython'",
]

# https://github.com/astral-sh/uv/issues/3957#issuecomment-2659350181