RHAIENG-2860: Consolidate dependencies in artifact input #3046
ysok wants to merge 1 commit into opendatahub-io:main
Conversation
[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by: (no approvers yet). The full list of commands accepted by this bot can be found here.

Details: Needs approval from an approver in each of these files. Approvers can indicate their approval by writing `/approve` in a comment.
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the repository settings. Use the following commands to manage reviews, or the checkboxes below for quick actions.
📝 Walkthrough

The changes refactor Node.js and ripgrep handling in the code-server build pipeline by switching from hermetic versions to system packages, consolidating ripgrep to a single version with architecture-specific patching, introducing custom npm dependency prefetching configuration, and expanding Tekton pipeline resource allocations.
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes

🚥 Pre-merge checks: ✅ 3 passed | ❌ 1 failed (1 warning)

✏️ Tip: You can configure your own custom pre-merge checks in the settings.

✨ Finishing Touches: 🧪 Generate unit tests (beta)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
Actionable comments posted: 7
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@.tekton/odh-workbench-codeserver-datascience-cpu-py312-ubi9-pull-request.yaml`:
- Around line 175-177: Update the inline comment that currently reads "same URLs
as artifacts.in.yaml" to reference the new source location
"custom-packages/package.json" instead; locate the comment adjacent to the npm
entry for
"codeserver/ubi9-python-3.12/prefetch-input/patches/code-server-v4.106.3/custom-packages"
and replace the mention of artifacts.in.yaml with a clear note that the GitHub
tarball deps are now defined in custom-packages/package.json.
In `@codeserver/ubi9-python-3.12/prefetch-input/odh/artifacts.lock.yaml`:
- Around line 38-40: The ripgrep v13.0.0-13 artifact entries have a libc
mismatch: the download_url targets unknown-linux-gnu but the filename fields use
unknown-linux-musl; update the filename values for the ripgrep v13.0.0-13
entries (currently named ripgrep-v13.0.0-13-aarch64-unknown-linux-musl.tar.gz,
ripgrep-v13.0.0-13-powerpc64le-unknown-linux-musl.tar.gz,
ripgrep-v13.0.0-13-s390x-unknown-linux-musl.tar.gz) to match the gnu URLs
(change musl → gnu) OR alternatively update the corresponding download_url
entries to point to the musl releases so download_url and filename are
consistent.
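The gnu/musl consistency rule described above can be checked mechanically. Below is a minimal sketch (a hypothetical helper, not part of the repo's scripts) that flags lock entries whose download_url and filename disagree on the libc triplet:

```python
import re

# Hypothetical helper: flag lock entries whose download_url and filename
# disagree on the target triplet (gnu vs musl).
TRIPLET_RE = re.compile(r"unknown-linux-(gnu|musl)")

def libc_mismatches(entries):
    """entries: list of {'download_url': ..., 'filename': ...} dicts."""
    bad = []
    for e in entries:
        url_m = TRIPLET_RE.search(e["download_url"])
        name_m = TRIPLET_RE.search(e["filename"])
        if url_m and name_m and url_m.group(1) != name_m.group(1):
            bad.append(e["filename"])
    return bad

entries = [
    {"download_url": ".../ripgrep-v13.0.0-13-aarch64-unknown-linux-gnu.tar.gz",
     "filename": "ripgrep-v13.0.0-13-aarch64-unknown-linux-musl.tar.gz"},
    {"download_url": ".../ripgrep-v13.0.0-13-s390x-unknown-linux-gnu.tar.gz",
     "filename": "ripgrep-v13.0.0-13-s390x-unknown-linux-gnu.tar.gz"},
]
print(libc_mismatches(entries))
```

Such a check could run in CI against both artifacts.lock.yaml files to prevent the mismatch from regressing.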
In `@Makefile`:
- Around line 98-100: The echo message in the Makefile preflight check contains
a typographic em dash (—) that can be mis-parsed if copied into the suggested
command; update the string to use an ASCII separator (e.g. double hyphen "--" or
single hyphen "-") in the hint so the suggested invocation of
scripts/lockfile-generators/prefetch-all.sh --component-dir $(patsubst
%/,%,$(BUILD_DIR)) parses correctly; change the message text around BUILD_DIR
and the script path to replace the em dash with the chosen ASCII separator.
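The em-dash hazard is easy to guard against generically. A minimal sketch (hypothetical helper; the function name and mapping are assumptions, not repo code) that normalizes typographic dashes in a hint string before it is echoed:

```python
# Hypothetical helper: make a Makefile hint copy-paste safe by replacing
# typographic dashes with ASCII equivalents before echoing it.
DASH_MAP = {"\u2014": "--", "\u2013": "-"}  # em dash, en dash

def asciify_hint(text):
    for dash, repl in DASH_MAP.items():
        text = text.replace(dash, repl)
    return text

hint = "run scripts/lockfile-generators/prefetch-all.sh \u2014component-dir build/"
print(asciify_hint(hint))
```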
In `@scripts/lockfile-generators/download-npm.sh`:
- Around line 199-201: The script currently exits early when npm_paths is empty
(echo ...; exit 0), which prevents running the --lock-file path; change the
logic so the script only exits when npm_paths is empty AND no --lock-file option
was supplied. Concretely, in the block checking npm_paths (the variables
npm_paths and TEKTON_FILE), replace the unconditional exit with a conditional
that checks whether the --lock-file flag/argument (the variable your parser
stores for that flag) is empty; if --lock-file is present, skip the download
step but continue to the lock-file generation flow instead of exiting.
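The corrected control flow can be modeled as a small predicate. This is a hypothetical Python model of the shell logic, not the script itself; names are assumptions:

```python
# Hypothetical model of the corrected control flow in download-npm.sh:
# skip the download step when there is nothing to download, but only exit
# early when no --lock-file option was supplied either.
def should_exit_early(npm_paths, lock_file):
    """npm_paths: npm paths found in the Tekton file; lock_file: --lock-file value or None."""
    return not npm_paths and lock_file is None

# No npm paths and no --lock-file: nothing to do, exit early.
print(should_exit_early([], None))
# No npm paths but --lock-file given: continue to lock-file generation.
print(should_exit_early([], "package-lock.json"))
```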
In `@scripts/lockfile-generators/README.md`:
- Around line 754-756: The script uses a relative "cd ../.." which only moves to
codeserver/ubi9-python-3.12 instead of the repo root, so the subsequent "git add
-A ." may miss files like .gitmodules; change the directory change to ensure you
move to the repository root before running "git add -A ." (e.g., replace the "cd
../.." step with a deterministic repo-root change) so that the "git add -A ."
and following "git commit -m" operate from the repository root and stage
.gitmodules as intended; update the lines containing the "cd ../.." command and
the following "git add -A ." / "git commit -m" sequence.
In `@scripts/lockfile-generators/rewrite-npm-urls.sh`:
- Around line 113-119: The replacement regexes that construct cache filenames
(the three perl -i -pe lines updating codeload.tar.gz, git+ssh|https resolved,
and GitHub shortname refs) currently insert raw ref captures ($3/$4) into
filenames; update each substitution so the captured ref is sanitized (e.g.,
replace "/" with a safe character like "-" or URL-encode "%2F") before embedding
into the "${CACHI2_BASE}/...-...-<ref>.tgz" path, ensuring refs with slashes do
not create nested path segments and will match the prefetched artifact layout.
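The requested sanitization can be sketched as follows. This is a hypothetical Python illustration of the idea (the actual script uses perl substitutions); the cache path layout is taken from the comment above, and the helper name is invented:

```python
from urllib.parse import quote

# Hypothetical sketch: embed a git ref into a cachi2 cache filename without
# letting "/" in the ref create nested path segments.
def cache_filename(base, pkg, ref):
    safe_ref = quote(ref, safe="")  # "feature/x" -> "feature%2Fx"
    return f"{base}/{pkg}-{safe_ref}.tgz"

print(cache_filename("/cachi2/output/deps/npm", "css-parser", "feature/fix-1"))
```

Whatever character or encoding is chosen, it must match the layout the prefetch step actually produces, so the two sides should share one sanitization rule.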
In `@tests/containers/workbenches/jupyterlab/jupyterlab_datascience_test.py`:
- Line 87: Fix the skip message on the pytest.skip call so it reads clearly and
has no stray trailing quote: update the message produced by the pytest.skip in
the test (the line using pytest.skip and referencing datascience_image.name and
datascience_image.labels['name']) to something like "Image
{datascience_image.name} does not have '-rstudio-' in
{datascience_image.labels['name']}" (remove the extra trailing quote and change
"does have" to "does not have").
ℹ️ Review info
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Cache: Disabled due to data retention organization setting
Knowledge base: Disabled due to data retention organization setting
⛔ Files ignored due to path filters (1)
codeserver/ubi9-python-3.12/prefetch-input/patches/code-server-v4.106.3/custom-packages/package-lock.json is excluded by `!**/package-lock.json`
📒 Files selected for processing (18)
- .github/workflows/build-notebooks-TEMPLATE.yaml
- .tekton/odh-workbench-codeserver-datascience-cpu-py312-ubi9-pull-request.yaml
- .tekton/odh-workbench-codeserver-datascience-cpu-py312-ubi9-push.yaml
- Makefile
- codeserver/ubi9-python-3.12/Dockerfile.cpu
- codeserver/ubi9-python-3.12/Dockerfile.konflux.cpu
- codeserver/ubi9-python-3.12/prefetch-input/odh/artifacts.in.yaml
- codeserver/ubi9-python-3.12/prefetch-input/odh/artifacts.lock.yaml
- codeserver/ubi9-python-3.12/prefetch-input/patches/code-server-v4.106.3/custom-packages/README.md
- codeserver/ubi9-python-3.12/prefetch-input/patches/code-server-v4.106.3/custom-packages/package.json
- codeserver/ubi9-python-3.12/prefetch-input/rhds/artifacts.in.yaml
- codeserver/ubi9-python-3.12/prefetch-input/rhds/artifacts.lock.yaml
- codeserver/ubi9-python-3.12/uv.lock.d/pylock.cpu.toml
- scripts/lockfile-generators/README.md
- scripts/lockfile-generators/download-npm.sh
- scripts/lockfile-generators/prefetch-all.sh
- scripts/lockfile-generators/rewrite-npm-urls.sh
- tests/containers/workbenches/jupyterlab/jupyterlab_datascience_test.py
💤 Files with no reviewable changes (5)
- codeserver/ubi9-python-3.12/prefetch-input/rhds/artifacts.lock.yaml
- codeserver/ubi9-python-3.12/Dockerfile.konflux.cpu
- codeserver/ubi9-python-3.12/prefetch-input/rhds/artifacts.in.yaml
- codeserver/ubi9-python-3.12/prefetch-input/odh/artifacts.in.yaml
- codeserver/ubi9-python-3.12/Dockerfile.cpu
Resolved review threads (outdated):
- .tekton/odh-workbench-codeserver-datascience-cpu-py312-ubi9-pull-request.yaml
- tests/containers/workbenches/jupyterlab/jupyterlab_datascience_test.py
Force-pushed 94f89bc to f31ee29 (compare).
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
codeserver/ubi9-python-3.12/prefetch-input/rhds/artifacts.lock.yaml (1)
53-61: ⚠️ Potential issue | 🟡 Minor

Align the aarch64 ripgrep filename with its GNU URL.

Line 53 specifies aarch64-unknown-linux-gnu.tar.gz in the download URL, but line 55 has aarch64-unknown-linux-musl.tar.gz as the filename. This mismatch can cause prefetch/packaging tools to fail or fetch the wrong artifact. The powerpc64le and s390x entries are already correct.

Proposed fix:

```diff
 - download_url: https://github.com/microsoft/ripgrep-prebuilt/releases/download/v13.0.0-13/ripgrep-v13.0.0-13-aarch64-unknown-linux-gnu.tar.gz
   checksum: sha256:1b0ca509f8707f2128f1b3ef245c3ea666d49a737431288536d49bd74652d143
-  filename: ripgrep-v13.0.0-13-aarch64-unknown-linux-musl.tar.gz
+  filename: ripgrep-v13.0.0-13-aarch64-unknown-linux-gnu.tar.gz
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@codeserver/ubi9-python-3.12/prefetch-input/rhds/artifacts.lock.yaml` around lines 53 - 61, The aarch64 artifact entry has a filename mismatch: the download_url uses "aarch64-unknown-linux-gnu.tar.gz" but the filename field is "aarch64-unknown-linux-musl.tar.gz"; update the filename in the artifact entry to "ripgrep-v13.0.0-13-aarch64-unknown-linux-gnu.tar.gz" to match the download_url (and re-verify the checksum for the corrected file if needed) so the prefetch/packaging tools find the correct artifact.
🧹 Nitpick comments (1)
codeserver/ubi9-python-3.12/prefetch-input/odh/artifacts.in.yaml (1)
33-36: Consider adding an inline intent note for GNU filenames on the remaining arches.

Line 34 and Line 36 changed naming behavior, but unlike Line 30 there's no local rationale. A short comment would reduce future regression risk.

Suggested diff:

```diff
 - url: https://github.com/microsoft/ripgrep-prebuilt/releases/download/v13.0.0-13/ripgrep-v13.0.0-13-aarch64-unknown-linux-gnu.tar.gz
   filename: ripgrep-v13.0.0-13-aarch64-unknown-linux-musl.tar.gz
+# ppc64le/s390x intentionally use GNU triplet filenames.
 - url: https://github.com/microsoft/ripgrep-prebuilt/releases/download/v13.0.0-13/ripgrep-v13.0.0-13-powerpc64le-unknown-linux-gnu.tar.gz
   filename: ripgrep-v13.0.0-13-powerpc64le-unknown-linux-gnu.tar.gz
 - url: https://github.com/microsoft/ripgrep-prebuilt/releases/download/v13.0.0-13/ripgrep-v13.0.0-13-s390x-unknown-linux-gnu.tar.gz
   filename: ripgrep-v13.0.0-13-s390x-unknown-linux-gnu.tar.gz
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@codeserver/ubi9-python-3.12/prefetch-input/odh/artifacts.in.yaml` around lines 33 - 36, Add a short inline comment in artifacts.in.yaml near the ripgrep entries explaining the intentional GNU-style filename change for the powerpc64le and s390x artifacts (the entries with filename ripgrep-v13.0.0-13-powerpc64le-unknown-linux-gnu.tar.gz and ripgrep-v13.0.0-13-s390x-unknown-linux-gnu.tar.gz) so future maintainers know this naming differs from other arches (as noted on the earlier entry around Line 30); place the comment directly above those two URL/filename pairs and state the rationale (compatibility with GNU toolchains / upstream naming) and that it is deliberate to avoid accidental reversion.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Outside diff comments:
In `@codeserver/ubi9-python-3.12/prefetch-input/rhds/artifacts.lock.yaml`:
- Around line 53-61: The aarch64 artifact entry has a filename mismatch: the
download_url uses "aarch64-unknown-linux-gnu.tar.gz" but the filename field is
"aarch64-unknown-linux-musl.tar.gz"; update the filename in the artifact entry
to "ripgrep-v13.0.0-13-aarch64-unknown-linux-gnu.tar.gz" to match the
download_url (and re-verify the checksum for the corrected file if needed) so
the prefetch/packaging tools find the correct artifact.
---
Nitpick comments:
In `@codeserver/ubi9-python-3.12/prefetch-input/odh/artifacts.in.yaml`:
- Around line 33-36: Add a short inline comment in artifacts.in.yaml near the
ripgrep entries explaining the intentional GNU-style filename change for the
powerpc64le and s390x artifacts (the entries with filename
ripgrep-v13.0.0-13-powerpc64le-unknown-linux-gnu.tar.gz and
ripgrep-v13.0.0-13-s390x-unknown-linux-gnu.tar.gz) so future maintainers know
this naming differs from other arches (as noted on the earlier entry around Line
30); place the comment directly above those two URL/filename pairs and state the
rationale (compatibility with GNU toolchains / upstream naming) and that it is
deliberate to avoid accidental reversion.
⛔ Files ignored due to path filters (1)
codeserver/ubi9-python-3.12/prefetch-input/patches/code-server-v4.106.3/custom-packages/package-lock.json is excluded by `!**/package-lock.json`
📒 Files selected for processing (15)
- .tekton/odh-workbench-codeserver-datascience-cpu-py312-ubi9-pull-request.yaml
- .tekton/odh-workbench-codeserver-datascience-cpu-py312-ubi9-push.yaml
- codeserver/ubi9-python-3.12/Dockerfile.cpu
- codeserver/ubi9-python-3.12/Dockerfile.konflux.cpu
- codeserver/ubi9-python-3.12/prefetch-input/odh/artifacts.in.yaml
- codeserver/ubi9-python-3.12/prefetch-input/odh/artifacts.lock.yaml
- codeserver/ubi9-python-3.12/prefetch-input/patches/apply-patch.sh
- codeserver/ubi9-python-3.12/prefetch-input/patches/code-server-v4.106.3/custom-packages/README.md
- codeserver/ubi9-python-3.12/prefetch-input/patches/code-server-v4.106.3/custom-packages/package.json
- codeserver/ubi9-python-3.12/prefetch-input/patches/code-server-v4.106.3/ripgrep/VERIFICATION.md
- codeserver/ubi9-python-3.12/prefetch-input/patches/code-server-v4.106.3/ripgrep/postinstall.js
- codeserver/ubi9-python-3.12/prefetch-input/patches/setup-offline-binaries.sh
- codeserver/ubi9-python-3.12/prefetch-input/rhds/artifacts.in.yaml
- codeserver/ubi9-python-3.12/prefetch-input/rhds/artifacts.lock.yaml
- scripts/lockfile-generators/rewrite-npm-urls.sh
💤 Files with no reviewable changes (2)
- codeserver/ubi9-python-3.12/Dockerfile.cpu
- codeserver/ubi9-python-3.12/Dockerfile.konflux.cpu
🚧 Files skipped from review as they are similar to previous changes (2)
- codeserver/ubi9-python-3.12/prefetch-input/patches/code-server-v4.106.3/custom-packages/package.json
- .tekton/odh-workbench-codeserver-datascience-cpu-py312-ubi9-pull-request.yaml
/kfbuild codeserver/ubi9-python-3.12
Force-pushed 5aad4a1 to 0bcaaa1 (compare).
Actionable comments posted: 1
♻️ Duplicate comments (2)
.tekton/odh-workbench-codeserver-datascience-cpu-py312-ubi9-pull-request.yaml (1)
175-177: ⚠️ Potential issue | 🟡 Minor

Update stale source-of-truth comment for tarball deps.

Line 175 still says these URLs are in artifacts.in.yaml, but this PR moves them to custom-packages/package.json.

✏️ Proposed wording fix:

```diff
-# GitHub tarball deps (git+ssh not available in Konflux); same URLs as artifacts.in.yaml
+# GitHub tarball deps (git+ssh not available in Konflux); URLs are pinned in custom-packages/package.json
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.tekton/odh-workbench-codeserver-datascience-cpu-py312-ubi9-pull-request.yaml around lines 175 - 177, Update the stale comment "GitHub tarball deps (git+ssh not available in Konflux); same URLs as artifacts.in.yaml" to reflect the new source-of-truth by referencing "custom-packages/package.json" instead of "artifacts.in.yaml" so the comment above the tarball deps block (the line beginning with GitHub tarball deps) accurately describes where the URLs now live.

codeserver/ubi9-python-3.12/prefetch-input/patches/setup-offline-binaries.sh (1)
66-73: ⚠️ Potential issue | 🟠 Major

Add strict Node target/version guard before caching system Node.

Lines 66-73 cache /usr/bin/node without verifying it matches the VSCode target version. That can silently break hermeticity when system Node drifts.

🔧 Proposed fix:

```diff
 NODE_BUILD_VERSION=$(grep -E '^target=' "${CODESERVER_SOURCE_PREFETCH}/lib/vscode/remote/.npmrc" | cut -d'"' -f2)
-NODE_ARCH=$(node -p "process.arch")
+if [[ -z "${NODE_BUILD_VERSION}" ]]; then
+  echo "ERROR: Failed to resolve VSCode Node target version from remote/.npmrc"
+  exit 1
+fi
+if [[ ! -x /usr/bin/node ]]; then
+  echo "ERROR: /usr/bin/node is missing or not executable"
+  exit 1
+fi
+NODE_ARCH=$(/usr/bin/node -p "process.arch")
+SYSTEM_NODE_VERSION=$(/usr/bin/node -p "process.versions.node")
+if [[ "${SYSTEM_NODE_VERSION}" != "${NODE_BUILD_VERSION}" ]]; then
+  echo "ERROR: System node (${SYSTEM_NODE_VERSION}) != VSCode target (${NODE_BUILD_VERSION})"
+  exit 1
+fi
 VSCODE_BUILD_DIR="${CODESERVER_SOURCE_PREFETCH}/lib/vscode/.build"
 NODE_CACHE_DIR="${VSCODE_BUILD_DIR}/node/v${NODE_BUILD_VERSION}/linux-${NODE_ARCH}"
 mkdir -p "${NODE_CACHE_DIR}"
 cp /usr/bin/node "${NODE_CACHE_DIR}/node"
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@codeserver/ubi9-python-3.12/prefetch-input/patches/setup-offline-binaries.sh` around lines 66 - 73, Validate the system Node version before caching: ensure /usr/bin/node exists, read the runtime version from system node (e.g. via node -p "process.versions.node" or node -v normalized) and compare it to the expected NODE_BUILD_VERSION variable; if it does not match, emit an error message and exit non‑zero instead of copying, otherwise proceed to create NODE_CACHE_DIR and copy/chmod as currently done (references: NODE_BUILD_VERSION, NODE_ARCH, VSCODE_BUILD_DIR, NODE_CACHE_DIR and the block that copies /usr/bin/node).
🧹 Nitpick comments (1)
codeserver/ubi9-python-3.12/prefetch-input/patches/code-server-v4.106.3/ci/build/build-vscode.sh (1)
76-127: Clean up temporary product.original.json file.

The script creates product.original.json at line 77 but never removes it. This leaves an orphan file in the repository after a successful build. Additionally, if the script fails between lines 77-127 due to set -e, both files could be left in an inconsistent state.

♻️ Proposed fix using a trap for reliable cleanup:

```diff
 main() {
   cd "$(dirname "${0}")/../.."
   source ./ci/lib.sh
+
+  # Ensure cleanup on exit (success or failure)
+  cleanup() {
+    if [[ -f lib/vscode/product.original.json ]]; then
+      rm -f lib/vscode/product.original.json
+    fi
+  }
+  trap cleanup EXIT

   # Build for current arch (x64, arm64, ppc64, s390x) so system Node can be used.
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@codeserver/ubi9-python-3.12/prefetch-input/patches/code-server-v4.106.3/ci/build/build-vscode.sh` around lines 76 - 127, Create a trap to always remove the temporary product.original.json after it's created so it can't be left behind on success or failure: immediately after the cp product.json product.original.json line, add a trap like trap 'rm -f product.original.json' EXIT (or EXIT/ERR) so the file is cleaned on script exit; keep the existing jq ... ) > product.json and final git checkout product.json unchanged so the trap simply removes the temporary product.original.json regardless of how the script terminates.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@codeserver/ubi9-python-3.12/prefetch-input/odh/artifacts.lock.yaml`:
- Line 6: The checksum entry for the artifact `RPM-GPG-KEY-EPEL-9` in the
`checksum` field is the empty-content SHA256 and must be replaced with the real
digest: re-download the actual RPM-GPG-KEY-EPEL-9 file, compute its sha256
(e.g., via sha256sum or an equivalent tool), and update the `checksum` value
(the `checksum: sha256:...` line) with the computed hash so hermetic prefetch
integrity verification passes.
---
Duplicate comments:
In
@.tekton/odh-workbench-codeserver-datascience-cpu-py312-ubi9-pull-request.yaml:
- Around line 175-177: Update the stale comment "GitHub tarball deps (git+ssh
not available in Konflux); same URLs as artifacts.in.yaml" to reflect the new
source-of-truth by referencing "custom-packages/package.json" instead of
"artifacts.in.yaml" so the comment above the tarball deps block (the line
beginning with GitHub tarball deps) accurately describes where the URLs now
live.
In
`@codeserver/ubi9-python-3.12/prefetch-input/patches/setup-offline-binaries.sh`:
- Around line 66-73: Validate the system Node version before caching: ensure
/usr/bin/node exists, read the runtime version from system node (e.g. via node
-p "process.versions.node" or node -v normalized) and compare it to the expected
NODE_BUILD_VERSION variable; if it does not match, emit an error message and
exit non‑zero instead of copying, otherwise proceed to create NODE_CACHE_DIR and
copy/chmod as currently done (references: NODE_BUILD_VERSION, NODE_ARCH,
VSCODE_BUILD_DIR, NODE_CACHE_DIR and the block that copies /usr/bin/node).
---
Nitpick comments:
In
`@codeserver/ubi9-python-3.12/prefetch-input/patches/code-server-v4.106.3/ci/build/build-vscode.sh`:
- Around line 76-127: Create a trap to always remove the temporary
product.original.json after it's created so it can't be left behind on success
or failure: immediately after the cp product.json product.original.json line,
add a trap like trap 'rm -f product.original.json' EXIT (or EXIT/ERR) so the
file is cleaned on script exit; keep the existing jq ... ) > product.json and
final git checkout product.json unchanged so the trap simply removes the
temporary product.original.json regardless of how the script terminates.
📒 Files selected for processing (11)
- .tekton/odh-workbench-codeserver-datascience-cpu-py312-ubi9-pull-request.yaml
- .tekton/odh-workbench-codeserver-datascience-cpu-py312-ubi9-push.yaml
- codeserver/ubi9-python-3.12/Dockerfile.cpu
- codeserver/ubi9-python-3.12/Dockerfile.konflux.cpu
- codeserver/ubi9-python-3.12/prefetch-input/odh/artifacts.in.yaml
- codeserver/ubi9-python-3.12/prefetch-input/odh/artifacts.lock.yaml
- codeserver/ubi9-python-3.12/prefetch-input/patches/code-server-v4.106.3/ci/build/build-vscode.sh
- codeserver/ubi9-python-3.12/prefetch-input/patches/setup-offline-binaries.sh
- codeserver/ubi9-python-3.12/prefetch-input/rhds/artifacts.in.yaml
- codeserver/ubi9-python-3.12/prefetch-input/rhds/artifacts.lock.yaml
- docs/konflux.md
🚧 Files skipped from review as they are similar to previous changes (2)
- .tekton/odh-workbench-codeserver-datascience-cpu-py312-ubi9-push.yaml
- codeserver/ubi9-python-3.12/Dockerfile.konflux.cpu
```diff
 artifacts:
   - download_url: https://dl.fedoraproject.org/pub/epel/RPM-GPG-KEY-EPEL-9
-    checksum: sha256:fcf0eab4f05a1c0de6363ac4b707600a27a9d774e9b491059e59e6921b255a84
+    checksum: sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
```
🧩 Analysis chain
🏁 Script executed:

```shell
# First, let's locate and examine the file
fd "artifacts.lock.yaml" -p "ubi9-python-3.12"
```

Repository: opendatahub-io/notebooks
Length of output: 171
🏁 Script executed:

```shell
# Check the content around line 6 of the artifacts.lock.yaml file
find . -path "*/ubi9-python-3.12/prefetch-input/odh/artifacts.lock.yaml" -exec cat -n {} \; | head -20
```

Repository: opendatahub-io/notebooks
Length of output: 1437
🏁 Script executed:

```shell
# Let's also verify the empty-content SHA256 hash claim
python3 <<'EOF'
import hashlib
empty_hash = hashlib.sha256(b'').hexdigest()
print(f"SHA256 of empty content: {empty_hash}")
EOF
```

Repository: opendatahub-io/notebooks
Length of output: 157
🏁 Script executed:

```shell
#!/bin/bash
set -euo pipefail
url="https://dl.fedoraproject.org/pub/epel/RPM-GPG-KEY-EPEL-9"
empty_sha="e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
tmp="$(mktemp)"
trap 'rm -f "$tmp"' EXIT
curl -fsSL "$url" -o "$tmp" 2>&1
actual_sha="$(sha256sum "$tmp" | awk '{print $1}')"
echo "Actual SHA256 of RPM-GPG-KEY-EPEL-9: $actual_sha"
echo "Empty content SHA256: $empty_sha"
if [[ "$actual_sha" == "$empty_sha" ]]; then
  echo "ERROR: checksum equals empty-content hash; the file at this URL may be empty or inaccessible."
  exit 1
else
  echo "OK: The file has a valid checksum different from empty content."
  echo "Correct checksum should be: $actual_sha"
fi
```

Repository: opendatahub-io/notebooks
Length of output: 118
🏁 Script executed:
python3 <<'EOF'
import urllib.request
import hashlib
url = "https://dl.fedoraproject.org/pub/epel/RPM-GPG-KEY-EPEL-9"
empty_sha = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
try:
with urllib.request.urlopen(url) as response:
content = response.read()
actual_sha = hashlib.sha256(content).hexdigest()
print(f"Actual SHA256 of RPM-GPG-KEY-EPEL-9: {actual_sha}")
print(f"Empty content SHA256: {empty_sha}")
print(f"File size: {len(content)} bytes")
if actual_sha == empty_sha:
print("ERROR: checksum equals empty-content hash; the file is empty or inaccessible.")
else:
print("OK: The file has a valid checksum different from empty content.")
print(f"Correct checksum for line 6 should be: {actual_sha}")
except Exception as e:
print(f"Error fetching URL: {e}")
EOFRepository: opendatahub-io/notebooks
Length of output: 215
Fix invalid checksum for EPEL key artifact.
Line 6 uses sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855, which is the hash of empty content. This will fail integrity verification and make hermetic prefetch validation unreliable for RPM-GPG-KEY-EPEL-9. Regenerate this lock entry with the actual downloaded file's digest.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@codeserver/ubi9-python-3.12/prefetch-input/odh/artifacts.lock.yaml` at line
6, The checksum entry for the artifact `RPM-GPG-KEY-EPEL-9` in the `checksum`
field is the empty-content SHA256 and must be replaced with the real digest:
re-download the actual RPM-GPG-KEY-EPEL-9 file, compute its sha256 (e.g., via
sha256sum or an equivalent tool), and update the `checksum` value (the
`checksum: sha256:...` line) with the computed hash so hermetic prefetch
integrity verification passes.
Force-pushed ba85c90 to e8645f6 (compare).
Force-pushed e8645f6 to b862672 (compare).
```json
"node_modules/undici": {
  "version": "7.9.0",
  "resolved": "https://registry.npmjs.org/undici/-/undici-7.9.0.tgz",
  "integrity": "sha512-e696y354tf5cFZPXsF26Yg+5M63+5H3oE6Vtkh2oqbvsE2Oe7s2nIbcQh5lmG7Lp/eS29vJtTpw9+p6PX0qNSg==",
  "license": "MIT",
  "engines": {
    "node": ">=20.18.1"
  }
},
```

Check warning (Code scanning / Trivy): undici: Denial of Service via excessive decompression steps. Severity: Medium (library).
Force-pushed 5c10064 to d7b8289 (compare).
…hermetic build
- Add custom-packages (prefetch-input/patches/code-server-v4.106.3/custom-packages/)
for npm deps Konflux cannot fetch via git+ssh (@emmetio/css-parser, @parcel/watcher).
Provide HTTPS GitHub tarball URLs so prefetch stores them under deps/npm/;
rewrite-npm-urls.sh rewrites lockfile refs to file:///cachi2/... paths.
- Use a single ripgrep version (v13.0.0-13) for all four architectures:
- Add patched ripgrep/postinstall.js and apply in apply-patch.sh (overwrite
@vscode/ripgrep postinstall in cached tarball; strip integrity in
lib/vscode, remote, and build package-lock.json).
- Reduce ripgrep artifacts from 6 to 4 in odh/rhds artifacts.in.yaml and
regenerate artifacts.lock.yaml.
- Update setup-offline-binaries.sh and add ripgrep/VERIFICATION.md.
- Tekton: update codeserver-datascience-cpu-py312-ubi9 pipeline YAML as needed.
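The lockfile rewrite described in the first bullet of the commit message can be illustrated with a small sketch. This is a hypothetical Python rendition of the mapping (rewrite-npm-urls.sh itself is a shell script), and the exact cache filename layout under /cachi2 is an assumption:

```python
# Hypothetical sketch: map an HTTPS GitHub tarball URL from package-lock.json
# to the local file:// path where the hermetic prefetch stores the dependency.
CACHI2_BASE = "/cachi2/output/deps/npm"

def to_cachi2_path(resolved):
    # e.g. https://codeload.github.com/<owner>/<repo>/tar.gz/<ref>
    parts = resolved.rstrip("/").split("/")
    owner, repo, ref = parts[3], parts[4], parts[-1]
    return f"file://{CACHI2_BASE}/{owner}-{repo}-{ref}.tgz"

url = "https://codeload.github.com/emmetio/css-parser/tar.gz/abc123"
print(to_cachi2_path(url))
```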
Force-pushed d7b8289 to 5b2969d (compare).
@ysok: The following tests failed.

Full PR test history. Your PR dashboard.

Details: Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository. I understand the commands that are listed here.
RHAIENG-2860: Move custom packages from artifact to .json files
Description
Add hermetic Konflux support: Tekton prefetch + custom-packages (codeload URLs),
drop generic artifacts for css-parser/watcher, update rewrite-npm-urls for
codeload and integrity; document Hermeto layout in custom-packages README.
How Has This Been Tested?
Self checklist (all need to be checked):

- `make test` (`gmake` on macOS) before asking for review
- Changes to `Dockerfile.konflux` files should be done in `odh/notebooks` and automatically synced to `rhds/notebooks`. For Konflux-specific changes, modify `Dockerfile.konflux` files directly in `rhds/notebooks` as these require special attention in the downstream repository and flow to the upcoming RHOAI release.

Merge criteria:
Summary by CodeRabbit
Release Notes
New Features
Improvements
Documentation