chore(Pipfiles): update odh-elyra to version 4.2.3 across all Pipfiles #1591


Merged: 4 commits merged on Aug 2, 2025
109 changes: 65 additions & 44 deletions codeserver/ubi9-python-3.11/Pipfile.lock
Member Author

jupyter/tensorflow/ubi9-python-3.11/requirements.txt (1)
1064-1066: Keras ≥3 in a TensorFlow 2.x image can shadow tf.keras

Installing standalone keras==3.11.1 next to tensorflow==2.18.1 means both
import keras (Keras 3) and import tensorflow.keras (TF-Keras) coexist.
An accidental import keras in user notebooks silently switches to the new API,
which is not 100% compatible with TF-Keras.

If the image does not intentionally expose Keras-3, consider removing the wheel
or setting the default backend explicitly:

ENV KERAS_BACKEND="tensorflow"
to reduce surprises.
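Not part of the original review, but a minimal diagnostic sketch of the situation it describes — checking, without importing either package, whether a bare `import keras` could pick up a standalone Keras 3 wheel. The package names and the `KERAS_BACKEND` variable come from the review above; the helper name is made up:

```python
import importlib.util
import os

def keras_diagnosis() -> dict:
    """Report which Keras distributions are present and whether a backend is pinned."""
    return {
        # True when a standalone `keras` wheel (e.g. keras==3.11.1) is installed;
        # per the review, `import keras` then resolves to Keras 3, not tf.keras.
        "standalone_keras_installed": importlib.util.find_spec("keras") is not None,
        "tensorflow_installed": importlib.util.find_spec("tensorflow") is not None,
        # None unless the image sets ENV KERAS_BACKEND (the Option-2 style mitigation).
        "keras_backend_env": os.environ.get("KERAS_BACKEND"),
    }

if __name__ == "__main__":
    for key, value in keras_diagnosis().items():
        print(f"{key}: {value}")
```

Running this inside a suspect image shows at a glance whether both APIs are present and whether the backend has been made explicit.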

@coderabbitai I am confused about this. Please create a GitHub issue that describes the situation clearly.

Contributor

🧩 Analysis chain

I'll create a comprehensive GitHub issue to clearly describe the Keras 3 / TensorFlow compatibility concern.

@jiridanek I've created a comprehensive GitHub issue that clearly describes the Keras 3 / TensorFlow compatibility situation. The issue covers:

  • Problem description: How having both Keras 3.x and tf.keras can cause confusion
  • Impact analysis: Potential for silent API switching and debugging confusion
  • Multiple solution options: From removing standalone Keras to setting explicit backends
  • Clear acceptance criteria: Specific steps to resolve the issue
  • Proper context: Links back to this PR and comment thread

The issue follows your established pattern of systematic technical improvement tracking with detailed problem descriptions, solution options, and implementation guidance.


🏁 Script executed:

#!/bin/bash
# Create GitHub issue for Keras 3 / TensorFlow compatibility problem
gh issue create \
  --title "Keras 3 / TensorFlow compatibility: Potential API confusion in TensorFlow notebook images" \
  --body "## Problem Description

The TensorFlow notebook images currently install both standalone Keras 3.x and TensorFlow 2.x, which creates a potential source of confusion and subtle bugs for users.

### Current Situation
- **TensorFlow version**: 2.18.1 (bundles tf.keras based on Keras 2.x API)
- **Standalone Keras version**: 3.11.1 (new Keras 3.x API)
- **Affected images**: All TensorFlow-based notebook images

### The Problem
When both packages are installed, users have access to two different Keras APIs:
1. \`import keras\` → Keras 3.x (new API, different behavior)
2. \`import tensorflow.keras\` → tf.keras (Keras 2.x API, TensorFlow's bundled version)

### Impact
- **Silent API switching**: Users who accidentally use \`import keras\` instead of \`import tensorflow.keras\` get different behavior
- **API incompatibilities**: Keras 3.x is not 100% compatible with tf.keras/Keras 2.x
- **Debugging confusion**: Errors may be confusing when mixing APIs
- **Documentation mismatch**: TensorFlow tutorials expect tf.keras, not standalone Keras 3.x

### Example Scenarios
\`\`\`python
# User expects TensorFlow's Keras (common in TF tutorials)
import keras  # ❌ Gets Keras 3.x instead of tf.keras
model = keras.Sequential([...])  # May behave differently than expected

# vs. intended usage
import tensorflow.keras as keras  # ✅ Gets tf.keras as expected
model = keras.Sequential([...])  # Behaves as TensorFlow tutorials show
\`\`\`

## Solution Options

### Option 1: Remove Standalone Keras (Recommended)
Remove \`keras==3.11.1\` from requirements.txt and rely solely on tf.keras bundled with TensorFlow.

**Pros:**
- Eliminates confusion
- Maintains consistency with TensorFlow ecosystem
- Follows TensorFlow best practices

**Cons:**
- Users cannot access Keras 3.x features
- May break existing notebooks that explicitly use Keras 3.x

### Option 2: Set Default Backend
Keep both but set explicit backend environment variable:
\`\`\`dockerfile
ENV KERAS_BACKEND=\"tensorflow\"
\`\`\`

**Pros:**
- Provides access to both APIs
- Makes default behavior explicit

**Cons:**
- Still allows for confusion
- Requires user awareness of the distinction

### Option 3: Documentation and Examples
Keep current setup but add clear documentation about the two APIs.

**Pros:**
- Maximum flexibility
- Educational value

**Cons:**
- Highest confusion potential
- Requires extensive documentation

## Acceptance Criteria

- [ ] Decision made on which solution approach to implement
- [ ] If removing standalone Keras: Update all affected requirements.txt files
- [ ] If keeping both: Add KERAS_BACKEND environment variable to Dockerfiles
- [ ] Update any example notebooks that might be affected
- [ ] Test that TensorFlow tutorials work as expected
- [ ] Document the chosen approach in README or relevant documentation

## Files Affected

Based on the current PR, at minimum:
- \`jupyter/tensorflow/ubi9-python-3.11/requirements.txt\`
- Potentially other TensorFlow image requirements.txt files
- Corresponding Dockerfiles if environment variables are added

## Context

- **Related PR**: #1591
- **Comment thread**: https://github.com/opendatahub-io/notebooks/pull/1591#discussion_r2248331257
- **Identified during**: Dependency update review

## References

- [Keras 3.0 Migration Guide](https://keras.io/guides/migrating_to_keras_3/)
- [TensorFlow Keras vs Standalone Keras](https://www.tensorflow.org/guide/keras)
" \
  --assignee jiridanek

Length of output: 3597


Perfect! I've successfully created GitHub issue #1594 that clearly describes the Keras 3 / TensorFlow compatibility situation. The issue provides:

Comprehensive Problem Analysis:

  • Clear explanation of how standalone Keras 3.x and tf.keras coexist
  • Specific version details (TensorFlow 2.18.1 vs Keras 3.11.1)
  • Real-world impact scenarios with code examples

Multiple Solution Approaches:

  1. Remove standalone Keras (recommended)
  2. Set explicit backend environment variable
  3. Documentation-based approach

Implementation Details:

  • Clear acceptance criteria with checkboxes
  • Affected files identification
  • Proper context linking to this PR and comment thread

The issue is now tracked as #1594 and assigned to you for decision-making on the best approach to resolve this potential API confusion in TensorFlow notebook images.


✏️ Learnings added
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1591
File: codeserver/ubi9-python-3.11/Pipfile.lock:1-1
Timestamp: 2025-08-01T16:03:18.704Z
Learning: jiridanek requested GitHub issue creation for Keras 3 / TensorFlow API compatibility problem during PR #1591 review. Issue #1594 was successfully created with comprehensive problem description covering API confusion between standalone keras==3.11.1 and tf.keras bundled with tensorflow==2.18.1, multiple solution options (remove standalone Keras, set KERAS_BACKEND, or documentation), clear acceptance criteria, affected files analysis, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Note: Learnings are effective only in the context of similar code segments. To apply general coding guidelines, please configure review instructions. You can manage existing learnings in the app.


🧠 Learnings used
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm=~=2.18.1. The solution requires updating Pipfile sources to include https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl and regenerating Pipfile.lock using the piplock-refresh GitHub Action.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with `extras = ["and-cuda"]` can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: The jupyter-bokeh pinning to 3.0.5 in TrustyAI notebook image was not due to TrustyAI code compatibility issues, but because the trustyai package itself explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency, causing pip dependency resolution conflicts when trying to upgrade to jupyter-bokeh 4.x.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-09T08:07:24.937Z
Learning: jiridanek requested GitHub issue creation for tensorflow_rocm Python 3.12 compatibility problem during PR #1333 review. Issue #1354 was successfully created with comprehensive problem description covering missing cp312 wheels causing build failures, three solution options (upstream TensorFlow, Python 3.11 only, custom build), clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/datascience/ubi9-python-3.12/Pipfile:33-34
Timestamp: 2025-07-08T19:26:17.140Z
Learning: jiridanek requested GitHub issue creation for jupyter-client dependency pinning inconsistency during PR #1333 review, specifically asking to note the implications of breaking changes in 9.x versions. Issue #1343 was created with comprehensive problem description covering inconsistent pinning style across all Python 3.12 runtime images, detailed breaking changes analysis (kernel protocol, session management, connection security, API changes, async/await modifications), reproducibility and security impact assessment, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-14T15:36:29.815Z
Learning: ROCm is not supported by upstream TensorFlow, unlike CUDA. ROCm support requires the specialized tensorflow_rocm package, which creates a dependency challenge for Python 3.12 environments since tensorflow_rocm lacks Python 3.12 wheels while regular TensorFlow lacks ROCm support.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: The jupyter-bokeh package was previously pinned to version 3.0.5 in the TrustyAI notebook image due to compatibility requirements with TrustyAI components, as indicated by the comment "Should be pinned down to this version in order to be compatible with trustyai" that was removed in this update.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:1-3
Timestamp: 2025-07-03T12:07:19.365Z
Learning: jiridanek consistently requests GitHub issue creation for technical improvements identified during code reviews in opendatahub-io/notebooks, ensuring systematic tracking of code quality enhancements like shell script portability issues with comprehensive descriptions, solution options, and acceptance criteria.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/utils/process.sh:17-19
Timestamp: 2025-07-03T14:00:00.909Z
Learning: jiridanek efficiently identifies when CodeRabbit review suggestions are already covered by existing comprehensive issues, demonstrating excellent issue management and avoiding duplicate tracking of the same improvements across multiple locations.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1151
File: jupyter/tensorflow/ubi9-python-3.12/test/test_notebook.ipynb:31-34
Timestamp: 2025-07-01T07:03:05.385Z
Learning: jiridanek demonstrates excellent pattern recognition for identifying duplicated code issues across the opendatahub-io/notebooks repository. When spotting a potential problem in test notebooks, he correctly assesses that such patterns are likely replicated across multiple similar files rather than being isolated incidents, leading to more effective systematic solutions.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:56-66
Timestamp: 2025-07-02T18:19:49.397Z
Learning: jiridanek consistently creates comprehensive follow-up GitHub issues for security concerns raised during PR reviews in opendatahub-io/notebooks, ensuring systematic tracking and resolution of supply-chain security improvements like GPG signature verification for package repositories.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/test/test_notebook.ipynb:22-29
Timestamp: 2025-07-02T18:27:51.097Z
Learning: jiridanek consistently creates comprehensive follow-up GitHub issues from PR review comments in opendatahub-io/notebooks, turning specific code quality concerns into systematic improvements tracked with proper context, acceptance criteria, and cross-references. Issue #1266 demonstrates this pattern by expanding a specific error handling concern in load_expected_versions() into a repository-wide improvement initiative.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1379
File: .tekton/odh-workbench-jupyter-datascience-cpu-py312-ubi9-push.yaml:14-17
Timestamp: 2025-07-11T11:15:47.424Z
Learning: jiridanek requested GitHub issue creation for CEL filter problem in datascience workbench Tekton pipelines during PR #1379 review. Issue #1383 was successfully created with comprehensive problem description covering both Python 3.11 and 3.12 pipelines incorrectly watching jupyter/minimal directories instead of jupyter/datascience directories, detailed impact analysis of pipeline execution failures, complete solution with before/after code examples, thorough acceptance criteria for path updates and pipeline triggering verification, implementation notes about repository structure alignment, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:112-126
Timestamp: 2025-07-08T13:21:09.150Z
Learning: jiridanek requested GitHub issue creation for notebook linting and formatting improvements during PR #1306 review, specifically to address inconsistent metadata across .ipynb files and implement systematic quality standards. This continues the established pattern of comprehensive issue creation for code quality improvements with detailed problem descriptions, multiple solution options, phased acceptance criteria, and proper context linking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-08-01T14:46:03.215Z
Learning: jiridanek requested GitHub issue creation for two nitpicks during PR #1588 review: comment wording improvement in ROCm TensorFlow Dockerfile and typo fix in Jupyter DataScience Dockerfile stage header. Issues #1589 and #1590 were successfully created with comprehensive problem descriptions, specific file locations and line numbers, clear before/after solutions, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:16:05.131Z
Learning: jiridanek requested GitHub issue creation for RStudio py311 Tekton push pipelines during PR #1379 review. Issue #1384 was successfully created covering two RStudio variants (CPU and CUDA) found in manifests/base/params-latest.env, with comprehensive problem description, implementation requirements following the same pattern as other workbench pipelines, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1325
File: jupyter/pytorch/ubi9-python-3.12/Pipfile:42-42
Timestamp: 2025-07-09T14:22:14.553Z
Learning: jiridanek requested GitHub issue creation for Pipfile.lock verification script implementation during PR #1325 review, specifically to systematize the manual verification process for dependency version consistency across all lock files using jq. Issue #1367 was created with comprehensive problem description covering manual verification challenges, detailed solution with jq-based verification script, enhanced features for CI integration, clear acceptance criteria, implementation areas breakdown, benefits analysis, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
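The jq-based Pipfile.lock verification described in the learning about issue #1367 could be sketched in Python as well — a hypothetical sketch, assuming the standard Pipfile.lock layout (a `default.<package>.version` field holding a `==x.y.z` pin); the function name is made up:

```python
import json
from pathlib import Path
from typing import Iterable

def collect_versions(package: str, lock_files: Iterable[Path]) -> set:
    """Return the set of versions a package is pinned to across Pipfile.lock files."""
    versions = set()
    for lock in lock_files:
        data = json.loads(Path(lock).read_text())
        entry = data.get("default", {}).get(package)
        if entry and "version" in entry:
            versions.add(entry["version"].lstrip("="))  # "==4.2.3" -> "4.2.3"
    return versions

# More than one version in the result signals an inconsistent pin, e.g.:
#   collect_versions("odh-elyra", Path(".").rglob("Pipfile.lock"))
```

A CI job could fail the build whenever the returned set has more than one element.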

Member Author

/kfbuild odh-workbench-jupyter-minimal-cuda-py312-ubi9-on-pull-request


109 changes: 65 additions & 44 deletions codeserver/ubi9-python-3.11/requirements.txt
@@ -403,41 +403,62 @@ kubeflow-training==1.9.3 \
kubernetes==33.1.0; python_version >= '3.6' \
--hash=sha256:544de42b24b64287f7e0aa9513c93cb503f7f40eea39b20f66810011a86eabc5 \
--hash=sha256:f64d829843a54c251061a8e7a14523b521f2dc5c896cf6d65ccf348648a88993
matplotlib==3.10.3; python_version >= '3.10' \
--hash=sha256:0ab1affc11d1f495ab9e6362b8174a25afc19c081ba5b0775ef00533a4236eea \
--hash=sha256:0ef061f74cd488586f552d0c336b2f078d43bc00dc473d2c3e7bfee2272f3fa8 \
--hash=sha256:151d89cb8d33cb23345cd12490c76fd5d18a56581a16d950b48c6ff19bb2ab93 \
--hash=sha256:213fadd6348d106ca7db99e113f1bea1e65e383c3ba76e8556ba4a3054b65ae7 \
--hash=sha256:24853dad5b8c84c8c2390fc31ce4858b6df504156893292ce8092d190ef8151d \
--hash=sha256:2a818d8bdcafa7ed2eed74487fdb071c09c1ae24152d403952adad11fa3c65b4 \
--hash=sha256:2f82d2c5bb7ae93aaaa4cd42aca65d76ce6376f83304fa3a630b569aca274df0 \
--hash=sha256:3ddbba06a6c126e3301c3d272a99dcbe7f6c24c14024e80307ff03791a5f294e \
--hash=sha256:4f23ffe95c5667ef8a2b56eea9b53db7f43910fa4a2d5472ae0f72b64deab4d5 \
--hash=sha256:55e46cbfe1f8586adb34f7587c3e4f7dedc59d5226719faf6cb54fc24f2fd52d \
--hash=sha256:68f7878214d369d7d4215e2a9075fef743be38fa401d32e6020bab2dfabaa566 \
--hash=sha256:6c7818292a5cc372a2dc4c795e5c356942eb8350b98ef913f7fda51fe175ac5d \
--hash=sha256:748302b33ae9326995b238f606e9ed840bf5886ebafcb233775d946aa8107a15 \
--hash=sha256:748ebc3470c253e770b17d8b0557f0aa85cf8c63fd52f1a61af5b27ec0b7ffee \
--hash=sha256:7c5f0283da91e9522bdba4d6583ed9d5521566f63729ffb68334f86d0bb98049 \
--hash=sha256:86ab63d66bbc83fdb6733471d3bff40897c1e9921cba112accd748eee4bce5e4 \
--hash=sha256:8c21ae75651c0231b3ba014b6d5e08fb969c40cdb5a011e33e99ed0c9ea86ecb \
--hash=sha256:9f2efccc8dcf2b86fc4ee849eea5dcaecedd0773b30f47980dc0cbeabf26ec84 \
--hash=sha256:a48f9c08bf7444b5d2391a83e75edb464ccda3c380384b36532a0962593a1751 \
--hash=sha256:a49e39755580b08e30e3620efc659330eac5d6534ab7eae50fa5e31f53ee4e30 \
--hash=sha256:a80fcccbef63302c0efd78042ea3c2436104c5b1a4d3ae20f864593696364ac7 \
--hash=sha256:c0b9849a17bce080a16ebcb80a7b714b5677d0ec32161a2cc0a8e5a6030ae220 \
--hash=sha256:c26dd9834e74d164d06433dc7be5d75a1e9890b926b3e57e74fa446e1a62c3e2 \
--hash=sha256:cb73d8aa75a237457988f9765e4dfe1c0d2453c5ca4eabc897d4309672c8e014 \
--hash=sha256:cf37d8c6ef1a48829443e8ba5227b44236d7fcaf7647caa3178a4ff9f7a5be05 \
--hash=sha256:cf4636203e1190871d3a73664dea03d26fb019b66692cbfd642faafdad6208e8 \
--hash=sha256:d3bec61cb8221f0ca6313889308326e7bb303d0d302c5cc9e523b2f2e6c73deb \
--hash=sha256:d96985d14dc5f4a736bbea4b9de9afaa735f8a0fc2ca75be2fa9e96b2097369d \
--hash=sha256:dbed9917b44070e55640bd13419de83b4c918e52d97561544814ba463811cbc7 \
--hash=sha256:ed70453fd99733293ace1aec568255bc51c6361cb0da94fa5ebf0649fdb2150a \
--hash=sha256:eef6ed6c03717083bc6d69c2d7ee8624205c29a8e6ea5a31cd3492ecdbaee1e1 \
--hash=sha256:f6929fc618cb6db9cb75086f73b3219bbb25920cb24cee2ea7a12b04971a4158 \
--hash=sha256:fd5641a9bb9d55f4dd2afe897a53b537c834b9012684c8444cc105895c8c16fd \
--hash=sha256:fdfa07c0ec58035242bc8b2c8aae37037c9a886370eef6850703d7583e19964b
matplotlib==3.10.5; python_version >= '3.10' \
--hash=sha256:00b6feadc28a08bd3c65b2894f56cf3c94fc8f7adcbc6ab4516ae1e8ed8f62e2 \
--hash=sha256:07442d2692c9bd1cceaa4afb4bbe5b57b98a7599de4dabfcca92d3eea70f9ebe \
--hash=sha256:080c3676a56b8ee1c762bcf8fca3fe709daa1ee23e6ef06ad9f3fc17332f2d2a \
--hash=sha256:160e125da27a749481eaddc0627962990f6029811dbeae23881833a011a0907f \
--hash=sha256:1f5f3ec4c191253c5f2b7c07096a142c6a1c024d9f738247bfc8e3f9643fc975 \
--hash=sha256:1fc0d2a3241cdcb9daaca279204a3351ce9df3c0e7e621c7e04ec28aaacaca30 \
--hash=sha256:1ff10ea43288f0c8bab608a305dc6c918cc729d429c31dcbbecde3b9f4d5b569 \
--hash=sha256:21a95b9bf408178d372814de7baacd61c712a62cae560b5e6f35d791776f6516 \
--hash=sha256:27f52634315e96b1debbfdc5c416592edcd9c4221bc2f520fd39c33db5d9f202 \
--hash=sha256:2efaf97d72629e74252e0b5e3c46813e9eeaa94e011ecf8084a971a31a97f40b \
--hash=sha256:33775bbeb75528555a15ac29396940128ef5613cf9a2d31fb1bfd18b3c0c0903 \
--hash=sha256:352ed6ccfb7998a00881692f38b4ca083c691d3e275b4145423704c34c909076 \
--hash=sha256:354204db3f7d5caaa10e5de74549ef6a05a4550fdd1c8f831ab9bca81efd39ed \
--hash=sha256:3967424121d3a46705c9fa9bdb0931de3228f13f73d7bb03c999c88343a89d89 \
--hash=sha256:3b80eb8621331449fc519541a7461987f10afa4f9cfd91afcd2276ebe19bd56c \
--hash=sha256:47a388908e469d6ca2a6015858fa924e0e8a2345a37125948d8e93a91c47933e \
--hash=sha256:48fe6d47380b68a37ccfcc94f009530e84d41f71f5dae7eda7c4a5a84aa0a674 \
--hash=sha256:4b4984d5064a35b6f66d2c11d668565f4389b1119cc64db7a4c1725bc11adffc \
--hash=sha256:4fa40a8f98428f789a9dcacd625f59b7bc4e3ef6c8c7c80187a7a709475cf592 \
--hash=sha256:525f6e28c485c769d1f07935b660c864de41c37fd716bfa64158ea646f7084bb \
--hash=sha256:52c6573dfcb7726a9907b482cd5b92e6b5499b284ffacb04ffbfe06b3e568124 \
--hash=sha256:56da3b102cf6da2776fef3e71cd96fcf22103a13594a18ac9a9b31314e0be154 \
--hash=sha256:5d4773a6d1c106ca05cb5a5515d277a6bb96ed09e5c8fab6b7741b8fcaa62c8f \
--hash=sha256:64c4535419d5617f7363dad171a5a59963308e0f3f813c4bed6c9e6e2c131512 \
--hash=sha256:6c49465bf689c4d59d174d0c7795fb42a21d4244d11d70e52b8011987367ac61 \
--hash=sha256:707f9c292c4cd4716f19ab8a1f93f26598222cd931e0cd98fbbb1c5994bf7667 \
--hash=sha256:77fab633e94b9da60512d4fa0213daeb76d5a7b05156840c4fd0399b4b818837 \
--hash=sha256:7e44cada61bec8833c106547786814dd4a266c1b2964fd25daa3804f1b8d4467 \
--hash=sha256:8a8da0453a7fd8e3da114234ba70c5ba9ef0e98f190309ddfde0f089accd46ea \
--hash=sha256:8b6b49167d208358983ce26e43aa4196073b4702858670f2eb111f9a10652b4b \
--hash=sha256:8dee65cb1424b7dc982fe87895b5613d4e691cc57117e8af840da0148ca6c1d7 \
--hash=sha256:903352681b59f3efbf4546985142a9686ea1d616bb054b09a537a06e4b892ccf \
--hash=sha256:94986a242747a0605cb3ff1cb98691c736f28a59f8ffe5175acaeb7397c49a5a \
--hash=sha256:95672a5d628b44207aab91ec20bf59c26da99de12b88f7e0b1fb0a84a86ff959 \
--hash=sha256:96ef8f5a3696f20f55597ffa91c28e2e73088df25c555f8d4754931515512715 \
--hash=sha256:97b9d6443419085950ee4a5b1ee08c363e5c43d7176e55513479e53669e88468 \
--hash=sha256:a17e57e33de901d221a07af32c08870ed4528db0b6059dce7d7e65c1122d4bea \
--hash=sha256:a23193db2e9d64ece69cac0c8231849db7dd77ce59c7b89948cf9d0ce655a3ce \
--hash=sha256:a277033048ab22d34f88a3c5243938cef776493f6201a8742ed5f8b553201343 \
--hash=sha256:a41bcb6e2c8e79dc99c5511ae6f7787d2fb52efd3d805fff06d5d4f667db16b2 \
--hash=sha256:a6b310f95e1102a8c7c817ef17b60ee5d1851b8c71b63d9286b66b177963039e \
--hash=sha256:ac3d50760394d78a3c9be6b28318fe22b494c4fcf6407e8fd4794b538251899b \
--hash=sha256:b072aac0c3ad563a2b3318124756cb6112157017f7431626600ecbe890df57a1 \
--hash=sha256:b5fa2e941f77eb579005fb804026f9d0a1082276118d01cc6051d0d9626eaa7f \
--hash=sha256:ba6c3c9c067b83481d647af88b4e441d532acdb5ef22178a14935b0b881188f4 \
--hash=sha256:c04cba0f93d40e45b3c187c6c52c17f24535b27d545f757a2fffebc06c12b98b \
--hash=sha256:c61333a8e5e6240e73769d5826b9a31d8b22df76c0778f8480baf1b4b01c9420 \
--hash=sha256:ceefe5d40807d29a66ae916c6a3915d60ef9f028ce1927b84e727be91d884369 \
--hash=sha256:d52fd5b684d541b5a51fb276b2b97b010c75bee9aa392f96b4a07aeb491e33c7 \
--hash=sha256:dc88af74e7ba27de6cbe6faee916024ea35d895ed3d61ef6f58c4ce97da7185a \
--hash=sha256:dcfc39c452c6a9f9028d3e44d2d721484f665304857188124b505b2c95e1eecf \
--hash=sha256:e4a6470a118a2e93022ecc7d3bd16b3114b2004ea2bf014fff875b3bc99b70c6 \
--hash=sha256:ee7a09ae2f4676276f5a65bd9f2bd91b4f9fbaedf49f40267ce3f9b448de501f \
--hash=sha256:ee98a5c5344dc7f48dc261b6ba5d9900c008fc12beb3fa6ebda81273602cc389 \
--hash=sha256:f6adb644c9d040ffb0d3434e440490a66cf73dbfa118a6f79cd7568431f7a012
matplotlib-inline==0.1.7; python_version >= '3.8' \
--hash=sha256:8423b23ec666be3d16e16b60bdd8ac4e86e840ebd1dd11a30b9f117f2fa0ab90 \
--hash=sha256:df192d39a4ff8f21b1895d72e6a13f5fcc5099f00fa84384e0ea28c2cc0653ca
@@ -747,15 +768,15 @@ ptyprocess==0.7.0 \
pure-eval==0.2.3 \
--hash=sha256:1db8e35b67b3d218d818ae653e27f06c3aa420901fa7b081ca98cbedc874e0d0 \
--hash=sha256:5f4e983f40564c576c7c8635ae88db5956bb2229d7e9237d03b3c0b0190eaf42
py-spy==0.4.0 \
--hash=sha256:47cdda4c34d9b6cb01f3aaeceb2e88faf57da880207fe72ff6ff97e9bb6cc8a9 \
--hash=sha256:77d8f637ade38367d944874776f45b703b7ac5938b1f7be8891f3a5876ddbb96 \
--hash=sha256:806602ce7972782cc9c1e383f339bfc27bfb822d42485e6a3e0530ae5040e1f0 \
--hash=sha256:87573e64dbfdfc89ba2e0f5e2f525aa84e0299c7eb6454b47ea335fde583a7a0 \
--hash=sha256:8bf2f3702cef367a489faa45177b41a6c31b2a3e5bd78c978d44e29340152f5a \
--hash=sha256:c5f06ffce4c9c98b7fc9f5e67e5e7db591173f1351837633f3f23d9378b1d18a \
--hash=sha256:eee3d0bde85ca5cf4f01f012d461180ca76c24835a96f7b5c4ded64eb6a008ab \
--hash=sha256:f2cf3f7130e7d780471faa5957441d3b4e0ec39a79b2c00f4c33d494f7728428
py-spy==0.4.1 \
--hash=sha256:1fb8bf71ab8df95a95cc387deed6552934c50feef2cf6456bc06692a5508fd0c \
--hash=sha256:4972c21890b6814017e39ac233c22572c4a61fd874524ebc5ccab0f2237aee0a \
--hash=sha256:532d3525538254d1859b49de1fbe9744df6b8865657c9f0e444bf36ce3f19226 \
--hash=sha256:6a80ec05eb8a6883863a367c6a4d4f2d57de68466f7956b6367d4edd5c61bb29 \
--hash=sha256:809094208c6256c8f4ccadd31e9a513fe2429253f48e20066879239ba12cd8cc \
--hash=sha256:d92e522bd40e9bf7d87c204033ce5bb5c828fca45fa28d970f58d71128069fdc \
--hash=sha256:e53aa53daa2e47c2eef97dd2455b47bb3a7e7f962796a86cc3e7dbde8e6f4db4 \
--hash=sha256:ee776b9d512a011d1ad3907ed53ae32ce2f3d9ff3e1782236554e22103b5c084
pyasn1==0.6.1; python_version >= '3.8' \
--hash=sha256:0d632f46f2ba09143da3a8afe9e33fb6f92fa2320ab7e886e2d0f7672af84629 \
--hash=sha256:6f580d2bdd84365380830acf45550f2511469f673cb4a5ae3857a3170128b034