common: Include torch package for s390x #13699
Conversation
Signed-off-by: Aaron Teo <[email protected]>
Signed-off-by: Aaron Teo <[email protected]>
Pull Request Overview
This pull request aims to address the torch package installation issue on s390x architectures by conditionally installing the torch package from the nightly build.
- Split the torch dependency based on architecture using environment markers.
- Added an extra-index-url for nightly torch builds for s390x in the hf, hf-update, and lora conversion requirements files.
Reviewed Changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated 5 comments.
| File | Description |
|---|---|
| requirements/requirements-convert_lora_to_gguf.txt | Added comment and nightly extra-index-url for s390x packages. |
| requirements/requirements-convert_hf_to_gguf_update.txt | Conditioned torch version requirements and extra-index-url for s390x to use the nightly build. |
| requirements/requirements-convert_hf_to_gguf.txt | Mirrored the changes in torch dependency handling for s390x as in the update file. |
requirements/requirements-convert_lora_to_gguf.txt:

```diff
@@ -1,2 +1,4 @@
 -r ./requirements-convert_hf_to_gguf.txt
 --extra-index-url https://download.pytorch.org/whl/cpu
+# torch s390x packages can only be found from nightly builds
```
Copilot AI commented on May 22, 2025
Consider expanding this comment to clarify that the nightly extra-index-url is added for transitive dependency resolution on the s390x architecture, even though the torch requirement is not directly specified in this file.
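One possible wording for such an expanded comment (purely illustrative, not text from the PR) might be:

```
# torch is not listed directly in this file, but the nightly extra-index-url
# is needed so the transitive torch dependency can be resolved on s390x,
# where only nightly wheels are published
--extra-index-url https://download.pytorch.org/whl/nightly
```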
```diff
 -r ./requirements-convert_legacy_llama.txt
 --extra-index-url https://download.pytorch.org/whl/cpu
-torch~=2.2.1
+torch~=2.2.1; platform_machine != "s390x"
```
Copilot AI commented on May 22, 2025
[nitpick] It may be helpful to briefly note in an inline comment why the torch version is pinned with '~=' for non-s390x platforms, for better clarity on the dependency strategy.
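As an illustration (hypothetical comment text, not part of the actual change), such an inline note could look like:

```
# '~=2.2.1' allows compatible patch releases (>=2.2.1, <2.3) on platforms with stable wheels
torch~=2.2.1; platform_machine != "s390x"
```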
```diff
+# torch s390x packages can only be found from nightly builds
+--extra-index-url https://download.pytorch.org/whl/nightly
+torch>=0.0.0.dev0; platform_machine == "s390x"
```
Copilot AI commented on May 22, 2025
[nitpick] If a stable torch version becomes available for s390x in the future, consider updating this nightly dependency to a pinned version to avoid unexpected updates.
Suggested change:

```diff
-torch>=0.0.0.dev0; platform_machine == "s390x"
+torch>=0.0.0.dev0; platform_machine == "s390x" # TODO: Update to a pinned stable version when available
```
```diff
 -r ./requirements-convert_legacy_llama.txt
 --extra-index-url https://download.pytorch.org/whl/cpu
-torch~=2.2.1
+torch~=2.2.1; platform_machine != "s390x"
```
Copilot AI commented on May 22, 2025
[nitpick] Similarly, a brief inline comment explaining the rationale for the conditional torch dependency on non-s390x platforms could improve clarity for future maintainers.
```diff
+# torch s390x packages can only be found from nightly builds
+--extra-index-url https://download.pytorch.org/whl/nightly
+torch>=0.0.0.dev0; platform_machine == "s390x"
```
Copilot AI commented on May 22, 2025
[nitpick] Consider reviewing and possibly pinning the nightly torch version for s390x if a more stable version becomes available, to ensure predictable dependency behavior.
This pull request introduces the `torch` package into `requirements.txt` for the s390x architecture.

Problem
Currently, the `torch` package for the s390x architecture cannot be found in PyTorch's stable builds; only the nightly builds include wheels for s390x. As a result, trying to install packages using `requirements.txt` would fail, and conda would have to be used instead to resolve the `torch` dependency. Additionally, going the conda route would require further manual patches, as documented in my blog here: https://medium.com/@taronaeo/running-any-llm-on-mainframes-s390x-d5acc56ea649#:~:text=Step%205%3A%20Install%20Required%20Python%20Packages
Resolution
This PR segregates the `torch~=2.2.1` requirement from all other platforms by using environment markers in the requirement specifiers, and installs `torch` from the nightly build only for the s390x architecture. This effectively removes the friction of installing packages from `requirements.txt` on the s390x architecture.
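Concretely, the resulting requirement lines (a consolidated sketch assembled from the diffs above, not a verbatim copy of any single file) look roughly like this:

```
--extra-index-url https://download.pytorch.org/whl/cpu
torch~=2.2.1; platform_machine != "s390x"

# torch s390x packages can only be found from nightly builds
--extra-index-url https://download.pytorch.org/whl/nightly
torch>=0.0.0.dev0; platform_machine == "s390x"
```

pip evaluates the `platform_machine` environment marker at install time, so only one of the two `torch` requirements applies on any given machine, and the extra index URLs are searched in addition to PyPI.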
Verification

To ensure that this PR did not break anything, the `pip3 install -r requirements.txt` command has been tested on the following machines:

Note
Tests were conducted on an IBM z15 Mainframe with 16 IFLs (cores) and 160 GB Memory on an LPAR.
Please review this pull request and consider merging into the main repository. Thank you!