[Easy?] Numpy Version Pin Bump: == 2.0 #1472
Closed
Labels
- ExecuTorch: Issues related to ExecuTorch installation, export, or build. Mobile uses separate tags
- Known Gaps: These are known Gaps/Issues/Bug items in torchchat
- actionable: Items in the backlog waiting for an appropriate impl/fix
- good first issue: Good for newcomers
- triaged: This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
Description
🐛 Describe the bug
ExecuTorch recently bumped its numpy requirement to numpy == 2.0 in pytorch/executorch@a7b5297.
This puts torchchat in a finicky spot, since the current requirement is pinned below 2.0 because GGUF support requires numpy < 2.0 (see the blame for previous attempts):
torchchat/install/requirements.txt (lines 19 to 20 at fb65b8b):

    # numpy version range required by GGUF util
    numpy >= 1.17, < 2.0
While not an active issue today, this will become a hard blocker as soon as an ExecuTorch pin bump is required.
Task: Make a requirements version pin bump: ~~numpy >= 2.0~~ numpy > 1.17 (i.e. drop the < 2.0 cap).
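A minimal sketch of what the bumped pin in install/requirements.txt could look like, assuming the GGUF path works under numpy 2.x (the comment is carried over from the current file; the bound shown is the `numpy > 1.17` variant from the task above, not a confirmed final value):

```
# numpy version range required by GGUF util
# upper cap removed so ExecuTorch's numpy == 2.0 requirement can be satisfied
numpy > 1.17
```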
- Considerations:
- 1/23: This may "just work" without additional effort; llama.cpp appears to have fixed this in December ([gguf-py] gguf_reader: numpy 2 newbyteorder fix, ggml-org/llama.cpp#9772); see the sketch after this list for the nature of that fix.
- Is updating the GGUF support within torchchat's control (i.e. propagating dependencies)? If not, who/what needs coercing?
- Should GGUF support be iceboxed instead?
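For context, the llama.cpp fix referenced above addresses numpy 2.0's removal of the ndarray.newbyteorder() method, which gguf-py's reader relied on. A hedged illustration of the pattern (not torchchat or gguf-py code; the helper name is hypothetical):

```python
import numpy as np

def swap_byteorder_view(arr: np.ndarray) -> np.ndarray:
    """Return `arr` reinterpreted with the opposite byte order.

    numpy < 2.0 exposed this as `arr.newbyteorder()`; numpy 2.0 removed the
    ndarray method, so the byte order has to be swapped via the dtype instead.
    """
    # numpy < 2.0 only:
    #   return arr.newbyteorder()
    # numpy 2.x-compatible equivalent: flip the byte order on the dtype and
    # re-view the same buffer with it (no data is copied or byte-swapped).
    return arr.view(arr.dtype.newbyteorder())

big_endian = np.arange(4, dtype=">u4")        # e.g. a big-endian field read from a GGUF file
print(swap_byteorder_view(big_endian).dtype)  # little-endian uint32 view of the same bytes
```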
Versions
N/A