Updated dependencies in Windows examples #622
Conversation
Signed-off-by: Hrishith Thadicherla <[email protected]>
Codecov Report: ✅ All modified and coverable lines are covered by tests.

Coverage Diff (main vs. #622):

| | main | #622 |
| --- | --- | --- |
| Coverage | 74.64% | 74.64% |
| Files | 183 | 183 |
| Lines | 18542 | 18542 |
| Hits | 13840 | 13840 |
| Misses | 4702 | 4702 |
torch==2.7.0+cu128
torchaudio==2.7.0+cu128
Can we update these torch dependencies to 2.9 as well?
The reason we kept these torch dependencies at 2.7 is that torchaudio >= 2.8 requires FFmpeg binaries, which cannot be installed by pip and have to be installed manually. And since torchaudio >= 2.8 requires torch >= 2.8, we cannot upgrade torch either.
I have documented the steps for upgrading the torch dependencies in the README, though.
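For readers following the thread, here is a rough sketch of what such an upgrade could look like on Windows. It is not the README content from this PR; the version pins, CUDA index URL, and FFmpeg source below are assumptions for illustration only.

```sh
# 1) torchaudio >= 2.8 no longer ships FFmpeg via pip, so install it manually:
#    download a *shared* FFmpeg build (e.g. from https://www.gyan.dev/ffmpeg/builds/)
#    and add its bin\ directory to PATH so the DLLs are found at runtime.

# 2) Upgrade torch and torchaudio together, since torchaudio >= 2.8 requires torch >= 2.8.
#    The cu128 index URL is only an example; use the index matching your CUDA version.
pip install --upgrade "torch>=2.8" "torchaudio>=2.8" --index-url https://download.pytorch.org/whl/cu128
```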
onnx==1.18.0
onnxruntime-gpu==1.20.1
Can we use onnx 1.19 and ort 1.23?
I don't think there should be an issue upgrading onnx to 1.19, but I need to test the scripts with the newer onnxruntime-gpu. Will update in a comment if it works.
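As a quick sanity check while testing (this is not the project's test procedure, just a generic verification), one can confirm that the upgraded onnxruntime-gpu wheel actually exposes the CUDA execution provider:

```python
# Verify that onnx imports and that onnxruntime-gpu can see the GPU provider.
import onnx
import onnxruntime as ort

print("onnx:", onnx.__version__)
print("onnxruntime:", ort.__version__)
print("providers:", ort.get_available_providers())

# A working GPU build should list "CUDAExecutionProvider"; if only
# "CPUExecutionProvider" appears, the CUDA/cuDNN libraries the wheel expects
# are probably missing or mismatched.
assert "CUDAExecutionProvider" in ort.get_available_providers()
```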
If we no longer use ort-dml for Windows, please update the requirements in setup.py as well: https://github.com/NVIDIA/TensorRT-Model-Optimizer/blob/main/setup.py#L53
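A minimal sketch of the kind of change being requested, assuming the Windows ONNX runtime is declared in an extras-style dependency dict; the key, versions, and markers below are placeholders, not the repository's actual setup.py contents:

```python
# Hypothetical extras entry: drop the DirectML runtime on Windows in favor of the CUDA one.
optional_deps = {
    "onnx": [
        "onnx~=1.19.0",
        # previously something like: "onnxruntime-directml==1.20.1; platform_system == 'Windows'"
        "onnxruntime-gpu~=1.23.0; platform_system == 'Windows'",
    ],
}
```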
What does this PR do?
Type of change: Bug fix
Overview: Updated torch and transformers to their latest versions for the standard quantization examples.
For Whisper quantization, updated the README with steps to install and enable the latest torch and torchaudio.
Testing
Tested quantization and the MMLU benchmark with the updated torch and transformers versions; everything worked as expected.