Add torch device to list of offloadable types#1348

Merged
dsikka merged 2 commits into main from kylesayrs/torch-device-offload-cache
Apr 15, 2025
Conversation

@kylesayrs
Collaborator

Purpose

  • Squelch warning about torch device not being offloadable
llm-compressor/src/llmcompressor/pipelines/cache.py:168: UserWarning: Offloading not implemented for type <class 'torch.device'>.
  warnings.warn(f"Offloading not implemented for type {type(value)}.")
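The pattern behind the fix can be sketched as follows. This is a hypothetical illustration, not the actual llm-compressor cache.py implementation: the idea is to add torch.device to the set of types that are passed through without the "offloading not implemented" warning.

```python
import warnings

# Types whose values are cheap to keep as-is and need no offloading.
# (Illustrative list; the real set lives in llm-compressor's cache logic.)
PASSTHROUGH_TYPES = (int, float, str, bool, type(None))

try:
    import torch
    # The fix in this PR: treat torch.device as offloadable/ignorable,
    # so it no longer triggers the UserWarning.
    PASSTHROUGH_TYPES = PASSTHROUGH_TYPES + (torch.device,)
except ImportError:
    pass  # torch not installed; sketch still runs

def offload_value(value):
    """Return the value unchanged if its type needs no offloading;
    otherwise warn that offloading is unimplemented for it."""
    if isinstance(value, PASSTHROUGH_TYPES):
        return value
    warnings.warn(f"Offloading not implemented for type {type(value)}.")
    return value
```

With this change, caching a `torch.device("cuda:0")` value passes through silently instead of emitting the UserWarning quoted above.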

Signed-off-by: Kyle Sayers <kylesayrs@gmail.com>
@github-actions

👋 Hi! Thank you for contributing to llm-compressor. Please add the ready label when the PR is ready for review.

Note: This is required to complete the testing suite, please only add the label once the PR is code complete and local testing has been performed.

@kylesayrs changed the title from "Add torch device to list of ignored types" to "Add torch device to list of offloadable types" on Apr 14, 2025
Collaborator

@shanjiaz left a comment

Looks good to me!

@kylesayrs added the "ready" label (When a PR is ready for review) on Apr 14, 2025
@dsikka enabled auto-merge (squash) April 14, 2025 19:46
@dsikka disabled auto-merge April 15, 2025 00:32
@dsikka merged commit 42b62f5 into main Apr 15, 2025
9 of 10 checks passed
@dsikka deleted the kylesayrs/torch-device-offload-cache branch April 15, 2025 00:32
Labels

ready When a PR is ready for review


5 participants