
Conversation

@yiyixuxu (Collaborator)

No description provided.

"Accelerate hooks detected. Since you have called `load_lora_weights()`, the previous hooks will be first removed. Then the LoRA parameters will be loaded and the hooks will be applied again."
)
remove_hook_from_module(component, recurse=is_sequential_cpu_offload)
if is_sequential_cpu_offload or is_model_cpu_offload:
@yiyixuxu (Collaborator, Author) commented:

cc @sayakpaul, can you double-check here? I'm not sure why we need to remove the hooks, but since we only add back the sequential CPU offload and model CPU offload hooks (under `if is_model_cpu_offload:`), we should only remove these two types.
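
A minimal sketch of the pattern under discussion, assuming `accelerate` is installed. `CpuOffload`, `AlignDevicesHook`, and `remove_hook_from_module` are real `accelerate.hooks` APIs (`enable_model_cpu_offload()` attaches the former, `enable_sequential_cpu_offload()` the latter), but the helper name and control flow below are illustrative, not the PR's actual diff:

```python
# Illustrative sketch, not the merged change: remove only the two hook types
# that `load_lora_weights()` knows how to re-apply after loading.
import torch.nn as nn
from accelerate.hooks import AlignDevicesHook, CpuOffload, remove_hook_from_module

def _remove_offload_hooks(pipeline):  # hypothetical helper name
    is_model_cpu_offload = False
    is_sequential_cpu_offload = False
    for component in pipeline.components.values():
        if isinstance(component, nn.Module) and hasattr(component, "_hf_hook"):
            is_model_cpu_offload |= isinstance(component._hf_hook, CpuOffload)
            is_sequential_cpu_offload |= isinstance(component._hf_hook, AlignDevicesHook)
            if isinstance(component._hf_hook, (CpuOffload, AlignDevicesHook)):
                # Recurse only for sequential offload, where submodules carry hooks too.
                remove_hook_from_module(
                    component, recurse=isinstance(component._hf_hook, AlignDevicesHook)
                )
    # Callers can use these flags to re-enable the matching offload mode afterwards.
    return is_model_cpu_offload, is_sequential_cpu_offload
```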

@yiyixuxu requested a review from @sayakpaul June 22, 2025 11:09
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@sayakpaul (Member) left a comment:
Great catch! Should we add a test as well (I can tackle that)?

@yiyixuxu (Collaborator, Author)

@sayakpaul sounds good
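
For context, a sketch of the kind of regression test discussed above, not the one that was actually added. The tiny checkpoint ID is one commonly used in diffusers tests; the LoRA path is hypothetical, and `enable_model_cpu_offload()` assumes an accelerator (e.g. CUDA) is available:

```python
# Illustrative test sketch: hooks should survive a load_lora_weights() call.
from diffusers import StableDiffusionPipeline

def test_load_lora_keeps_model_cpu_offload_hooks():
    pipe = StableDiffusionPipeline.from_pretrained(
        "hf-internal-testing/tiny-stable-diffusion-torch"  # tiny test checkpoint
    )
    pipe.enable_model_cpu_offload()  # attaches accelerate CpuOffload hooks
    pipe.load_lora_weights("path/to/lora.safetensors")  # hypothetical path
    # The offload hook should have been removed and re-applied, not dropped.
    assert hasattr(pipe.unet, "_hf_hook")
```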

@sayakpaul (Member)

Feel free to merge then :)

@yiyixuxu merged commit 7bc0a07 into main June 24, 2025 (32 of 33 checks passed)
@yiyixuxu deleted the update-remove-hooks branch June 24, 2025 02:49