[LoRA] make set_adapters() robust on silent failures. #9618
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
```python
@unittest.skip("Not supported in CogVideoX.")
def test_simple_inference_with_text_denoiser_multi_adapter_block_lora(self):
    pass
```
Now that we're catching the error appropriately within the code, we should skip this test for unsupported models.
```python
def test_modify_padding_mode(self):
    pass

@unittest.skip("Not supported in Flux.")
```
```python
gc.collect()
torch.cuda.empty_cache()

@is_flaky
```
There's absolutely no reason for it to be flaky, but I think it's okay for now.
```python
def test_modify_padding_mode(self):
    pass

@is_flaky
```
Same as #9618 (comment).
Good idea to check this, LGTM. I have some suggestions for the test, please check if they make sense.
Apart from that, I'm wondering if we still need the checks starting here, or whether those are now covered by the newly introduced check and can be removed?
@BenjaminBossan thanks! Good catch. Resolved in 2714043.
@DN6 could you give this a look?
@DN6 a gentle ping.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
@DN6 I have changed from raising an error to raising warnings. Additionally, we're removing the invalid components from the …
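In rough terms, the warn-and-remove behavior could look like the following standalone sketch (this is not the actual diffusers implementation; the helper name, signature, and warning text are made up for illustration):

```python
import logging

logging.basicConfig(level=logging.WARNING)
logger = logging.getLogger(__name__)

def prune_invalid_components(adapter_weights, lora_loadable_components):
    """Hypothetical helper: keep only weight entries for components that
    support LoRA in this pipeline; warn about (rather than raise on) the rest."""
    valid = {}
    for name, scale in adapter_weights.items():
        if name in lora_loadable_components:
            valid[name] = scale
        else:
            logger.warning(
                "Component %r is not LoRA-loadable in this pipeline; "
                "removing it from the adapter weights.",
                name,
            )
    return valid

# On Flux, only `transformer` and `text_encoder` are LoRA-loadable,
# so `unet` and `text_encoder_2` get warned about and dropped:
weights = {"unet": 1.0, "text_encoder_2": 0.5, "transformer": 1.0}
print(prune_invalid_components(weights, {"transformer", "text_encoder"}))
# -> {'transformer': 1.0}
```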
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
@hlky could you give this a look?
What does this PR do?
Currently, if we make a call along the lines of the sketch below,
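(a representative sketch, not the PR's original snippet; the checkpoint, LoRA repository, and adapter name are placeholders, and the weight values are arbitrary)

```python
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
)
pipe.load_lora_weights("some-user/some-flux-lora", adapter_name="my_adapter")

# Flux has no `unet` component, and `text_encoder_2` is not LoRA-loadable,
# so both keys below are invalid for this pipeline.
pipe.set_adapters(
    "my_adapter",
    adapter_weights={"unet": 1.0, "text_encoder_2": 0.5},
)
```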
where `pipe` is an instance of the `FluxPipeline`, it doesn't error out, whereas it should: Flux doesn't have any UNet, and its `text_encoder_2` component isn't LoRA-loadable (diffusers/src/diffusers/loaders/lora_pipeline.py, line 1650 in 31058cd).
Instead, we silently ignore the invalid entries. This PR fixes this behavior.
Thanks to @asomoza for the idea in #9542 (comment)!