[wip] [core] introduce pipeline-specific mixins to hold common methods #12322
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
DN6 left a comment:
Looking good 👍🏽 Still on the fence about whether we use multiple levels of inheritance with `QwenImageMixin`. But other than that this looks good to me.
Perhaps we could try to do the same with Flux and SDXL to see if some common patterns show up for these Mixins.
With you. I am also on the fence. The mixins for QwenImage and QwenImageEdit primarily differ in `encode_prompt()`. But we can definitely uniformize the rest.
Yes, I will work on that.
Review thread on:

    return split_result

    def _get_qwen_prompt_embeds(
Decided to unify it to further reduce LOC. It's a private method anyway, so we should be safe.
Review thread on:

    class QwenImagePipelineMixin(QwenImageMixin):
        def encode_prompt(
Only `encode_prompt()` differs, so as not to introduce any breaking change.
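To make the discussion in this thread concrete, here is a minimal sketch of the arrangement: the shared base mixin unifies the private helper, while each task-level mixin keeps its own `encode_prompt()` signature. `QwenImageEditPipelineMixin` and all method bodies are illustrative placeholders, not the actual diffusers code.

```python
class QwenImageMixin:
    def _get_qwen_prompt_embeds(self, prompt, image=None):
        # unified private helper; `image` is only used by the Edit variant
        return (prompt, image)


class QwenImagePipelineMixin(QwenImageMixin):
    def encode_prompt(self, prompt):
        # text-only public signature stays unchanged
        return self._get_qwen_prompt_embeds(prompt)


class QwenImageEditPipelineMixin(QwenImageMixin):
    def encode_prompt(self, prompt, image):
        # edit variant additionally takes an image
        return self._get_qwen_prompt_embeds(prompt, image)
```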
@DN6 pushed updates for Flux. LMK what you think.
What does this PR do?
As discussed internally, this PR introduces pipeline-specific Mixin classes that hold common methods shared across different task-specific pipelines. The actual pipelines (such as `StableDiffusionPipeline`, `QwenImagePipeline`, `QwenImageImg2ImgPipeline`, etc.) would then subclass these mixins. Examples of such common methods vary from pipeline to pipeline, but some examples include `encode_prompt()` and properties such as `guidance_scale`.

Additionally, we have a couple of methods like `retrieve_latents()` and `retrieve_timesteps()` that originated in `stable_diffusion` and were copied over to other pipelines that use them. The same goes for `calculate_shift`, which originated in `flux`. This PR also considers this situation and follows a reasonable approach to treat it (more below).

I know we also discussed moving the specific LoraLoaderMixins to their respective pipeline modules. The PR only does that for Stable Diffusion to gather feedback. Please keep reading.
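As a rough illustration of the mixin pattern described above (simplified placeholders, not the actual diffusers implementations):

```python
class QwenImageMixin:
    """Holds methods and properties shared by all QwenImage task pipelines."""

    @property
    def guidance_scale(self):
        return self._guidance_scale

    def _get_qwen_prompt_embeds(self, prompt):
        # stand-in for the unified private text-encoding helper
        return f"embeds({prompt})"


class QwenImagePipeline(QwenImageMixin):
    """A task-specific pipeline only adds what is unique to it."""

    def __init__(self, guidance_scale=3.5):
        self._guidance_scale = guidance_scale

    def encode_prompt(self, prompt):
        return self._get_qwen_prompt_embeds(prompt)
```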
Note
This PR introduces Mixin classes for four popular pipelines: Flux, Qwen, SDXL, and SD. Once the PR is merged, we could work with the community to do this for other influential pipelines.
Guiding principles
A sample structure for `src/diffusers/pipelines/flux`: `pipeline_flux_utils.py` here has the Mixin class, holding the common methods. It additionally has the `calculate_shift()` method, which is imported by the other pipelines under `src/diffusers/pipelines/flux`. Those pipelines won't have to maintain `# Copied from ...` versions of the `calculate_shift()` method anymore. They can just import it like `from .pipeline_flux_utils import calculate_shift`. This is how we reduce LOCs further, on an intra-pipeline basis.

However, if a new pipeline (other than Flux) wants to leverage `calculate_shift()`, it should still use a `# Copied from ...` version of `calculate_shift()` instead of a direct import. This way, we can let individual pipeline-level methods evolve independently without introducing complex dependency patterns. For example, `pipeline_flux_utils.py` uses the `# Copied from ...` versions of methods like `retrieve_latents()`.
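A minimal sketch contrasting the two reuse rules. The function body and defaults are simplified stand-ins, not the real diffusers helper; the paths in the comments follow the PR description, and the copy is renamed `calculate_shift_copy` only so both fit in one file.

```python
# --- src/diffusers/pipelines/flux/pipeline_flux_utils.py ---
def calculate_shift(image_seq_len, base_seq_len=256, max_seq_len=4096,
                    base_shift=0.5, max_shift=1.15):
    # linearly interpolate the shift between base and max sequence lengths
    m = (max_shift - base_shift) / (max_seq_len - base_seq_len)
    return image_seq_len * m + base_shift - base_seq_len * m


# --- another pipeline under pipelines/flux: a direct import, no duplication ---
# from .pipeline_flux_utils import calculate_shift

# --- a pipeline OUTSIDE pipelines/flux keeps its own copy instead, so the
# two can evolve independently ---
# Copied from diffusers.pipelines.flux.pipeline_flux_utils.calculate_shift
def calculate_shift_copy(image_seq_len, base_seq_len=256, max_seq_len=4096,
                         base_shift=0.5, max_shift=1.15):
    m = (max_shift - base_shift) / (max_seq_len - base_seq_len)
    return image_seq_len * m + base_shift - base_seq_len * m
```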
LoRA

I experimented with moving `StableDiffusionLoraLoaderMixin` to `src/diffusers/pipelines/stable_diffusion/lora_utils.py`. However, there are a couple of derivative SD pipelines that subclass `StableDiffusionLoraLoaderMixin` (imported from `lora_pipeline.py`). So, that will trigger deprecation warnings for those. Two approaches come to mind:

1. Import it directly: `from ..pipelines.stable_diffusion.lora_utils import StableDiffusionLoraLoaderMixin`. I don't like it at all because the dependency pattern is bad.
2. Keep a `# Copied from ...` version of `StableDiffusionLoraLoaderMixin` inside a `lora_utils.py` in the respective pipeline module (`src/diffusers/pipelines/animatediff`, for example). I prefer this one.

WDYT?
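A sketch of the second option, where each pipeline module carries its own copy of the loader mixin instead of importing across pipeline modules. `AnimateDiffLoraLoaderMixin` and all method bodies are hypothetical placeholders, not the real loader code.

```python
# --- src/diffusers/pipelines/stable_diffusion/lora_utils.py (sketch) ---
class StableDiffusionLoraLoaderMixin:
    lora_base = "unet"

    def load_lora_weights(self, state_dict):
        # placeholder: the real method merges LoRA weights into the model
        return {"loaded_into": self.lora_base, "num_keys": len(state_dict)}


# --- src/diffusers/pipelines/animatediff/lora_utils.py (sketch) ---
# Copied from diffusers.pipelines.stable_diffusion.lora_utils.StableDiffusionLoraLoaderMixin
class AnimateDiffLoraLoaderMixin:
    lora_base = "unet"

    def load_lora_weights(self, state_dict):
        # identical copy kept locally so the module can evolve independently
        return {"loaded_into": self.lora_base, "num_keys": len(state_dict)}
```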
Also, if we decided to deprecate `StableDiffusionLoraLoaderMixin` from `diffusers.loaders.lora_pipeline`, it would lead to a circular import problem:

https://github.com/huggingface/diffusers/actions/runs/20021061502/job/57407882632?pr=12322#step:16:80

So, if we decided to go this route of separating the LoRA loaders, it might have to be a breaking change.
Notes

- `encode_prompt()` varies across the task mixins, while the `_get_qwen_prompt_embeds()` method is unified in `QwenImageMixin`. Comments are inline.
- `prepare_image()` is shared by `QwenImageControlNetInpaintPipeline` and `QwenImageControlNetPipeline` but not others. We could move it to `pipeline_qwen_utils.py`, but I am not strongly opinionated on this. The same applies to methods like `prepare_mask_latents()`. My preference here is to keep them as is for now.

We can apply similar principles to Flux, too. The Flux Control and ControlNet pipelines have a `prepare_image()` method. So, we could have something like `FluxControlMixin(FluxMixin)` and include `prepare_image()` there. But I felt like it was getting complex, so I decided to skip it.

Additional questions
We have `StableDiffusionMixin`:

diffusers/src/diffusers/pipelines/pipeline_utils.py
Line 2175 in 6bf668c

This PR also has a similar Mixin class, but without the methods from the above Mixin. To avoid confusion, I have named it `SDMixin`.

WDYT about collating the two classes (`StableDiffusionMixin` and the `SDMixin` introduced in this PR) into one?