Commit 3653dcc

loader
1 parent 8e2e1f1 commit 3653dcc

File tree

2 files changed: +6 −0 lines changed


docs/source/en/api/loaders/lora.md

Lines changed: 5 additions & 0 deletions
```diff
@@ -24,6 +24,7 @@ LoRA is a fast and lightweight training method that inserts and trains a signifi
 - [`SanaLoraLoaderMixin`] provides similar functions for [Sana](https://huggingface.co/docs/diffusers/main/en/api/pipelines/sana).
 - [`HunyuanVideoLoraLoaderMixin`] provides similar functions for [HunyuanVideo](https://huggingface.co/docs/diffusers/main/en/api/pipelines/hunyuan_video).
 - [`Lumina2LoraLoaderMixin`] provides similar functions for [Lumina2](https://huggingface.co/docs/diffusers/main/en/api/pipelines/lumina2).
+- [`HiDreamImageLoraLoaderMixin`] provides similar functions for [HiDream Image](https://huggingface.co/docs/diffusers/main/en/api/pipelines/hidream)
 - [`AmusedLoraLoaderMixin`] is for the [`AmusedPipeline`].
 - [`LoraBaseMixin`] provides a base class with several utility methods to fuse, unfuse, unload, LoRAs and more.
 
@@ -73,6 +74,10 @@ To learn more about how to load LoRA weights, see the [LoRA](../../using-diffuse
 
 [[autodoc]] loaders.lora_pipeline.Lumina2LoraLoaderMixin
 
+## HiDreamImageLoraLoaderMixin
+
+[[autodoc]] loaders.lora_pipeline.HiDreamImageLoraLoaderMixin
+
 ## AmusedLoraLoaderMixin
 
 [[autodoc]] loaders.lora_pipeline.AmusedLoraLoaderMixin
```
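The new docs entry points at the same LoRA surface the other mixins expose (`load_lora_weights`, `fuse_lora`, `unload_lora_weights`). A minimal usage sketch, not part of this commit: it assumes the `HiDream-ai/HiDream-I1-Full` base checkpoint and a placeholder LoRA repo id, and depending on the checkpoint the pipeline may need its text-encoder components passed in explicitly.

```python
# Minimal sketch of the API the new docs section covers; the LoRA repo id
# below is a placeholder, not a real checkpoint.
import torch
from diffusers import HiDreamImagePipeline

# Depending on the checkpoint, extra text-encoder components may need to be
# passed explicitly to from_pretrained().
pipe = HiDreamImagePipeline.from_pretrained(
    "HiDream-ai/HiDream-I1-Full", torch_dtype=torch.bfloat16
)

# load_lora_weights() is provided by HiDreamImageLoraLoaderMixin.
pipe.load_lora_weights("your-username/hidream-lora")  # placeholder repo id

# LoRA weights can also be fused into the base weights, or removed again:
# pipe.fuse_lora()
# pipe.unload_lora_weights()
```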

src/diffusers/loaders/peft.py

Lines changed: 1 addition & 0 deletions
```diff
@@ -55,6 +55,7 @@
     "Lumina2Transformer2DModel": lambda model_cls, weights: weights,
     "WanTransformer3DModel": lambda model_cls, weights: weights,
     "CogView4Transformer2DModel": lambda model_cls, weights: weights,
+    "HiDreamImageTransformer2DModel": lambda model_cls, weights: weights,
 }
 
 
```
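For context, inferred from the surrounding entries rather than stated in the diff: this dictionary maps transformer class names to callables that rewrite user-supplied adapter scales before they are applied (e.g. via `set_adapters`), and the identity lambda means the HiDream transformer needs no per-block expansion. A standalone sketch of that dispatch pattern follows; the names `scale_fn_mapping` and `expand_scales` are illustrative, not the library's actual identifiers.

```python
# Standalone sketch of the name -> callable dispatch pattern the mapping uses.
from typing import Callable, Dict, Union

Scales = Union[float, Dict[str, float]]

# Class-name -> callable that rewrites user-supplied adapter scales into the
# shape the model expects. An identity lambda means no per-block expansion.
scale_fn_mapping: Dict[str, Callable] = {
    "HiDreamImageTransformer2DModel": lambda model_cls, weights: weights,
}

def expand_scales(model_cls_name: str, weights: Scales) -> Scales:
    # Look up the per-architecture hook, falling back to identity.
    fn = scale_fn_mapping.get(model_cls_name, lambda model_cls, weights: weights)
    return fn(model_cls_name, weights)

print(expand_scales("HiDreamImageTransformer2DModel", 0.8))  # -> 0.8
```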