
Commit 7167fc4

make style

1 parent d794ab5

2 files changed: +4 −3 lines changed


src/diffusers/loaders/ip_adapter.py

Lines changed: 2 additions & 2 deletions
@@ -386,8 +386,8 @@ def load_ip_adapter(
     image_encoder_pretrained_model_name_or_path (`str`, *optional*, defaults to `./image_encoder`):
         Can be either:

-            - A string, the *model id* (for example `openai/clip-vit-large-patch14`) of a pretrained model hosted on
-              the Hub.
+            - A string, the *model id* (for example `openai/clip-vit-large-patch14`) of a pretrained model
+              hosted on the Hub.
             - A path to a *directory* (for example `./my_model_directory`) containing the model weights saved
               with [`ModelMixin.save_pretrained`].
     cache_dir (`Union[str, os.PathLike]`, *optional*):
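For context, a minimal sketch of how the `image_encoder_pretrained_model_name_or_path` argument documented above might be passed. It assumes a diffusers version where `FluxPipeline` exposes `load_ip_adapter` with this argument; the adapter repo id and weight file name are placeholders, not part of this commit.

import torch
from diffusers import FluxPipeline

# Load the base pipeline (standard diffusers usage).
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
)

pipe.load_ip_adapter(
    "your-org/your-flux-ip-adapter",        # placeholder adapter repo id
    weight_name="ip_adapter.safetensors",   # placeholder weight file name
    # Per the docstring above: either a Hub model id such as
    # `openai/clip-vit-large-patch14`, or a local directory saved with
    # `ModelMixin.save_pretrained`.
    image_encoder_pretrained_model_name_or_path="openai/clip-vit-large-patch14",
)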

src/diffusers/pipelines/flux/pipeline_flux.py

Lines changed: 2 additions & 1 deletion
@@ -756,7 +756,8 @@ def __call__(
         Pre-generated image embeddings for IP-Adapter. It should be a list of length same as number of
         IP-adapters. Each element should be a tensor of shape `(batch_size, num_images, emb_dim)`. If not
         provided, embeddings are computed from the `ip_adapter_image` input argument.
-    negative_ip_adapter_image: (`PipelineImageInput`, *optional*): Optional image input to work with IP Adapters.
+    negative_ip_adapter_image:
+        (`PipelineImageInput`, *optional*): Optional image input to work with IP Adapters.
     negative_ip_adapter_image_embeds (`List[torch.Tensor]`, *optional*):
         Pre-generated image embeddings for IP-Adapter. It should be a list of length same as number of
         IP-adapters. Each element should be a tensor of shape `(batch_size, num_images, emb_dim)`. If not
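As a rough illustration of the shapes this docstring describes, a hedged sketch follows (continuing the hypothetical `pipe` above). The random tensors and `emb_dim` value are placeholders for what an image encoder would actually produce; in practice the embeddings come from `ip_adapter_image` or a prior encoding step.

import torch

# One list entry per loaded IP-Adapter, each of shape (batch_size, num_images, emb_dim).
# emb_dim=768 is illustrative (e.g. a CLIP ViT-L/14 image embedding), not from this commit.
batch_size, num_images, emb_dim = 1, 1, 768
ip_embeds = [torch.randn(batch_size, num_images, emb_dim)]
neg_ip_embeds = [torch.zeros(batch_size, num_images, emb_dim)]  # same structure for the negative branch

image = pipe(
    prompt="a photo of a cat",
    ip_adapter_image_embeds=ip_embeds,
    negative_ip_adapter_image_embeds=neg_ip_embeds,
    num_inference_steps=28,
).images[0]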
