
Conversation

@a-r-r-o-w
Contributor

Context: https://huggingface.slack.com/archives/C065E480NN9/p1734937335392109?thread_ts=1734935953.846629&cid=C065E480NN9

Fixes test: https://github.com/huggingface/diffusers/actions/runs/12461857058/job/34781970659#step:6:6623

This went uncaught in #10270 because it only happens when `.to()` is not called on the pipeline, or when `torch_dtype` is not specified while loading the transformer component. The failing LoRA test does neither, so the alternative branch of code was never exercised.

`pos_embedding` is `torch.int64`, which casts `embeds` to `torch.int64` as well unless it is cast correctly in both branches.
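A minimal sketch of the fix pattern, not the actual diffusers code: the function name, shapes, and dtypes below are illustrative. The point is to cast the positional embedding to the hidden states' dtype explicitly in every branch, instead of relying on implicit type promotion between the two tensors.

```python
import torch

def add_pos_embedding(embeds: torch.Tensor, pos_embedding: torch.Tensor) -> torch.Tensor:
    # Cast explicitly so the addition cannot change the dtype of `embeds`,
    # regardless of how the pipeline/transformer was loaded.
    pos_embedding = pos_embedding.to(dtype=embeds.dtype, device=embeds.device)
    return embeds + pos_embedding

embeds = torch.randn(1, 226, 3072, dtype=torch.bfloat16)
pos_embedding = torch.zeros(1, 226, 3072, dtype=torch.int64)  # e.g. built from torch.arange
out = add_pos_embedding(embeds, pos_embedding)
assert out.dtype == torch.bfloat16
```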

a-r-r-o-w requested review from DN6 and sayakpaul on December 23, 2024 at 07:51
@sayakpaul
Member

I know you have run the concerned test on a GPU but just double-checking.

@a-r-r-o-w
Contributor Author

Yep


@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

DN6 merged commit 055d955 into main on December 23, 2024
15 checks passed
