Conversation

@samuelt0 (Contributor) commented Aug 4, 2025

What does this PR do?

Adds a cross-attention module to the Wan Attention class.

Mirrors the previous Attention class and allows us to distinguish between attention instances
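The distinction described above can be sketched roughly as follows. This is a minimal, hypothetical illustration, not the actual diffusers `WanAttention` implementation; the class name, flag, and method are illustrative only:

```python
class WanAttention:
    """Illustrative attention block: the source of keys/values is what
    separates a self-attention instance from a cross-attention one."""

    def __init__(self, is_cross_attention=False):
        # Hypothetical flag that lets callers (e.g. attention processors)
        # distinguish attention instances.
        self.is_cross_attention = is_cross_attention

    def kv_source(self, hidden_states, encoder_hidden_states=None):
        # Cross-attention reads keys/values from the encoder states;
        # self-attention reads them from the hidden states themselves.
        if self.is_cross_attention and encoder_hidden_states is not None:
            return encoder_hidden_states
        return hidden_states


self_attn = WanAttention()
cross_attn = WanAttention(is_cross_attention=True)
print(self_attn.kv_source("h", "enc"))   # h
print(cross_attn.kv_source("h", "enc"))  # enc
```

Having a dedicated cross-attention path (rather than one undifferentiated class) is what makes it possible to branch on the instance type downstream.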

@samuelt0 (Contributor, Author) commented Aug 4, 2025

@sayakpaul

@a-r-r-o-w (Contributor) left a comment

Thanks, I was going to do the same today haha!

@HuggingFaceDocBuilderDev commented

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@a-r-r-o-w (Contributor) commented

@samuelt0 Could you run `make style` and push the changes? Thanks!

@a-r-r-o-w (Contributor) commented

@bot /style

@github-actions bot commented Aug 4, 2025

Style bot fixed some files and pushed the changes.

@a-r-r-o-w (Contributor) commented

The failing test is unrelated.

@a-r-r-o-w a-r-r-o-w merged commit 11d22e0 into huggingface:main Aug 4, 2025
10 of 11 checks passed
Beinsezii pushed a commit to Beinsezii/diffusers that referenced this pull request Aug 7, 2025
* Cross attention module to Wan Attention

* Apply style fixes

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Aryan <[email protected]>
