
Conversation

leisuzz (Contributor) commented Aug 21, 2025

What does this PR do?

Changes `_attention_backend` to the NPU backend when the `enable_npu_flash_attention` method is called, so FLUX attention dispatches to NPU flash attention.
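The change follows a simple toggle pattern: the model exposes an `enable_npu_flash_attention` method that flips a private backend flag, and the attention processors dispatch on that flag. A minimal sketch of that pattern follows; this is a toy class, not the actual diffusers implementation, and the `"_native_npu"` backend string is an assumption.

```python
# Toy sketch of the backend-toggle pattern (hypothetical class; only the
# attribute/method names mirror the diffusers convention).
class ToyFluxTransformer:
    def __init__(self):
        # Default: plain PyTorch scaled-dot-product attention.
        self._attention_backend = "native"

    def enable_npu_flash_attention(self):
        # Flip the flag so attention calls route to the NPU flash
        # attention kernel instead of the default SDPA path.
        self._attention_backend = "_native_npu"

model = ToyFluxTransformer()
model.enable_npu_flash_attention()
print(model._attention_backend)  # -> _native_npu
```

On real hardware the equivalent call would be made on the loaded transformer (e.g. `pipe.transformer.enable_npu_flash_attention()`), assuming a `torch_npu`-enabled environment.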

Before submitting

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.


leisuzz commented Aug 21, 2025

Hi @sayakpaul,
please take a look at this PR. Thanks :)

leisuzz changed the title from "NPU attention refactor for FLUX transformer" to "NPU attention refactor for FLUX" on Aug 25, 2025
leisuzz commented Aug 25, 2025

Cc: @sayakpaul

leisuzz commented Aug 26, 2025

@a-r-r-o-w Please take a look, I've updated the code based on your comment. Thanks!

a-r-r-o-w (Contributor) left a comment


Thanks @leisuzz!

HuggingFaceDocBuilderDev commented

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

a-r-r-o-w (Contributor) commented

@bot /style

github-actions bot commented Aug 26, 2025

Style bot fixed some files and pushed the changes.

a-r-r-o-w merged commit 0fd7ee7 into huggingface:main on Aug 26, 2025
28 checks passed
leisuzz deleted the flux branch on August 26, 2025 at 08:23