Description
In `diffusers/src/diffusers/models/attention.py`, line 244 at commit 764b624:
`_ = xops.memory_efficient_attention(q, q, q)`
It seems that in the latest xformers, `memory_efficient_attention` lives under `xformers.ops` rather than at the top level of `xformers`, so the `xops` alias used here does not resolve it.
Installed version: `xformers.__version__` is `'0.0.32.post2'`.
Maybe `import xformers as xops` should be `import xformers.ops as xops`?
For reference:
https://github.com/facebookresearch/xformers/blob/c159edc05ae5a0192ab0558e834b946155790371/xformers/ops/fmha/__init__.py#L186
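A defensive variant of the suggested fix is to resolve the function at import time instead of assuming a top-level re-export. This is only an illustrative sketch (the helper name is made up, not diffusers' actual code); it assumes the post-0.0.20 layout where the op lives in the `xformers.ops` submodule:

```python
def resolve_memory_efficient_attention():
    """Return xformers' memory_efficient_attention if available, else None.

    Imports the ops submodule explicitly, since recent xformers versions
    expose memory_efficient_attention under xformers.ops rather than at
    the top level of the xformers package.
    """
    try:
        import xformers.ops as xops  # explicit submodule import
    except ImportError:
        return None  # xformers is not installed
    return getattr(xops, "memory_efficient_attention", None)
```

Callers can then gate the memory-efficient attention path on whether the helper returns a callable, which also covers environments where xformers is absent.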