Commit eb99fa1
[release/2.7] Fix for flex attention tuning (#2589)
Bug fix after #2392 landed. The issue was caused by a bad merge conflict resolution, which left the code calling an outdated API:
> torch._inductor.exc.LoweringException: NameError: name '_get_default_config_bwd' is not defined
> target: flex_attention_backward
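
For context, here is a minimal sketch (an assumption for illustration, not the reproducer from the report) of the kind of compiled flex attention call whose backward pass goes through the `flex_attention_backward` lowering named in the error; the tensor shapes and CUDA device are illustrative only:

```python
import torch
from torch.nn.attention.flex_attention import flex_attention

# Compile flex_attention so Inductor lowers both the forward op and
# flex_attention_backward, the lowering target named in the error above.
compiled_flex = torch.compile(flex_attention)

# Illustrative shapes: (batch, heads, seq_len, head_dim); device assumed CUDA.
q = torch.randn(1, 4, 128, 64, device="cuda", requires_grad=True)
k = torch.randn(1, 4, 128, 64, device="cuda", requires_grad=True)
v = torch.randn(1, 4, 128, 64, device="cuda", requires_grad=True)

out = compiled_flex(q, k, v)
out.sum().backward()  # exercises the flex_attention_backward lowering
```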
Models now run to completion.

1 parent 59925f5 · commit eb99fa1
1 file changed: +0 additions, -14 deletions

[Diff not rendered: 14 lines removed at original lines 2479-2492; the file's contents are not shown in this view.]
0 commit comments