Commit d1c90a0
[SWDEV-538312] Fix num_stages=8 configs for flex attention (#2277)
`num_stages==8` configs are always skipped, causing breakages.
Example error:
```
torch._inductor.exc.LoweringException: NoValidChoicesError: No choices to select, please consider adding ATEN into max_autotune_gemm_backends config (defined in torch/_inductor/config.py) to allow at least one choice.
target: flex_attention_backward
```
1 parent: eb37e58
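The failure mode can be sketched in a few lines: when every candidate config for the kernel carries `num_stages == 8` and those configs are unconditionally skipped, the autotuner's choice list comes up empty and lowering fails with the error above. The `Config` class, the shape values, and the `prune()` helper below are hypothetical illustrations, not the actual `torch._inductor` code.

```python
from dataclasses import dataclass
from typing import List

@dataclass(frozen=True)
class Config:
    # Illustrative kernel-config fields, not the real inductor schema.
    block_m: int
    block_n: int
    num_warps: int
    num_stages: int

# Suppose every candidate for the backward kernel uses num_stages == 8.
CANDIDATES: List[Config] = [
    Config(64, 64, 4, 8),
    Config(128, 64, 8, 8),
    Config(128, 128, 8, 8),
]

def prune(configs: List[Config]) -> List[Config]:
    # num_stages == 8 configs are always skipped on this backend.
    return [c for c in configs if c.num_stages != 8]

# Every candidate is skipped, leaving the autotuner with zero choices;
# that empty list is what surfaces as NoValidChoicesError during lowering.
assert prune(CANDIDATES) == []

# The fix drops the always-skipped entries so valid configs remain
# (the stage counts here are illustrative replacements).
FIXED = [Config(64, 64, 4, 4), Config(128, 64, 8, 4)]
assert len(prune(FIXED)) == 2
```

Because the bad configs were filtered out after candidate generation, the simplest fix (as this commit does) is to delete them from the candidate list up front rather than relying on a later skip.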
1 file changed: 0 additions, 3 deletions

Diff: lines 2276-2278 of the original file deleted (3 lines removed); surrounding lines unchanged.