[cuBLASLt][FP8] cuBLASLt appears to support float8 rowwise-scaling on H100 (pytorch#161305)
Following pytorch#157905, I think the macro guard around
```cpp
TORCH_INTERNAL_ASSERT(use_rowwise == false, "rowwise scaled_gemm not supported with blaslt");
```
was never updated, which would cause the `float8` tests to fail. It also appears that cuBLASLt accepts two inputs with `e4m3` and `e5m2` dtypes simultaneously, so that check is removed here as well...
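
For context, a minimal sketch of the user-facing path this unblocks: a rowwise-scaled `float8` matmul through `torch._scaled_mm` on an H100. The `to_fp8_rowwise` helper, shapes, and scale layout here are illustrative assumptions, not code from this PR.

```python
# Hypothetical sketch: rowwise-scaled float8 GEMM via torch._scaled_mm.
import torch

def to_fp8_rowwise(x, float8_dtype=torch.float8_e4m3fn):
    # Compute one scale per row so each row fits the fp8 representable range,
    # cast to fp8, and return the dequantization scale (1 / quant scale).
    fp8_max = torch.finfo(float8_dtype).max
    amax = x.abs().amax(dim=-1, keepdim=True).clamp(min=1e-12)
    scale = fp8_max / amax
    return (x * scale).to(float8_dtype), scale.reciprocal().float()

M, K, N = 128, 256, 64  # fp8 GEMM dims should be multiples of 16
a = torch.randn(M, K, device="cuda")
b = torch.randn(K, N, device="cuda")

a_fp8, scale_a = to_fp8_rowwise(a)        # scale_a: (M, 1)
b_fp8, scale_b = to_fp8_rowwise(b.t())    # per-column scale of b: (N, 1)
b_fp8 = b_fp8.t()                         # column-major (K, N), as cuBLASLt expects
scale_b = scale_b.t()                     # (1, N)

out = torch._scaled_mm(a_fp8, b_fp8, scale_a, scale_b, out_dtype=torch.bfloat16)
print(out.shape)  # torch.Size([128, 64])
```

Per the description above, cuBLASLt should also accept one `e4m3` operand and one `e5m2` operand in the same call, so the same pattern with mixed fp8 dtypes is expected to work once the dtype check is removed.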
CC @lw
Pull Request resolved: pytorch#161305
Approved by: https://github.com/Skylion007, https://github.com/drisspg, https://github.com/jeffdaily
Co-authored-by: Jeff Daily <[email protected]>