Commit 65f7671 (parent: 9735dd4)

[TEST] Fix test_mxfp failures from 8de17d2

Signed-off-by: Whitney Tsang <[email protected]>

File tree: 1 file changed (+2, -0 lines)


python/test/unit/language/test_matmul.py

Lines changed: 2 additions & 0 deletions
@@ -375,6 +375,8 @@ def test_mxfp(M, N, K, BLOCK_M, BLOCK_N, BLOCK_K, NUM_STAGES, nonKDim, NUM_WARPS
     atol = 0.0001
     rtol = 0.0001
     torch.testing.assert_close(ref_out, output, atol=atol, rtol=rtol)
+    if not is_cuda():
+        return
 
     # Pipelining of dot_scaled requires tmem_copy to be used, which in turn
     # requires the scales to be in the blocked layout in global memory.
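The fix above gates the CUDA-only pipelining checks behind an early return, so that the tolerance comparison still runs on every backend while the remainder of the test is skipped elsewhere. A minimal self-contained sketch of that pattern is below; `is_cuda` here is a hypothetical stand-in for the backend helper the real test imports, and the numeric values are illustrative only:

```python
# Hedged sketch of the early-return backend gate used in the commit.
# `is_cuda` is a stand-in; the real test uses the repo's backend helper.
def is_cuda():
    # Assume a non-CUDA backend for illustration.
    return False

def test_mxfp_sketch():
    atol = 0.0001
    rtol = 0.0001
    # Illustrative values standing in for the kernel's reference and actual output.
    ref_out, output = 1.0, 1.00005
    # Same check torch.testing.assert_close performs: |a - b| <= atol + rtol * |b|
    assert abs(ref_out - output) <= atol + rtol * abs(output)
    # Everything past this point is CUDA-only (tmem_copy / blocked scale layout),
    # so non-CUDA backends return early instead of failing.
    if not is_cuda():
        return
    raise AssertionError("CUDA-only pipelining checks would run here")
```

On a non-CUDA backend the function returns before reaching the CUDA-only section, which is exactly the behavior the two added lines restore.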
