
Commit 1312e26

Mogball authored and whitneywhtsang committed

[Tutorial] Also make sure to test warp specialize variant of attention (#6973)

1 parent: 334186f

1 file changed: +1 −1 lines

python/tutorials/06-fused-attention.py (1 addition, 1 deletion)

@@ -578,7 +578,7 @@ def backward(ctx, do):
     (4, 48, 4096, 64),
 ])
 @pytest.mark.parametrize("causal", [True])
-@pytest.mark.parametrize("warp_specialize", [False])
+@pytest.mark.parametrize("warp_specialize", [False, True])
 def test_op(Z, H, N_CTX, HEAD_DIM, causal, warp_specialize, dtype=torch.float16):
     torch.manual_seed(20)
     q = (torch.empty((Z, H, N_CTX, HEAD_DIM), dtype=dtype, device=DEVICE).normal_(mean=0.0, std=0.5).requires_grad_())
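For context on why this one-line change matters: stacked `@pytest.mark.parametrize` decorators give pytest the Cartesian product of their argument lists, so widening `warp_specialize` from `[False]` to `[False, True]` doubles the number of generated test cases for every shape. A minimal sketch of that multiplication (illustration only — the tuple ordering here comes from `itertools.product`, not pytest's exact test-ID ordering):

```python
# Stacked parametrize decorators behave like a Cartesian product over
# their argument lists. With this commit, each (Z, H, N_CTX, HEAD_DIM)
# shape is now exercised with warp_specialize=False AND =True.
from itertools import product

shapes = [(4, 48, 4096, 64)]            # one shape tuple from the test
causal_values = [True]
warp_specialize_values = [False, True]  # was [False] before this commit

cases = list(product(shapes, causal_values, warp_specialize_values))
print(len(cases))  # 2: one test case per warp_specialize value
```

So the warp-specialized code path of the fused-attention tutorial kernel is now covered by the same correctness checks as the default path, at the cost of running each shape twice.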

0 commit comments