Commit 5bb1890

[Gluon] Remove tcgen05_mma wrapper from attention tutorial (triton-lang#7536)
This was a workaround for an issue that has since been fixed.
Parent: 318ff2c

File tree: 1 file changed (+1, −6 lines)

python/tutorials/gluon/01-attention-forward.py (1 addition, 6 deletions)
@@ -16,7 +16,7 @@
     tensor_memory_descriptor,
     tma,
     mbarrier,
-    tcgen05_mma as _tcgen05_mma_impl,
+    tcgen05_mma,
     tcgen05_commit,
 )

@@ -173,11 +173,6 @@ def issue_async_tma_load(smem, bar, desc, offset):
     tma.async_copy_global_to_shared(desc, [offset, 0], bar, smem)


-@gluon.jit
-def tcgen05_mma(a, b, d, use_acc, mbarriers):
-    _tcgen05_mma_impl(a, b, d, use_acc=use_acc, mbarriers=mbarriers, mbarrier_preds=[True] * len(mbarriers))
-
-
 # ===-----------------------------------------------------------------------===#
 # Gluon Attention
 # ===-----------------------------------------------------------------------===#
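For context on what was deleted: the removed tutorial wrapper only existed to forward its arguments to the underlying primitive while defaulting `mbarrier_preds` to all-`True`, one predicate per barrier. A minimal pure-Python sketch of that forwarding pattern is below; the real `tcgen05_mma` is a Gluon/Triton GPU builtin and cannot run on the host, so the `_tcgen05_mma_impl` stub here is hypothetical and merely records the arguments it receives.

```python
# Hypothetical stand-in for the GPU builtin, used only to show the
# argument forwarding; the real implementation issues a tcgen05 MMA.
def _tcgen05_mma_impl(a, b, d, use_acc, mbarriers, mbarrier_preds):
    return {"use_acc": use_acc, "preds": mbarrier_preds, "n_bars": len(mbarriers)}

# The now-removed wrapper: its only job was to supply an all-True
# mbarrier_preds list matching the number of barriers passed in.
def tcgen05_mma(a, b, d, use_acc, mbarriers):
    return _tcgen05_mma_impl(
        a, b, d,
        use_acc=use_acc,
        mbarriers=mbarriers,
        mbarrier_preds=[True] * len(mbarriers),
    )

result = tcgen05_mma(None, None, None, use_acc=False, mbarriers=["bar0", "bar1"])
print(result["preds"])  # [True, True]
```

Since the underlying issue was fixed upstream, the tutorial can now call `tcgen05_mma` directly, which is why the import alias and the wrapper are both dropped in this commit.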
