
Commit ae0b77c

fix patch
1 parent e7b9dc1 commit ae0b77c

File tree

1 file changed: +1, -1 lines changed

onnx_diagnostic/torch_export_patches/patches/patch_transformers.py

Lines changed: 1 addition & 1 deletion
@@ -1313,7 +1313,7 @@ def patched_sdpa_attention_forward(
     is_causal = attention_mask is None and is_causal
 
     torch._check(
-        attention_mask.shape[3] == key.shape[2],
+        attention_mask is None or attention_mask.shape[3] == key.shape[2],
         "Attention mask shape incompatible with key shape.",
     )
     attn_output = torch.nn.functional.scaled_dot_product_attention(
