
Commit 14beb73 ("oops")

1 parent 8d0846a

2 files changed: +2 −2 lines changed

native_sparse_attention_pytorch/native_sparse_attention.py

Lines changed: 1 addition & 1 deletion

@@ -290,7 +290,7 @@ def forward(
         ck = cat((mem_ck, ck), dim = -2)
         cv = cat((mem_cv, cv), dim = -2)

-        ck, cv = tuple(repeat(t, 'b h ... -> b (num_grouped_queries h) ...', num_grouped_queries = self.num_grouped_queries) for t in (ck, cv))
+        ck, cv = tuple(repeat(t, 'b h ... -> b (h num_grouped_queries) ...', num_grouped_queries = self.num_grouped_queries) for t in (ck, cv))

         csim = einsum(q, ck, 'b h i d, b h j d -> b h i j') * self.scale

pyproject.toml

Lines changed: 1 addition & 1 deletion

@@ -1,6 +1,6 @@
 [project]
 name = "native-sparse-attention-pytorch"
-version = "0.0.33"
+version = "0.0.34"
 description = "Native Sparse Attention"
 authors = [
     { name = "Phil Wang", email = "[email protected]" }
