
Commit e582786

last last fix for the day..

1 parent 10bcfb9

2 files changed: +2 −1 lines changed

native_sparse_attention_pytorch/transformer.py

Lines changed: 1 addition & 0 deletions
```diff
@@ -164,6 +164,7 @@ def __init__(
         layers.append(ModuleList([attn, ff]))

         self.attn_sliding_window_size = getattr(attn, 'sliding_window_size', None)
+        self.attn_fine_block_size = getattr(attn, 'selection_block_size', None)

         self.layers = ModuleList(layers)
```

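For context, the added line mirrors the existing `sliding_window_size` probe one line above it: the transformer records optional hyperparameters off the attention module with `getattr`, defaulting to `None` when the attribute is absent. Below is a minimal runnable sketch of that pattern; `SparseAttention` and `Transformer` here are illustrative stand-ins with made-up defaults, not the library's actual classes.

```python
from torch.nn import Module

# Illustrative stand-ins, not the library's real classes.

class SparseAttention(Module):
    def __init__(self, sliding_window_size = 32, selection_block_size = 16):
        super().__init__()
        # hyperparameters exposed as plain attributes on the attention module
        self.sliding_window_size = sliding_window_size
        self.selection_block_size = selection_block_size

class Transformer(Module):
    def __init__(self, attn: Module):
        super().__init__()
        self.attn = attn
        # as in the commit: probe optional attributes, falling back to None
        # if the attention variant does not define them
        self.attn_sliding_window_size = getattr(attn, 'sliding_window_size', None)
        self.attn_fine_block_size = getattr(attn, 'selection_block_size', None)

model = Transformer(SparseAttention())
assert model.attn_fine_block_size == 16  # queryable without reaching into model.attn
```

Surfacing these values on the transformer presumably lets downstream code (sampling or cache logic, say) query the block sizes without reaching into the attention submodule.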
pyproject.toml

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,6 +1,6 @@
 [project]
 name = "native-sparse-attention-pytorch"
-version = "0.0.25"
+version = "0.0.26"
 description = "Native Sparse Attention"
 authors = [
     { name = "Phil Wang", email = "[email protected]" }
```
