Commit 04b74fa

[pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
1 parent c7cbcd9 commit 04b74fa

File tree

1 file changed: 0 additions, 1 deletion


litgpt/attention.py

Lines changed: 0 additions & 1 deletion
@@ -16,7 +16,6 @@
 )
 from litgpt.config import Config

-
 # Currently, `torch.nn.functional.scaled_dot_product_attention` does not
 # properly support the case `enabla_gqa=True` (i.e., keys and values have
 # less heads than queries). In this case, it is best to extend keys and
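The comment in the diff refers to the workaround of extending keys and values so they have as many heads as the queries, rather than relying on `enable_gqa=True`. A minimal sketch of that idea (the helper name `sdpa_expand_kv` and the causal-mask choice are illustrative assumptions, not litgpt's actual API):

```python
import torch
import torch.nn.functional as F


def sdpa_expand_kv(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    """Scaled dot-product attention for grouped-query attention, where keys
    and values are expanded to the number of query heads up front instead of
    passing ``enable_gqa=True``.

    Shapes (illustrative): q is (B, n_head, T, head_dim); k and v are
    (B, n_kv_head, T, head_dim) with n_head divisible by n_kv_head.
    """
    n_head, n_kv_head = q.shape[1], k.shape[1]
    if n_kv_head < n_head:
        q_per_kv = n_head // n_kv_head
        # Copy each key/value head q_per_kv times along the head dimension,
        # so every query head has a matching key/value head.
        k = k.repeat_interleave(q_per_kv, dim=1)
        v = v.repeat_interleave(q_per_kv, dim=1)
    return F.scaled_dot_product_attention(q, k, v, is_causal=True)
```

The expansion costs extra memory, but it sidesteps backends that mishandle the `enable_gqa` fast path.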
