
Commit 4b3f2e4

[BE] remove unused value "max_batch_size" (#585)
Stack from [ghstack](https://github.com/ezyang/ghstack) (oldest at bottom):
* __->__ #585
1 parent d2a4904 · commit 4b3f2e4

File tree: 1 file changed (+0, −1)


torchtitan/models/llama/model.py

Lines changed: 0 additions & 1 deletion
@@ -29,7 +29,6 @@ class ModelArgs:
     norm_eps: float = 1e-5
     rope_theta: float = 10000
 
-    max_batch_size: int = 32
     max_seq_len: int = 2048
     # If `True`, then each transformer block init uses its layer ID, and if
     # `False`, each uses the total number of transformer blocks
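For orientation, here is a minimal sketch of roughly how the surrounding ModelArgs dataclass reads after this commit ([BE] is PyTorch shorthand for a "better engineering" cleanup). Only the lines visible in the hunk above are confirmed by the commit; every other field and default (dim, n_layers, n_heads, depth_init) is an assumption added for illustration, based on typical Llama-style configs.

from dataclasses import dataclass


@dataclass
class ModelArgs:
    # Fields above the hunk are assumptions for illustration only.
    dim: int = 4096
    n_layers: int = 32
    n_heads: int = 32
    # The remaining fields match the diff hunk above.
    norm_eps: float = 1e-5
    rope_theta: float = 10000

    # max_batch_size: int = 32  # removed by this commit; the title calls it unused
    max_seq_len: int = 2048
    # If `True`, then each transformer block init uses its layer ID, and if
    # `False`, each uses the total number of transformer blocks
    depth_init: bool = True  # assumed field, implied by the comment above

Removing the dead field means readers can no longer be misled into thinking the model sizes any per-batch state by max_batch_size; presumably nothing in the training path ever read it.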
