This repository was archived by the owner on Sep 10, 2025. It is now read-only.

Commit 59ce657
Author: anirudh
Commit message: default batch size 1
Parent: 8900f8a

File tree: 1 file changed (+1, -1 lines changed)

torchchat/usages/eval.py

Lines changed: 1 addition & 1 deletion
@@ -212,7 +212,7 @@ def __init__(
         *,
         device: torch.device,
         max_seq_length: int = 4096,
-        batch_size: int = 8,
+        batch_size: int = 1,
         dtype: torch.dtype = torch.bfloat16,
         enable_kv_cache: bool = True,
         # TODO (@joecummings): Update these defaults once more multimodal
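The effect of the change can be sketched as follows. This is a minimal stand-in, not the real class from `torchchat/usages/eval.py`: the name `EvalWrapperSketch` is hypothetical, `device` is simplified to a string (the real signature takes a `torch.device`), and only the parameters visible in the diff hunk are reproduced.

```python
# Hedged sketch of the keyword-only __init__ defaults after this commit.
# "EvalWrapperSketch" is a hypothetical name for illustration only.
class EvalWrapperSketch:
    def __init__(
        self,
        *,
        device: str = "cpu",      # torch.device in the real code
        max_seq_length: int = 4096,
        batch_size: int = 1,      # changed from 8 to 1 in commit 59ce657
        enable_kv_cache: bool = True,
    ):
        self.device = device
        self.max_seq_length = max_seq_length
        self.batch_size = batch_size
        self.enable_kv_cache = enable_kv_cache


wrapper = EvalWrapperSketch(device="cpu")
print(wrapper.batch_size)  # 1
```

Because the parameters are keyword-only (the bare `*` in the signature), callers that never passed `batch_size=` silently pick up the new default of 1; callers that passed it explicitly are unaffected.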

0 commit comments