Commit eb8f146

Merge pull request #162 from SamiKalliomaki/fix-docs
Fix documentation for max_new_tokens.
2 parents fc22734 + 24ced1a

File tree: 1 file changed (+1 line, -1 line)

training/generate.py (1 addition, 1 deletion)

@@ -70,7 +70,7 @@ def __init__(
         Args:
             do_sample (bool, optional): Whether or not to use sampling. Defaults to True.
-            max_new_tokens (int, optional): Max new tokens after the prompt to generate. Defaults to 128.
+            max_new_tokens (int, optional): Max new tokens after the prompt to generate. Defaults to 256.
             top_p (float, optional): If set to float < 1, only the smallest set of most probable tokens with
                 probabilities that add up to top_p or higher are kept for generation. Defaults to 0.92.
             top_k (int, optional): The number of highest probability vocabulary tokens to keep for top-k-filtering.
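The docstring being corrected describes standard top-k and nucleus (top-p) token filtering. As a rough, self-contained sketch of what those two parameters select (illustrative only; the helper names `top_k_filter` and `top_p_filter` are not from this repository, and real implementations operate on logits rather than a plain probability list):

```python
def top_k_filter(probs, top_k):
    """Keep the indices of the top_k highest-probability tokens."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    return set(order[:top_k])

def top_p_filter(probs, top_p):
    """Keep the smallest set of most probable tokens whose
    probabilities add up to top_p or higher (nucleus sampling)."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, total = set(), 0.0
    for i in order:
        kept.add(i)
        total += probs[i]
        if total >= top_p:
            break
    return kept

# Toy distribution over a 4-token vocabulary:
probs = [0.5, 0.3, 0.15, 0.05]
print(top_p_filter(probs, 0.92))  # {0, 1, 2}: 0.5 + 0.3 + 0.15 = 0.95 >= 0.92
print(top_k_filter(probs, 2))     # {0, 1}
```

With the repo's documented default of top_p = 0.92, sampling would here be restricted to the three most probable tokens, since the top two alone only cover 0.8 of the mass.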
