-666   Specifies the maximum number of tokens that can be generated in the chat completion.
-667   The total number of tokens — including both input and output — must not exceed the model's context length.
+665   The maximum number of tokens that can be generated in the chat completion. The total length of input tokens and generated tokens is limited by the model's context length. Set to 0 for the model's configured max generated tokens.

-2184  Specifies the maximum number of tokens that can be generated in the chat completion.
-2185  The total number of tokens — including both input and output — must not exceed the model's context length.
+2181  The maximum number of tokens that can be generated in the chat completion. The total length of input tokens and generated tokens is limited by the model's context length. Set to 0 for the model's configured max generated tokens.
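The updated description states that max_tokens caps only the generated tokens, that prompt tokens plus generated tokens must still fit within the model's context length, and that a value of 0 requests the model's configured maximum number of generated tokens. As a minimal sketch of how a client might exercise both cases, the snippet below sends requests to an OpenAI-compatible chat completions endpoint; the URL, model name, response shape, and use of the requests library are assumptions for illustration, not code from this repository.

```python
import requests

# Assumed OpenAI-compatible endpoint; adjust host/port for your server.
URL = "http://localhost:8080/v1/chat/completions"


def chat(prompt: str, max_tokens: int) -> str:
    """Send a chat completion request with an explicit max_tokens cap.

    max_tokens limits only the generated tokens; the prompt tokens plus the
    generated tokens must still fit within the model's context length.
    Per the updated description, max_tokens = 0 asks for the model's
    configured maximum number of generated tokens.
    """
    payload = {
        "model": "my-model",  # placeholder model name
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }
    resp = requests.post(URL, json=payload, timeout=60)
    resp.raise_for_status()
    # Assumes the standard OpenAI-style response layout.
    return resp.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Explicit cap of 128 generated tokens.
    print(chat("Summarize the max_tokens parameter.", max_tokens=128))
    # 0 defers to the server's configured maximum for generated tokens.
    print(chat("Same question, server-configured cap.", max_tokens=0))
```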