Commit 3ad4399

Mention max_completion_tokens in documentation (openvinotoolkit#3214)

1 parent: c80f736

File tree (1 file changed):

docs/model_server_rest_api_chat.md (1 addition, 1 deletion)
@@ -97,7 +97,7 @@ curl http://localhost/v3/chat/completions \
 | stream_options |||| object (optional) | Options for streaming response. Only set this when you set `stream: true` |
 | stream_options.include_usage |||| bool (optional) | Streaming option. If set, an additional chunk will be streamed before the `data: [DONE]` message. The `usage` field in this chunk shows the token usage statistics for the entire request, and the `choices` field will always be an empty array. All other chunks will also include a `usage` field, but with a null value. |
 | messages |||| array (required) | A list of messages comprising the conversation so far. Each object in the list should contain `role` and `content`, both of string type. [Example Python code](clients_genai.md) |
-| max_tokens |||| integer | The maximum number of tokens that can be generated. If not set, the generation will stop once `EOS` token is generated. If max_tokens_limit is set in graph.pbtxt it will be default value of max_tokens. |
+| max_tokens / max_completion_tokens |||| integer (optional) | The maximum number of tokens that can be generated. If not set, generation stops once the `EOS` token is generated. If `max_tokens_limit` is set in `graph.pbtxt`, it is used as the default value of `max_tokens`. |
 | ignore_eos |||| bool (default: `false`) | Whether to ignore the `EOS` token and continue generating tokens after the `EOS` token is generated. |
 | include_stop_str_in_output |||| bool (default: `false` if `stream=false`, `true` if `stream=true`) | Whether to include the matched stop string in the output. Setting it to false when `stream=true` is an invalid configuration and will result in an error. |
 | logprobs | ⚠️ ||| bool (default: `false`) | Include the log probabilities of the returned output tokens. **_In stream mode logprobs are not returned; only info about selected tokens is returned._** |
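To illustrate the change above, here is a minimal sketch of a chat completions request body that uses `max_completion_tokens` in place of `max_tokens`. The model name `"llama"` and the message contents are placeholders, not values from this commit; the parameter names follow the documented table.

```python
import json

# Example request body for POST /v3/chat/completions.
# "max_completion_tokens" is documented as an alias of "max_tokens":
# it caps the number of generated tokens; without it, generation
# stops only at the EOS token (or at max_tokens_limit from graph.pbtxt).
payload = {
    "model": "llama",  # placeholder model name
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is OpenVINO?"},
    ],
    "max_completion_tokens": 128,
    "stream": False,
}

body = json.dumps(payload)
```

The serialized `body` would then be sent as the JSON payload of the `curl http://localhost/v3/chat/completions` call shown in the hunk header.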

0 commit comments