
Commit b61209d (1 parent: 16506a0)

Commit message: typo


articles/api-management/llm-token-limit-policy.md

Lines changed: 0 additions & 1 deletion
@@ -59,7 +59,6 @@ By relying on token usage metrics returned from the LLM endpoint, the policy can
 
 ### Usage notes
 
-*
 * This policy can be used multiple times per policy definition.
 * Where available when `estimate-prompt-tokens` is set to `false`, values in the usage section of the response from the LLM API are used to determine token usage.
 * Certain LLM endpoints support streaming of responses. When `stream` is set to `true` in the API request to enable streaming, prompt tokens are always estimated, regardless of the value of the `estimate-prompt-tokens` attribute.
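For context, the usage notes in this diff belong to the Azure API Management `llm-token-limit` policy. A minimal policy fragment showing where the `estimate-prompt-tokens` attribute is set might look like the sketch below; the counter key and token limit values are illustrative assumptions, and the full attribute schema should be checked against the policy reference:

```xml
<policies>
    <inbound>
        <!-- Limit tokens per caller IP; with estimate-prompt-tokens="false",
             token counts are taken from the usage section of the LLM response
             where available, rather than estimated up front. -->
        <llm-token-limit
            counter-key="@(context.Request.IpAddress)"
            tokens-per-minute="500"
            estimate-prompt-tokens="false" />
    </inbound>
</policies>
```

Per the streaming note above, when a request sets `stream` to `true`, prompt tokens are estimated regardless of this attribute's value.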
