Commit 30d7ce1

update
1 parent a93b8fe commit 30d7ce1

1 file changed: +2 -0 lines changed

articles/ai-services/openai/faq.yml

Lines changed: 2 additions & 0 deletions
@@ -113,7 +113,9 @@ sections:
          How do I fix Server error (500): Unexpected special token
        answer: |
          This is a known issue. You can minimize the occurrence of these errors by reducing the temperature of your prompts to less than 1 and ensuring you're using a client with retry logic. Reattempting the request often results in a successful response.
+
          If reducing the temperature to less than 1 doesn't reduce the frequency of this error, an alternative workaround is to set the presence/frequency penalties and logit biases to their default values. In some cases, it can also help to set `top_p` to a non-default, lower value to encourage the model to avoid sampling low-probability tokens.
+
          Another way to solve this issue is to add `logit_bias` to the payload. For example, if the error is `Unexpected special token: 100266`, you can add `"logit_bias": {"100266": -100}` to reduce the issue by suppressing token 100266, the token reported in the service-side error message.
      - question: |
          We noticed charges associated with API calls that failed to complete with status code 400. Why are failed API calls generating a charge?
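
As a rough illustration of the workarounds described in this answer, the sketch below uses the `openai` Python package's `AzureOpenAI` client to keep the temperature below 1, rely on built-in retry logic, lower `top_p`, leave the penalties at their defaults, and bias token 100266 out of the sampling distribution. The endpoint, deployment name, API version, and prompt are placeholders, not values taken from this commit.

```python
import os

from openai import AzureOpenAI

# Client with retry logic: reattempting the request often succeeds.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",      # placeholder API version
    max_retries=3,
)

response = client.chat.completions.create(
    model="my-gpt-4o-deployment",  # placeholder deployment name
    messages=[{"role": "user", "content": "Summarize the incident report."}],
    temperature=0.7,               # keep temperature below 1
    top_p=0.9,                     # optionally lower top_p to avoid low-probability tokens
    presence_penalty=0,            # leave presence/frequency penalties at defaults
    frequency_penalty=0,
    logit_bias={"100266": -100},   # suppress the token named in the 500 error
)
print(response.choices[0].message.content)
```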
