I was testing out this library with gpt-4o, and after about 10 minutes I hit the following error:
openai.BadRequestError: Error code: 400 - {'error': {'message': "This model's maximum context length is 128000 tokens. However, your messages resulted in 391407 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
Do I need to use a different model? Or is it possible to update the library to truncate its own messages to fit within the chosen model's context length limits?
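One common workaround, until the library handles this itself, is to trim the oldest conversation turns before each API call so the request stays under the model's limit. Below is a minimal sketch of that idea; it is not part of this library, the names (`truncate_messages`, `RESPONSE_HEADROOM`) are hypothetical, and the token count uses a rough characters-per-token heuristic rather than an exact tokenizer such as tiktoken.

```python
MAX_CONTEXT_TOKENS = 128_000   # gpt-4o context window from the error message
RESPONSE_HEADROOM = 4_000      # room left for the model's reply (assumed value)

def estimate_tokens(message: dict) -> int:
    # Crude approximation: ~4 characters per token for English text,
    # plus a small constant for role/formatting overhead.
    return len(message.get("content", "")) // 4 + 4

def truncate_messages(
    messages: list[dict],
    budget: int = MAX_CONTEXT_TOKENS - RESPONSE_HEADROOM,
) -> list[dict]:
    """Keep the system prompt (if any) plus the most recent turns that fit."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]

    used = sum(estimate_tokens(m) for m in system)
    kept: list[dict] = []
    # Walk backwards from the newest message, keeping turns while they fit.
    for m in reversed(rest):
        cost = estimate_tokens(m)
        if used + cost > budget:
            break
        kept.append(m)
        used += cost
    kept.reverse()
    return system + kept
```

A real fix inside the library would ideally count tokens exactly with the model's tokenizer and truncate before every request, but the structure above (protect the system prompt, drop oldest turns first) is the usual shape of the solution.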