OpenAI Max Token exceeded when uploading files #8402
quadriano31 started this conversation in Help Wanted · Closed · Replies: 1 comment
When trying to upload a file using Agent File Search, I encountered the following error in the logs:
```
openai.BadRequestError: Error code: 400 - {'error': {'message': 'Requested 488585 tokens, max 300000 tokens per request', 'type': 'max_tokens_per_request'}}
```
LibreChat logs also show:
```
Error uploading vectors: File embedding failed.
[/files] Error processing file: File embedding failed.
```
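For reference, the 488585 figure is the token count OpenAI reports for a single embeddings request. A minimal sketch for checking how many tokens a file produces locally, assuming the tiktoken package and the cl100k_base encoding used by OpenAI's embedding models (the file name is a placeholder):

```python
# Sketch: estimate how many tokens a file will produce before embedding.
# Assumes the tiktoken package; cl100k_base is the encoding used by
# OpenAI's text-embedding models (an assumption about the configured model).
import tiktoken

def count_tokens(path: str) -> int:
    enc = tiktoken.get_encoding("cl100k_base")
    with open(path, encoding="utf-8") as f:
        return len(enc.encode(f.read()))

# Hypothetical file; the one above produced roughly 488585 tokens.
print(count_tokens("large_document.txt"))
```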
It seems the file exceeds the token limit accepted by the embedding model. Is there a recommended way to handle large files?

Full logs:

```
rag_api   | raise self._make_status_error_from_response(err.response) from None
rag_api   | openai.BadRequestError: Error code: 400 - {'error': {'message': 'Requested 488585 tokens, max 300000 tokens per request', 'type': 'max_tokens_per_request', 'param': None, 'code': 'max_tokens_per_request'}}
rag_api   |
rag_api   | 2025-07-11 10:38:28,754 - root - INFO - Request POST http://rag_api:8000/embed - 200
LibreChat | 2025-07-11 10:38:28 error: Error uploading vectors An error occurred while setting up the request: File embedding failed.
LibreChat | 2025-07-11 10:38:28 error: [/files] Error processing file: File embedding failed.
```
Reply:

Solved. Set CHUNK_SIZE=300 in the .env file.
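Why this helps: the RAG API splits uploaded documents into chunks before embedding them, and CHUNK_SIZE controls the size of those chunks. Smaller chunks keep each batch sent to the embeddings endpoint under OpenAI's 300,000-tokens-per-request cap. A minimal sketch of the batching idea, with a rough count_tokens heuristic (swap in tiktoken for accuracy) and the 300,000 limit taken from the error above:

```python
# Sketch: batch text chunks so that no single embeddings request
# exceeds the provider's per-request token cap.
# MAX_TOKENS_PER_REQUEST mirrors the 300000 limit from the error above;
# count_tokens is a stand-in heuristic, not the library's real tokenizer.
from typing import Iterable, Iterator

MAX_TOKENS_PER_REQUEST = 300_000

def count_tokens(text: str) -> int:
    return len(text) // 4  # rough ~4 chars/token heuristic

def batch_by_token_budget(chunks: Iterable[str]) -> Iterator[list[str]]:
    batch: list[str] = []
    used = 0
    for chunk in chunks:
        n = count_tokens(chunk)
        # Flush the current batch if adding this chunk would exceed the cap.
        if batch and used + n > MAX_TOKENS_PER_REQUEST:
            yield batch
            batch, used = [], 0
        batch.append(chunk)
        used += n
    if batch:
        yield batch
```

Each yielded batch can then be embedded in its own request, so no single call trips max_tokens_per_request. With CHUNK_SIZE=300, each chunk is small enough that the batches the RAG API sends stay well under the cap.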