
Conversation

@nattsw (Contributor) commented on Jul 17, 2025

We're seeing that some LLMs are using 65000+ tokens for raw text that is only 10-1000 characters long.

This PR adds a max_tokens limit, derived from the length of the raw text, that is passed to the LLM API for each translation.
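A minimal sketch of how such a length-based cap might be derived and passed along. The helper name, the characters-per-token ratio, the padding, and the OpenAI-style client call are illustrative assumptions, not the PR's actual implementation or values.

```python
# Hypothetical sketch: cap max_tokens based on the raw text length.
# Ratio and padding are assumptions for illustration only.

def max_tokens_for(raw_text: str, chars_per_token: float = 4.0, padding: int = 64) -> int:
    """Estimate an upper bound on output tokens for a translation of raw_text.

    Assumes roughly `chars_per_token` characters per token and adds a small
    buffer so short strings are not cut off mid-word.
    """
    return int(len(raw_text) / chars_per_token) + padding


def translate(client, raw_text: str, target_locale: str):
    # Pass the cap with the request; the parameter name varies by provider,
    # `max_tokens` is the common OpenAI-style option.
    return client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": f"Translate the user's text to {target_locale}."},
            {"role": "user", "content": raw_text},
        ],
        max_tokens=max_tokens_for(raw_text),
    )
```

With a cap like this, a 100-character source string would be limited to roughly 90 output tokens instead of the 65,000+ the model could otherwise consume.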

@nattsw force-pushed the max-token-limit-on-raw-length branch from 0fa3b42 to 2cd6a58 on Jul 17, 2025 06:53
@nattsw nattsw merged commit 5d80a34 into main Jul 17, 2025
6 checks passed
@nattsw nattsw deleted the max-token-limit-on-raw-length branch July 17, 2025 09:47
