Support for OpenRouter #272
machinewrapped announced in Announcements
Added OpenRouter as a translation provider. OpenRouter is an aggregator that provides access to a wide variety of models with a single API key. This includes a number of quite capable models that are free to use, e.g.
google/gemini-2.0-flash-exp:free
deepseek/deepseek-chat-v3-0324:free
qwen/qwen3-235b-a22b:free
meta-llama/llama-3.1-405b-instruct:free
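OpenRouter exposes an OpenAI-compatible chat completions endpoint, so any of the models above can be reached with the same request shape and a single API key. A minimal sketch of what such a request looks like (the helper name and the translation prompt are illustrative, not llm-subtrans internals):

```python
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_translation_request(api_key: str, model: str, text: str,
                              target_language: str = "English") -> urllib.request.Request:
    """Build a chat-completions request for OpenRouter (OpenAI-compatible schema)."""
    payload = {
        "model": model,
        "messages": [
            {"role": "system",
             "content": f"Translate the following subtitles into {target_language}."},
            {"role": "user", "content": text},
        ],
    }
    return urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# The same key works for every model, including the free ones, e.g.:
request = build_translation_request("sk-or-...",
                                    "deepseek/deepseek-chat-v3-0324:free",
                                    "Bonjour le monde")
```

Sending the request (with `urllib.request.urlopen` or any HTTP client) returns a standard chat-completions response, so switching between models is just a matter of changing the model ID string.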
Since OpenRouter provides access to so many models, the model list is grouped by model family and, by default, filtered to show only models from the "Translation" category - though this filter excludes many models that are actually very good for translation, including most if not all of the free options.
You can choose to let OpenRouter select a model automatically, based on criteria you configure in their dashboard (e.g. which providers to include or exclude, and whether to prioritise price or speed). This is the default setting, but obviously something of a gamble.
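Automatic selection works by passing OpenRouter's virtual `openrouter/auto` model ID in place of a concrete model; OpenRouter then routes the request according to your dashboard preferences. A minimal sketch of the request payload (the schema is OpenRouter's OpenAI-compatible one, not llm-subtrans internals):

```python
import json

# "openrouter/auto" tells OpenRouter to pick a concrete model for you,
# guided by the preferences set in your dashboard (provider allow/deny
# lists, price vs. speed priority, etc.).
payload = {
    "model": "openrouter/auto",
    "messages": [
        {"role": "user",
         "content": "Translate these subtitles into English: ..."},
    ],
}

print(json.dumps(payload, indent=2))
```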
If you are installing from source, OpenRouter does not require any additional dependencies, and it is now the default provider for llm-subtrans.