
11. Profiles


A profile is a settings file that defines the type of translation service and the settings needed to use that service. You can create your own profile or modify an existing one. You can also code your own translation service type.

For AI services such as OpenAI, or LM Studio running locally, the host APIs are very similar, but there are differences, and further differences between the models used. Some can do structured output, which is preferred because it is the most reliable. Some can provide reasoning. And some models are simply better than others: an estimated 1.7-trillion-parameter OpenAI model will generally give more accurate and consistent results than a 1-billion-parameter local model. For a commercial app covering many languages, I would just get an OpenAI API account and pay the $5 for credits; that should be enough. But local models are improving rapidly, so local will be the way to go in the future.
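To picture why structured output is the most reliable option, here is a minimal sketch of the kind of typed result a structured-output model can be asked to return. The TranslationResult type, the JSON field names, and the Parse helper are assumptions for this illustration, not the schema or code the app actually uses.

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical shape of a structured-output response: the model is asked to
// return strict JSON, which deserializes into a typed object instead of
// free-form text that has to be scraped for the translation.
public record TranslationResult(
    [property: JsonPropertyName("translated_text")] string TranslatedText,
    [property: JsonPropertyName("confidence")] double Confidence);

public static class StructuredOutputExample
{
    public static TranslationResult Parse(string modelJson) =>
        JsonSerializer.Deserialize<TranslationResult>(modelJson)
        ?? throw new InvalidOperationException("Model returned empty JSON.");
}
```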

A lot of prompt testing will be needed, and you can use the Profile Editor for this.

Testing

The screenshot below shows the Profile Test tab for the OpenAI_SO_gpt-4o-mini-2024-07-18 profile, with a test that repeats a de-DE translation 3 times. Using repeats allows you to assess how deterministic (consistent) the translation is; if it gives the same result for at least 10 attempts, that should be good enough. Some prompts are non-deterministic and give seemingly random responses, so if you are creating a new profile, perhaps for a different model, you will need to tweak the prompts in the Profile Settings, save, and test again. Or you could run the prompt test in a different app or in Python to figure out which prompts work best.

[Screenshot: Profile Test tab for the OpenAI_SO_gpt-4o-mini-2024-07-18 profile]
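The idea behind the repeat test is easy to sketch in code: run the same translation several times and count how many distinct results come back. The Run method and the translate delegate below are placeholders for illustration, not the app's real API.

```csharp
using System;
using System.Linq;

public static class RepeatTest
{
    // Runs the same translation 'repeats' times and reports whether every
    // attempt returned the same text. 'translate' stands in for whatever
    // function the profile ultimately calls (a placeholder signature).
    public static void Run(Func<string, string, string> translate,
                           string text, string language, int repeats = 10)
    {
        var results = Enumerable.Range(0, repeats)
                                .Select(_ => translate(text, language))
                                .ToList();
        var distinct = results.Distinct().ToList();

        // One distinct result across all repeats means the prompt behaved
        // deterministically for this input.
        Console.WriteLine(distinct.Count == 1
            ? $"Deterministic: \"{distinct[0]}\" ({repeats} identical results)"
            : $"Non-deterministic: {distinct.Count} different results in {repeats} attempts");
    }
}
```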

Settings

[Screenshot: Profile Settings tab]

Coding Your Own Type

You can clone a profile and edit it to use a different model, different prompts, and so on. But if you want to make a different 'type', you need to write some code.

To code your own translation type, you only need to touch:

  • Edit \TranslatorEx\Translate.cs - add the name of your Translation Function here.
  • Add \TranslatorEx\Types\your-type.cs


See the Loopback profile for the simplest example. It has no .prf settings file and simply returns the source text with a $ prefix as the translated text.
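For a feel of what such a type involves, here is a minimal sketch of a Loopback-style translation function based on that description. The class name, method name, and signature are placeholders; match whatever Translate.cs expects when you register your own type.

```csharp
// Sketch of a Loopback-style type: no settings, no network call, it just
// returns the source text with a $ prefix as the "translation". Prefixing
// makes translated strings easy to spot in the UI while still exercising
// the full translation/cache pipeline.
public static class LoopbackType
{
    public static string TranslateLoopback(string sourceText, string targetLanguage)
    {
        return "$" + sourceText;
    }
}
```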

Tip

If you are using AI, most of your time will be spent tweaking the prompts. Sometimes trying to get it to follow your rules and return only the translated text can require a lot of experimentation. Aim for the maximum determinism possible.
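As an illustration only, the rules you end up iterating on tend to look something like the following (the wording here is hypothetical, not the prompts shipped with the profiles):

```
Translate the user's text into the target language.
Rules:
- Return only the translated text, with no explanations, quotes, or notes.
- Preserve placeholders such as {0} and %s exactly as written.
- If the text cannot be translated, return it unchanged.
```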

Confidence

You'll notice there is a min-confidence entry in the .prf. For good models that can reliably return a confidence level, you can set the minimum, and anything lower than the minimum will cause a <!> prefix to be added to the translation result. This may mean it's a hopeless translation that will never work, but you can always search for these in the cache editor and fix them with your own translation, maybe from Google Translate. Either way, it's there to tell you there was a problem.
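The check itself is simple to picture. Here is a minimal sketch of the behaviour described above, assuming a minConfidence value read from the .prf and a confidence score returned with the result; the names are placeholders, not the app's actual fields.

```csharp
// Sketch of the min-confidence gate: below-threshold results keep their
// text but are flagged with <!> so they can be found and fixed later in
// the cache editor.
public static string ApplyConfidenceGate(string translatedText,
                                         double confidence,
                                         double minConfidence)
{
    return confidence < minConfidence
        ? "<!>" + translatedText
        : translatedText;
}
```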

Some models just can't report the confidence consistently, like a local DeepSeek model with structured output. I'm sure there's a magical prompt that can make it work, but for now I just set the min-confidence to 0.
