LLM Catalog : custom models yml file, activate or not some llm providers, using ALIAS everywhere instead of model. #1436
GuillaumeEttoriKeyrus started this conversation in Ideas
I would like to be able to create my own catalog files, analogous to the existing openai-models.yml. For example:
azure-openai-models.yml
I would also like to be able to use an ALIAS everywhere, instead of the true model name.
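To illustrate, a custom catalog such as azure-openai-models.yml could declare an alias alongside the real deployment name. This is a sketch of a proposed format, not an existing Embabel schema; every key and value below is hypothetical:

```yaml
# azure-openai-models.yml -- hypothetical custom catalog (proposed format)
models:
  - alias: azure-gpt-5-nano        # name the application would use everywhere
    name: gpt-5-nano               # actual deployment name on Azure
    provider: azure-openai
    pricing:                       # illustrative values only
      input-per-1m-tokens: 0.05
      output-per-1m-tokens: 0.40
    knowledge-cutoff: "2024-10"
```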
That means that if I want to mix LLMs from BOTH OpenAI and Azure OpenAI, and two of them actually share the same name (for example gpt-5-nano deployed under that name on Azure, and the same model on OpenAI), I would like to be able to alias them in both catalogs, for example azure-gpt-5-nano and openai-gpt-5-nano, and then choose in my yml which alias to use for embedding, role, llm, etc.

This way, we could easily create an Embabel project with connectors to all the different LLM providers, including our own catalogs, because each API key or provider can access ONLY certain LLMs, not always all of them. So it would be very useful to be able to say, in pom.xml OR in application.yml, something like embabel.catalog.openai.activate=true, with the ability to specify whether to overwrite the default catalog model files.

Keeping the existing XXX-models.yml files is very nice, because they already contain tons of useful defaults: pricing, token limits, knowledge cutoff, etc. By default, we should always use the ALIAS to look up an embedding or LLM model; the model or deployment name should be used as a fallback, or gated by a property like embabel.usealias=true (or false for people who want to use only the model or deployment name).
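Under this proposal, the activation and alias switches might look like the following in application.yml. The property names are hypothetical, mirroring the embabel.catalog.openai.activate and embabel.usealias examples above; nothing here is an existing Embabel property:

```yaml
# application.yml -- hypothetical properties from this proposal
embabel:
  usealias: true                   # resolve models by alias; fall back to deployment name
  catalog:
    openai:
      activate: true
    azure-openai:
      activate: true
      overwrite-defaults: false    # keep the shipped XXX-models.yml data (pricing, tokens, cutoff)
  models:                          # aliases chosen per role, not raw model names
    llm: openai-gpt-5-nano
    embedding: azure-gpt-5-nano
```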
Best regards,