Model Specs Settings a little buggy #3187
I've noticed issues with it too. I think we need a 'none' option for the model spec list. To be honest, this feature doesn't really do what I expected. I thought it was the equivalent of 'avatars'/'personas'/'characters' from similar apps, and it sort of is, but it's less flexible: it seems to restrict the ability to just use the base endpoints as normal. I feel like this feature should be 'opened up' more, so it can also act as an easy way to switch from the 'general' AIs, like standard GPT-4, to ones that are a bit more 'specialised' (since they carry a specific prompt prefix to refine their focus with no input needed from the user).
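To illustrate the kind of 'specialised' spec I mean, here's a rough sketch. The spec name, label, and prompt are made up, and the field names are only my reading of the model_specs docs, so treat it as an example rather than a verified config:

```yaml
# Hypothetical "specialised" spec sitting alongside the general endpoints.
# Field names assumed from the librechat.yaml model_specs documentation.
modelSpecs:
  list:
    - name: "sql-helper"
      label: "SQL Helper"
      description: "GPT-4 focused on writing and explaining SQL"
      preset:
        endpoint: "openAI"
        model: "gpt-4"
        promptPrefix: "You are a SQL expert. Answer with SQL and brief explanations."
```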
In version 0.7.4 the model specs are not working anymore; the select dropdown is no longer visible. I think modelSpecs are a really important feature. Of course there are default presets, but if a team is using the solution together and not everyone is familiar with how to set things like temperature and the system prompt, the results are usually quite bad. Having a central place to manage presets for all users, working just like modelSpecs, is really nice. Please advise @danny-avila
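To make the use case concrete, this is the kind of centrally managed preset I have in mind (a minimal sketch based on my reading of the model_specs docs; names, model, and values are placeholders):

```yaml
# Sketch only: a shared preset so team members don't have to tune
# temperature or the system prompt themselves.
modelSpecs:
  enforce: false
  prioritize: true
  list:
    - name: "team-default"
      label: "Team Default"
      preset:
        endpoint: "openAI"
        model: "gpt-4o"
        temperature: 0.2
        promptPrefix: "You are our internal assistant. Be concise and cite sources."
```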
What happened?
I noticed when using Model Specs (https://www.librechat.ai/docs/configuration/librechat_yaml/object_structure/model_specs) that when I have different endpoints across several specs, e.g. openai, google, and gptPlugins, and I switch to a spec that uses the plugins endpoint and then go back to one that uses an openai or google endpoint, the endpoint does not change and stays on plugins.
Also changing
prioritize: true
did not help; with that set, the behaviour is the same.
Steps to Reproduce
You will see that the endpoint does not change back after the settings you have set in the modelSpecs, but stays on the gptPlugins endpoint.
The only way to get it working is to manually change the endpoint.
It gets worse when you have set
prioritize: true
in one of your modelSpecs presets: then the only way to change the endpoint is to open a chat in your history which already uses that endpoint, because the endpoint selector somehow does not work anymore. Hopefully it's clear; if there are any questions I am happy to help.
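For reference, a trimmed sketch of the kind of config that shows the problem for me. Spec names, labels, and models are placeholders; only the three different endpoints matter for reproducing it:

```yaml
# Minimal sketch: three specs on three endpoints.
modelSpecs:
  prioritize: true   # tried both true and false, same behaviour
  list:
    - name: "gpt4-general"
      label: "GPT-4"
      preset:
        endpoint: "openAI"
        model: "gpt-4"
    - name: "gemini-general"
      label: "Gemini"
      preset:
        endpoint: "google"
        model: "gemini-pro"
    - name: "plugins-tools"
      label: "Plugins"
      preset:
        endpoint: "gptPlugins"
        model: "gpt-4"
# Switching to "plugins-tools" and then back to "gpt4-general" leaves
# the endpoint stuck on gptPlugins.
```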
What browsers are you seeing the problem on?
No response
Relevant log output
No response
Screenshots
No response