Model Mismatch for Custom Endpoint #5583
-
So I want to use Portkey because it makes monitoring easy, but I ran into something strange:
I'm running LibreChat via Docker following the local installation tutorial, though on my Debian server rather than locally. The only changes are to librechat.yaml (the endpoint list) and .env (the API keys). Please help me find the problem here.
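For reference, here is a minimal sketch of the kind of custom endpoint block I mean in librechat.yaml, assuming Portkey's OpenAI-compatible base URL; the env var names and model list are placeholders from my setup, not necessarily correct for anyone else:

```yaml
# Minimal sketch of a Portkey custom endpoint in librechat.yaml.
# PORTKEY_API_KEY / PORTKEY_VIRTUAL_KEY are placeholder names defined in .env.
endpoints:
  custom:
    - name: "Portkey"
      apiKey: "${PORTKEY_API_KEY}"
      baseURL: "https://api.portkey.ai/v1"
      headers:
        x-portkey-api-key: "${PORTKEY_API_KEY}"
        x-portkey-virtual-key: "${PORTKEY_VIRTUAL_KEY}"
      models:
        default: ["gpt-4o-mini"]  # placeholder model
        fetch: true
      titleConvo: true
      titleModel: "current_model"
      modelDisplayLabel: "Portkey"
```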
-
Where are you seeing that OpenAI is being called? Can you share the debug logs found in ./logs at the project root?
-
Duplicate of #4927
Thanks for sharing; we can surmise a lot from this.
This shows that every request, including the title request, is correctly going to your Portkey endpoint.
The "OpenAI" verbiage just refers to the relevant backend component being used, since custom endpoints share functionality with OpenAI by being "OpenAI-like," i.e., OpenAI API-compatible.
The high token counts come from the default artifacts prompt, which is indeed lengthy; you likely have that feature enabled.
Lastly, title generation may be failing because Google may not accept a single System message being sent as the whole chat history here:
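As a rough illustration (a hypothetical request body, not the exact payload LibreChat sends), the title request can amount to a chat history that contains only a system message:

```yaml
# Hypothetical title-request body, for illustration only. Some Google models
# proxied through an OpenAI-compatible API reject a history that consists of
# a single system message with no user turn.
model: "gemini-1.5-pro"  # placeholder model name
messages:
  - role: system
    content: "Write a concise title for the conversation above."
```

If that is the cause, pointing titleModel at a model that tolerates this, or setting titleConvo: false on the endpoint, should work around it; both are standard custom endpoint options in librechat.yaml.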