Open
Description
I'm running a custom LLM provider that exposes the same API as OpenAI.
I've set the env vars `OPENAI_API_HOST` and `OPENAI_API_KEY`, and my config is:
```lua
openai_params = {
  model = "my-custom-model",
  frequency_penalty = 0,
  presence_penalty = 0,
  max_tokens = 4095,
  temperature = 0.5,
  top_p = 0.5,
  n = 1,
}
```
After that I can open the ChatGPT window, but when I send a message the TUI just shows the "thinking" dots. Is there a way to get more debug info?
I have successfully used this API with other OpenAI-compatible clients, so I suspect it's just a matter of tweaking the config, but I can't tell how to enable debugging for this plugin.
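To rule out the endpoint itself, one way to test outside the plugin is a minimal standalone request script like the sketch below. It is an assumption on my part that `OPENAI_API_HOST` holds the base URL and that the server exposes the standard `/v1/chat/completions` path; `build_request` is a hypothetical helper, not part of the plugin.

```python
import json
import os
import urllib.request

def build_request(prompt: str) -> urllib.request.Request:
    """Build one chat-completions request directly from the same env vars
    the plugin reads, bypassing the plugin entirely.

    Assumes OPENAI_API_HOST is the base URL and the server follows the
    standard OpenAI /v1/chat/completions path (an assumption, not confirmed).
    """
    host = os.environ["OPENAI_API_HOST"].rstrip("/")
    payload = {
        "model": "my-custom-model",  # same model name as in openai_params
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 16,
    }
    return urllib.request.Request(
        f"{host}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
            "Content-Type": "application/json",
        },
    )

if __name__ == "__main__":
    # Print the raw status and the start of the body; if this hangs or
    # errors, the problem is the endpoint/env vars rather than the plugin.
    with urllib.request.urlopen(build_request("ping"), timeout=30) as resp:
        print(resp.status, resp.read()[:200])
```

If this returns a normal completion but the plugin still hangs, that would point at the plugin-side config rather than the server.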