how to check or debug the actual model used? #477

@ssoulless

I am trying to configure custom commands so I can switch between GPT-4 and GPT-3.5-turbo, using the following Lua config:

require("chatgpt").setup({
    api_key_cmd = "cat /Users/svelandiag/.openai_api_key",
    -- Configure the model dynamically for each request
    model = function()
        return vim.g.chatgpt_use_gpt4 and "gpt-4" or "gpt-3.5-turbo"
    end,
})

-- Initialize the global variable
vim.g.chatgpt_use_gpt4 = false -- Default to GPT-3.5-turbo

-- Custom commands
vim.api.nvim_create_user_command("UseGPT4", function()
    vim.g.chatgpt_use_gpt4 = true
    print("ChatGPT.nvim: Switched to GPT-4")
end, {})

vim.api.nvim_create_user_command("UseGPT35", function()
    vim.g.chatgpt_use_gpt4 = false
    print("ChatGPT.nvim: Switched to GPT-3.5-turbo")
end, {})

vim.api.nvim_create_user_command("CheckModel", function()
    local model = vim.g.chatgpt_use_gpt4 and "gpt-4" or "gpt-3.5-turbo"
    print("ChatGPT.nvim: Current model is " .. model)
end, {})

When I run UseGPT35 and then run CheckModel, it says I'm using gpt-3.5-turbo. But when I open the chat prompt and ask ChatGPT which model I'm talking to, it always answers that I'm talking to GPT-4.

I tried many things, like reloading the plugin programmatically after switching models. In the end I just hard-coded gpt-3.5-turbo as the model, reloaded the config, and opened the chat prompt again, and ChatGPT still said I was talking to GPT-4.
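For reference, the programmatic reload I tried looked roughly like the sketch below. I'm assuming here that calling setup() again re-applies the options, and that the model may need to go under openai_params.model (as in the README example) rather than a top-level model key; neither assumption is verified.

-- Sketch of the reload attempt: re-run setup() with a plain string model
-- every time a command is invoked, instead of passing a function once.
-- The openai_params.model layout follows the README example and is an
-- assumption, not something confirmed against the plugin source.
local function apply_model(model)
    require("chatgpt").setup({
        api_key_cmd = "cat /Users/svelandiag/.openai_api_key",
        openai_params = { model = model },
    })
    print("ChatGPT.nvim: requested model " .. model)
end

vim.api.nvim_create_user_command("UseGPT4", function()
    apply_model("gpt-4")
end, {})

vim.api.nvim_create_user_command("UseGPT35", function()
    apply_model("gpt-3.5-turbo")
end, {})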

Then I told ChatGPT that I was specifying gpt-3.5-turbo in the config, and it answered:

My apologies for the previous confusion. If you are using the "gpt-3.5-turbo" model in the ChatGPT.nvim plugin, then you are interacting with the GPT-3.5 language model optimized for faster performance. I appreciate the correction and apologize for the earlier misinformation. If you have any other questions or need further assistance, I'll be here to help you!

I'm not sure how to debug whether the requests are actually being made with the model I selected through the custom commands, and I could not find anything about how to check logs for the requests being sent to the OpenAI API.
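The closest I got to checking this from inside Neovim is dumping the plugin's resolved runtime options. This is only a sketch: it assumes the resolved configuration lives in require("chatgpt.config").options, which I have not verified against the plugin source, so the module path and field names may differ.

-- Hypothetical inspection command: print whatever options the plugin
-- resolved at setup() time, so the model field (if present) is visible.
-- The chatgpt.config module path is an assumption.
vim.api.nvim_create_user_command("ChatGPTDumpConfig", function()
    local ok, config = pcall(require, "chatgpt.config")
    if not ok or type(config) ~= "table" then
        print("ChatGPT.nvim: could not load chatgpt.config")
        return
    end
    -- vim.inspect pretty-prints the Lua table
    print(vim.inspect(config.options or config))
end, {})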
