Hi,
I'm trying to integrate a self-hosted LLM with the extension. The model is served with vLLM, which exposes an OpenAI-compatible API. I've configured the HTTP endpoint and the model name, but I'm unsure where to include the API key.
I tried adding headers to the JSON settings, but the following error comes up when a code completion request is sent:
serde json error: data did not match any variant of untagged enum OpenAIAPIResponse
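
For reference, here is a rough sketch of what I added to the settings. The setting names below are placeholders for illustration only (I'm not sure of the extension's actual keys), and the endpoint, model, and token values are dummies:

```jsonc
// Placeholder setting names -- not necessarily the extension's real keys.
{
  "extension.endpoint": "http://my-vllm-host:8000/v1/completions",
  "extension.model": "my-model",
  // My attempt at passing the API key, using the standard OpenAI-style
  // Authorization header that vLLM's OpenAI-compatible server expects.
  "extension.requestHeaders": {
    "Authorization": "Bearer <MY_API_KEY>"
  }
}
```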
Could you guide me on the correct approach for passing the API key?
Thanks!