If you're running an OpenAI API-compatible LLM server locally, can you point this plugin at your local URL instead of using ChatGPT?