It would be nice to make the plugin compatible with Ollama, so everything could run locally (Ollama + llama3.2 3b, for example).
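Since Ollama exposes an OpenAI-compatible API at `http://localhost:11434/v1`, supporting it might mostly come down to making the base URL and model name configurable. Here's a minimal sketch of that idea — `build_chat_request` and the default model tag are hypothetical names for illustration, not part of the plugin:

```python
import json

# Ollama serves an OpenAI-style chat endpoint at this base URL by default.
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(prompt: str, model: str = "llama3.2:3b"):
    """Build the URL and JSON body for an OpenAI-style chat completion
    against a local Ollama instance (hypothetical helper)."""
    url = f"{OLLAMA_BASE_URL}/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(body)

url, body = build_chat_request("Hello!")
print(url)  # http://localhost:11434/v1/chat/completions
```

If the plugin already talks to an OpenAI-compatible backend, pointing it at this base URL with a local model tag could be all that's needed.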