Hi 👋
First of all, thanks for the great work on this project.
I’d like to suggest a feature that could be very valuable for many users: adding support for running local AI models instead of relying solely on external providers (such as Google).
This would allow:
- 100% free usage, with no API-related costs
- Improved privacy, since no data would leave the user’s server or machine
- The ability to run in offline or restricted-network environments
- More flexibility for advanced users already working with local models
Some possible implementation ideas:
- Integration with local runtimes such as Ollama or LocalAI
- Support for custom endpoints compatible with the OpenAI API (see the sketch below)
- Option to switch between external providers and local models via settings
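For instance, if the project already talks to its providers through the OpenAI client library (an assumption on my part), much of this could come down to making the base URL configurable. Here is a minimal Python sketch; the address and model name are illustrative. Ollama does expose an OpenAI-compatible API at `/v1`, and LocalAI does the same:

```python
from openai import OpenAI

# Point the existing OpenAI client at a local, OpenAI-compatible server
# instead of the hosted API. No other call sites need to change.
client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's default address (illustrative)
    api_key="ollama",  # Ollama ignores the key, but the client requires a value
)

response = client.chat.completions.create(
    model="llama3",  # any locally pulled model, e.g. `ollama pull llama3`
    messages=[{"role": "user", "content": "Hello from a local model!"}],
)
print(response.choices[0].message.content)
```

The same pattern would cover both cases: the settings toggle from the last bullet would just swap the base URL, API key, and model name between the external provider and the local endpoint.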
I understand this might require significant changes, but it would add a lot of value to the project and broaden its adoption.
Thanks for considering it!