[Recommendation] Support for local AI models (no external dependencies) #17

@2nkn0w

Description

Hi 👋

First of all, thanks for the great work on this project.

I’d like to suggest a feature that could be very valuable for many users: adding support for running local AI models instead of relying solely on external APIs (such as Google or other providers).

This would allow:

  • 100% free usage, with no API-related costs
  • Improved privacy, since no data would leave the user’s server or machine
  • The ability to run in offline or restricted-network environments
  • More flexibility for advanced users already working with local models

Some possible implementation ideas:

  • Integration with tools like Ollama, LocalAI, or similar
  • Ability to configure custom endpoints compatible with the OpenAI API
  • Option to switch between external providers and local models via settings
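To illustrate the last two ideas, here is a minimal sketch of what a settings-based provider switch could look like. All names here (`provider`, `base_url`, `model`, the function names) are hypothetical, not part of this project; the default URLs are the standard ones Ollama and LocalAI expose for their OpenAI-compatible APIs.

```python
# Hypothetical provider-switching layer. Because Ollama and LocalAI both
# serve OpenAI-compatible endpoints, a single client can cover external
# providers, local runtimes, and arbitrary custom servers.

DEFAULT_BASE_URLS = {
    "openai": "https://api.openai.com/v1",    # external provider
    "ollama": "http://localhost:11434/v1",    # Ollama's OpenAI-compatible API
    "localai": "http://localhost:8080/v1",    # LocalAI's default port
}

def resolve_base_url(settings: dict) -> str:
    """Pick the API base URL from user settings.

    An explicit `base_url` (any OpenAI-compatible server) wins;
    otherwise fall back to the named provider's default.
    """
    if settings.get("base_url"):
        return settings["base_url"].rstrip("/")
    provider = settings.get("provider", "openai")
    return DEFAULT_BASE_URLS[provider]

def build_chat_request(settings: dict, prompt: str) -> tuple[str, dict]:
    """Return (url, payload) for an OpenAI-style /chat/completions call."""
    url = resolve_base_url(settings) + "/chat/completions"
    payload = {
        "model": settings.get("model", "llama3"),
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, payload
```

With this shape, switching to a local model is just a settings change, e.g. `{"provider": "ollama", "model": "llama3"}`, and no request code has to change.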

I understand this could require significant changes, but it would add a lot of value to the project and broaden its adoption.

Thanks for considering it!
