Will support for local AI models be implemented? #5413
Replies: 2 comments
-
Thanks for the question @Necrodeather! Questions belong in the Discussions tab, so I'm converting this to a discussion 😄
-
Google recently closed all discussions and pull requests about multi-model support. I think this is the right decision: gemini-cli will focus on being as good as it can be for Gemini, and the Gemini Code Assist team working on this tool will put its resources into IDE support and every other aspect of making the CLI better. Downstream, you still have options.
It's difficult for a company like Google to pursue its own roadmap while also incorporating community feedback and direction. Luckily, GitHub and LLMs make it practical to maintain up-to-date community forks!
-
Will support for working with local AI models (e.g. Ollama or LM Studio running on my PC or server) be implemented?
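For context, both Ollama and LM Studio expose local HTTP servers (Ollama listens on port 11434 by default), so a fork or wrapper could target them directly even without first-party support. A minimal sketch against Ollama's `/api/generate` endpoint, assuming an Ollama instance is running locally; the model name `llama3` is just a placeholder for whatever model you have pulled:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot text generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a request against Ollama's generate endpoint."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # request a single JSON object instead of a token stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )

def ask_local_model(model: str, prompt: str) -> str:
    """Send the prompt to the locally running model and return its reply text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

LM Studio works similarly but serves an OpenAI-compatible API instead, so a tool written against the OpenAI chat-completions format can usually be pointed at it just by changing the base URL.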