CMD LOCAL AI v1.0.0 – First Public Release
CMD LOCAL AI v1.1.0
What’s new
- Added improved terminal chat flow with the AI> prompt.
- Added a spinner during response generation.
- Added response stats line with token count and generation time.
- Improved output filtering to show final answer only.
- Added safer handling for model loading retries and timeouts.
- Updated docs (README.md, index.html) with Polish and English sections.
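The safer model-loading behavior mentioned above can be sketched as a generic retry wrapper with a deadline. This is an illustrative sketch only: the function name, retry count, delay, and timeout values below are assumptions, not the project's actual code, which wraps llama-cpp-python model initialization.

```python
import time

def load_with_retries(load_fn, retries=3, delay=0.5, timeout=30.0):
    """Call load_fn until it succeeds, retrying on failure.

    load_fn, retries, delay, and timeout are illustrative names and
    defaults; real code would catch specific loader exceptions.
    """
    deadline = time.monotonic() + timeout
    last_error = None
    attempt = 0
    for attempt in range(1, retries + 1):
        if time.monotonic() > deadline:
            break  # give up once the overall timeout is exceeded
        try:
            return load_fn()
        except Exception as err:
            last_error = err
            time.sleep(delay * attempt)  # simple linear backoff between tries
    raise RuntimeError(f"model load failed after {attempt} attempt(s)") from last_error
```

The combination of a per-call retry limit and an overall deadline keeps a broken model file from hanging the CLI indefinitely.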
Highlights
- Local-first GGUF model runner (llama-cpp-python)
- CLI + Ollama-compatible API (/tags, /generate, /chat, /pull, /version)
- Cleaner chat UX and more predictable answers
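Since the API is Ollama-compatible, any HTTP client can talk to it. The sketch below builds a request against the /generate endpoint from the list above; the base URL, port, model name, and exact payload fields are assumptions for illustration (the payload shape follows the common Ollama generate format), so adjust them to your local setup.

```python
import json
import urllib.request

BASE_URL = "http://localhost:11434"  # assumed default; point this at your server

def build_generate_request(model, prompt, stream=False):
    """Build a POST request for the /generate endpoint.

    The JSON fields (model, prompt, stream) follow the usual
    Ollama-style generate payload; field names are assumptions here.
    """
    payload = {"model": model, "prompt": prompt, "stream": stream}
    return urllib.request.Request(
        f"{BASE_URL}/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending it requires a running server, e.g.:
# with urllib.request.urlopen(build_generate_request("my-model.gguf", "Hello")) as resp:
#     print(json.loads(resp.read()))
```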
More updates coming soon.