EvoLog-AI is an AI-powered VS Code extension that turbocharges your development workflow using Ollama. It automatically generates professional commit messages so you can focus on coding. 🤖
- Smart Commit Crafting: Generate meaningful commit messages using Ollama AI instead of writing them manually ✍️
- Conventional Commits: AI automatically categorizes commits using standard prefixes (feat, fix, docs, etc.) 🏷️
- Context-Aware: Analyzes your actual code changes (staged or unstaged) to create relevant messages 🔍
- Local Processing: All AI processing happens locally on your machine for maximum privacy 🛡️
- Automated Changelog Generation: AI analyzes your commit history to create comprehensive changelogs 📊
- Smart Categorization: Auto-detects changes such as Added, Refactored, Modified, Deleted, and more 📈
- Install Ollama: Get Ollama running on your machine 💻
- Setup Model: Run `ollama run mistral-large-3:675b-cloud` (or your preferred model) 🤖
- Open Project: Open your Git repository in VS Code 📂
- Generate Commit Messages: Click the lightbulb icon in the Source Control view title bar and select Generate Commit Message 💬
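If the lightbulb action fails, the usual cause is that Ollama is not reachable. A minimal shell sketch for verifying the setup, assuming the default host (`http://localhost:11434`) and the `ollama` CLI (the `OLLAMA_HOST_URL` variable is just a local name used here, not an extension setting):

```shell
# Hedged sketch: check that the Ollama server and a model are available.
OLLAMA_HOST_URL="${OLLAMA_HOST_URL:-http://localhost:11434}"

# List locally installed models (requires the ollama CLI on PATH)
ollama list 2>/dev/null || echo "ollama CLI not found -- install it first"

# Confirm the HTTP API answers; EvoLog-AI talks to this endpoint
if curl -s "${OLLAMA_HOST_URL}/api/tags" > /dev/null; then
  echo "Ollama is reachable at ${OLLAMA_HOST_URL}"
else
  echo "Ollama is NOT reachable -- start it with 'ollama serve'"
fi
```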
You can configure the extension in VS Code settings (Settings > Extensions > EvoLog-AI):
- `evolog-ai.ollamaHost`: Set the Ollama API endpoint (default: `http://localhost:11434`) 🌐
- `evolog-ai.ollamaModel`: Choose your preferred AI model (default: `mistral-large-3:675b-cloud`) 🧠
- `evolog-ai.enabled`: Enable or disable the extension (default: `true`) ✅
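For a workspace-level override, the same settings can be placed in `.vscode/settings.json`. A minimal sketch, assuming the setting keys listed above (the values shown are simply the documented defaults):

```shell
# Hedged sketch: write workspace settings mirroring the documented defaults.
# Adjust host and model to match your own Ollama setup.
mkdir -p .vscode
cat > .vscode/settings.json <<'EOF'
{
  "evolog-ai.ollamaHost": "http://localhost:11434",
  "evolog-ai.ollamaModel": "mistral-large-3:675b-cloud",
  "evolog-ai.enabled": true
}
EOF
echo "Wrote .vscode/settings.json"
```

Workspace settings take precedence over user settings, so this pins the extension's behavior per project.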
- Open the Source Control view (`Ctrl+Shift+G`) 📁
- (Optional) Stage the changes you want to include. If no changes are staged, EvoLog-AI analyzes unstaged changes.
- Click the EvoLog-AI (lightbulb) icon in the Source Control title bar.
- Select EvoLog-AI: Generate Commit Message 🤖
- Review the generated message in the commit input box and commit ✅
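Under the hood, this flow amounts to sending the current diff to Ollama's `/api/generate` endpoint. A rough shell equivalent, assuming the default host and model, a `jq` install, and an illustrative prompt (not the extension's actual prompt):

```shell
# Hedged sketch of the kind of request the extension makes under the hood.
HOST="http://localhost:11434"
MODEL="mistral-large-3:675b-cloud"

# Prefer staged changes; fall back to unstaged, mirroring the behavior above
DIFF=$(git diff --staged 2>/dev/null || true)
if [ -z "$DIFF" ]; then DIFF=$(git diff 2>/dev/null || true); fi

PROMPT="Write a Conventional Commit message for this diff: ${DIFF}"

# jq builds the JSON body safely; requires a running Ollama server
jq -n --arg model "$MODEL" --arg prompt "$PROMPT" \
  '{model: $model, prompt: $prompt, stream: false}' |
  curl -s "$HOST/api/generate" -d @- |
  jq -r '.response' || echo "(request failed; is Ollama running?)"
```

The real extension gathers changes through the VS Code Git extension API rather than shelling out to `git`; this sketch only illustrates the request shape.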
EvoLog-AI provides a convenient settings view directly in the Source Control sidebar where you can quickly:
- View and edit the current Ollama Host.
- Switch between different AI models.
EvoLog-AI is privacy-focused. All AI processing happens locally using Ollama—your code never leaves your machine. 🛡️
Built with ❤️ for developers who want to spend more time coding and less time writing docs. 🎉
- Install dependencies: `npm install`
- Compile TypeScript: `npm run compile`
- Run tests: `npm test`
Contributions are welcome! Please see CONTRIBUTING.md for guidelines.
- Default Ollama host is defined in `src/lib/utility.ts:4` as `http://localhost:11434`.
- Default model is defined in `src/lib/utility.ts:5` as `mistral-large-3:675b-cloud`.
- Commit message generation uses `handleGenerateCommitMessage`, which gathers git changes via the VS Code Git extension.
- Settings UI is provided by `EvoLogAISettingsProvider` and registered in `src/extension.ts:9`.