Ollama_chat.nvim

Chat with local LLMs (like DeepSeek Coder) directly inside Neovim using an interactive markdown buffer and the Ollama CLI.

The chat lives inside a Markdown file, so code snippets and text formatting work naturally with tools like marksman or markdown-preview.

Perfect for in-editor AI coding assistance, quick explanations, or note-taking with models such as deepseek-coder-v2, codellama, or llama3.


Features

  • Interactive chat with Ollama in a Markdown buffer
  • Auto-inserts ### User / ### Assistant blocks for clarity
  • Live streaming responses, updated inline
  • Cleans up ANSI escape sequences from output (a streaming sketch follows this list)
  • Minimal, no external dependencies
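
For a sense of the moving parts, here is a minimal sketch of streaming ollama run output into a buffer with Neovim's built-in job API while stripping ANSI escape sequences. This is illustrative only, not the plugin's actual source; the function name and arguments are hypothetical.

-- Minimal sketch: stream `ollama run` output into a buffer.
-- Illustrative only; not the plugin's actual implementation.
local function stream_to_buffer(buf, model, prompt)
  local job = vim.fn.jobstart({ "ollama", "run", model }, {
    stdout_buffered = false,
    on_stdout = function(_, data, _)
      for _, line in ipairs(data) do
        if line ~= "" then
          -- Strip ANSI CSI escape sequences (e.g. color codes).
          local clean = line:gsub("\27%[[0-9;]*[A-Za-z]", "")
          vim.api.nvim_buf_set_lines(buf, -1, -1, false, { clean })
        end
      end
    end,
  })
  -- Send the prompt on stdin; a trailing newline ends the turn.
  vim.fn.chansend(job, prompt .. "\n")
end

Job callbacks run on Neovim's main event loop, so appending to the buffer from on_stdout is safe.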

Installation

With lazy.nvim

{
  "Dheeraj-Murthy/Ollama_chat.nvim",
  config = function()
    require("ollama_chat").setup({
      model = "deepseek-coder-v2", -- optional, defaults to this
    })
  end
}
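
If you prefer to lazy-load, lazy.nvim's standard cmd key can defer loading until the command is first run (optional; assumes :OllamaChat is the only entry point you need):

{
  "Dheeraj-Murthy/Ollama_chat.nvim",
  cmd = "OllamaChat", -- load the plugin only when the command is first used
  config = function()
    require("ollama_chat").setup({
      model = "deepseek-coder-v2",
    })
  end
}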

With packer.nvim

use {
  "Dheeraj-Murthy/Ollama_chat.nvim",
  config = function()
    require("ollama_chat").setup()
  end
}

Usage

  1. Run :OllamaChat to open or jump to the chat buffer (OllamaChat.md).
  2. Type your message under the ### User block.
  3. Press <Enter> in normal mode to send it (a sketch of such a mapping follows these steps).
  4. The model’s response will stream back under a new ### Assistant block.
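
As a rough illustration of how step 3 could be wired, a buffer-local normal-mode mapping might look like the sketch below. This is not the plugin's source, and ollama_chat.send is a hypothetical function name, not the plugin's real API.

-- Hypothetical sketch of a buffer-local <Enter> mapping for the chat file.
vim.api.nvim_create_autocmd("FileType", {
  pattern = "markdown",
  callback = function(args)
    if vim.api.nvim_buf_get_name(args.buf):match("OllamaChat%.md$") then
      vim.keymap.set("n", "<CR>", function()
        require("ollama_chat").send() -- hypothetical entry point
      end, { buffer = args.buf, desc = "Send chat message" })
    end
  end,
})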

Requirements

  • Ollama installed (ollama run ... must work in your terminal)
  • An Ollama-compatible model (e.g. deepseek-coder-v2, codellama, llama3)

Example to get started:

ollama run deepseek-coder-v2
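
To verify the prerequisite from inside Neovim, a quick check using only the plain Neovim API (nothing plugin-specific) is:

-- Quick prerequisite check: is the ollama binary on $PATH?
if vim.fn.executable("ollama") == 1 then
  print("ollama found: " .. vim.fn.exepath("ollama"))
else
  print("ollama not found; install it from https://ollama.com")
end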

Preview

[Screenshot: OllamaChat output]

▶️ Demo video: media/demo-fast.mp4


Roadmap

  • Session history / persistence
  • Model switching from inside Neovim
  • Telescope integration to browse past sessions
  • Richer Markdown formatting for roles

Author

M. S. Dheeraj Murthy
GitHub · LinkedIn


Contributing

Contributions are welcome — issues, PRs, and ideas!
If you build something cool on top (prompt templates, chaining commands, Telescope pickers, etc.), feel free to share it.
