A Neovim plugin for AI-assisted code editing using Ollama. Cursor-style cmd+k experience in your terminal.
## Features

- Edit Mode (`<leader>k`): Select code, describe changes, preview the diff, apply
- Chat Mode (`<leader>K`): Multi-turn conversation about selected code, with a `/edit` command to apply changes
- Model Switcher (`<leader>M`): Quickly switch between configured models
## Requirements

- Neovim 0.10+ (uses `vim.system` for async HTTP)
- Ollama running locally (`ollama serve`)
- A model available (e.g., `qwen3-coder:480b-cloud` or `glm-4.7:cloud`)
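A quick way to confirm the server requirement is to query Ollama's version endpoint (`/api/version` is part of Ollama's standard HTTP API; the helper name below is just for illustration):

```python
import json
import urllib.request
import urllib.error

def ollama_version(base="http://localhost:11434"):
    """Return the running Ollama server's version string, or None if unreachable."""
    try:
        # /api/version is a standard Ollama HTTP API route
        with urllib.request.urlopen(base + "/api/version", timeout=2) as resp:
            return json.load(resp).get("version")
    except (urllib.error.URLError, OSError, ValueError):
        return None

if __name__ == "__main__":
    v = ollama_version()
    print(v if v else "Ollama is not reachable -- run `ollama serve` first")
```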
## Installation

### lazy.nvim

```lua
{
  "ParthSareen/vimollama",
  config = function()
    vim.g.ollama_model = "qwen3-coder:480b-cloud" -- required
  end,
  keys = {
    { "<leader>k", mode = "v", desc = "Ollama Edit" },
    { "<leader>K", mode = "v", desc = "Ollama Chat" },
    { "<leader>M", desc = "Ollama Model" },
  },
}
```

### vim-plug

```vim
Plug 'ParthSareen/vimollama'
let g:ollama_model = 'qwen3-coder:480b-cloud'
```

### Manual

Clone to your Neovim packages directory:

```sh
git clone https://github.com/ParthSareen/vimollama ~/.local/share/nvim/site/pack/plugins/start/ollama-vim
```

## Usage

### Edit Mode

- Select code in visual mode
- Press `<leader>k`
- Type your edit instruction (e.g., "add error handling")
- Review the diff preview
- Press `Enter` or `y` to apply, `Esc` or `q` to cancel
### Chat Mode

- Select code in visual mode
- Press `<leader>K`
- Ask questions about the code
- Continue the conversation (history is maintained)
- Type `/edit <instruction>` to modify the code
- Review and apply the changes
- Press `Esc` or `q` to close the chat
## Configuration

```lua
-- Required: specify your Ollama model
vim.g.ollama_model = "qwen3-coder:480b-cloud"

-- Optional: models for the switcher (defaults shown)
vim.g.ollama_models = { "qwen3-coder:480b-cloud", "glm-4.7:cloud" }

-- Optional: customize keymaps (defaults shown)
vim.g.ollama_keymap = "<leader>k"       -- edit mode
vim.g.ollama_chat_keymap = "<leader>K"  -- chat mode
vim.g.ollama_model_keymap = "<leader>M" -- model switcher

-- Optional: custom Ollama endpoint (default: localhost:11434)
vim.g.ollama_endpoint = "http://localhost:11434/api/generate"
vim.g.ollama_chat_endpoint = "http://localhost:11434/api/chat"

-- Optional: custom system prompts
vim.g.ollama_system_prompt = "Your custom edit prompt..."
vim.g.ollama_chat_system_prompt = "Your custom chat prompt..."
vim.g.ollama_chat_edit_system_prompt = "Your custom chat-edit prompt..."
```

## Commands

- `:OllamaEdit` - Start edit mode (visual mode)
- `:OllamaChat` - Start chat mode (visual mode)
- `:OllamaModel` - Open model switcher
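If you point the plugin at a custom endpoint, it needs to speak the same protocol. As a reference, here is a sketch of the minimal JSON payloads Ollama's generate and chat endpoints expect (field names follow Ollama's documented HTTP API; the actual prompts the plugin sends are an assumption):

```python
import json

# Minimal request body for /api/generate.
# "stream": False asks for a single JSON response instead of a stream.
generate_req = {
    "model": "qwen3-coder:480b-cloud",
    "prompt": "Add error handling to this function:\n\ndef f(x): return 1 / x",
    "stream": False,
}

# Minimal request body for /api/chat: a list of role/content messages.
chat_req = {
    "model": "qwen3-coder:480b-cloud",
    "messages": [
        {"role": "system", "content": "You are a code assistant."},
        {"role": "user", "content": "What does this function do?"},
    ],
    "stream": False,
}

print(json.dumps(generate_req, indent=2))
print(json.dumps(chat_req, indent=2))
```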
## Highlights

Customize the appearance by setting these highlight groups:

```lua
vim.api.nvim_set_hl(0, "OllamaPreviewAdd", { fg = "#a6e3a1" })    -- added lines
vim.api.nvim_set_hl(0, "OllamaPreviewDel", { fg = "#f38ba8" })    -- deleted lines
vim.api.nvim_set_hl(0, "OllamaPreviewHeader", { fg = "#89b4fa" }) -- diff headers
vim.api.nvim_set_hl(0, "OllamaChatCode", { fg = "#6c7086" })      -- code context
vim.api.nvim_set_hl(0, "OllamaChatUser", { fg = "#89b4fa" })      -- user messages
vim.api.nvim_set_hl(0, "OllamaChatAssistant", { fg = "#a6e3a1" }) -- assistant messages
```

## Custom keymaps

If you prefer to set up your own keymaps, disable the defaults and use the `<Plug>` mappings:
```lua
vim.g.ollama_no_maps = true -- disable default keymaps

-- Then map manually:
vim.keymap.set("x", "<leader>k", "<Plug>(ollama-edit)")
vim.keymap.set("x", "<leader>K", "<Plug>(ollama-chat)")
vim.keymap.set("n", "<leader>M", "<Plug>(ollama-model)")
```

Available `<Plug>` mappings:

- `<Plug>(ollama-edit)` - Edit mode (visual)
- `<Plug>(ollama-chat)` - Chat mode (visual)
- `<Plug>(ollama-model)` - Model switcher (normal)
## License

MIT