# Anvil
Local desktop AI assistant for Windows. Chat with your own Ollama models across purpose-built modes: brainstorm, write code, get everyday help, or let it auto-route. Runs 100% offline, free, no API keys.
Anvil is a local desktop AI assistant for Windows. It gives you a single chat window with a mode toggle — pick Ideas to brainstorm, Coding to write or fix code, General for everyday help, or leave it on Auto and let a tiny router model pick the right mode for whatever you type.
Each mode swaps three things under the hood: the system prompt (so the assistant’s personality and focus change), the Ollama model (so coding tasks use qwen2.5-coder while brainstorming uses a smaller general model), and the tool set (so Ideas mode is pure chat but Coding mode can read, write, and run things in a sandboxed workspace).
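The per-mode swap can be pictured as a simple lookup table. This is a minimal sketch, not Anvil's actual internals: only `qwen2.5-coder` is named in the text above, so the other model names, prompts, and tool names here are placeholders.

```python
from dataclasses import dataclass, field

@dataclass
class ModeConfig:
    system_prompt: str   # sets the assistant's personality and focus
    model: str           # which Ollama model handles the conversation
    tools: list[str] = field(default_factory=list)  # enabled tool set

# Hypothetical mode table; model names other than qwen2.5-coder are placeholders.
MODES = {
    "ideas":   ModeConfig("You are a freewheeling brainstorming partner.",
                          "llama3.2"),                       # pure chat, no tools
    "coding":  ModeConfig("You are a careful pair programmer.",
                          "qwen2.5-coder",
                          tools=["read_file", "write_file", "run_shell"]),
    "general": ModeConfig("You are a helpful everyday assistant.",
                          "llama3.2",
                          tools=["web_search"]),
}

def activate(mode: str) -> ModeConfig:
    """Return the prompt/model/tools bundle for the chosen mode."""
    return MODES[mode]
```

Keeping all three knobs in one record means switching modes is a single dictionary lookup rather than three separate state changes.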
Actions are workspace-scoped: you pick a folder at session start, reads and writes inside it run automatically, and anything destructive or outside the workspace prompts for confirmation. When you ask it to build something new, the code-reuse scanner searches your existing LocalProjects folder via a MemPalace semantic index and offers to copy relevant snippets into the new project. It copies, never modifies, so your source projects are never touched at runtime.
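The confirmation rule above boils down to one check: is the target path inside the workspace, and is the action destructive? A minimal sketch of that gate, with an assumed (not Anvil's actual) set of destructive action names:

```python
from pathlib import Path

# Hypothetical set of action names treated as destructive.
DESTRUCTIVE = {"delete", "overwrite", "run_shell"}

def needs_confirmation(workspace: Path, target: Path, action: str) -> bool:
    """True when the user must confirm: the action is destructive,
    or the resolved target escapes the workspace folder."""
    workspace = workspace.resolve()
    target = target.resolve()  # resolves "..", so traversal can't sneak out
    inside = target == workspace or workspace in target.parents
    return (action in DESTRUCTIVE) or not inside
```

Resolving both paths before comparing is what stops `workspace/../elsewhere` from counting as "inside".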
Everything runs on your own machine. No API keys, no cloud, no subscription. The only dependency beyond Python is Ollama, which you’ve probably already got if you’re here.
## Features
- Four modes — Ideas, Coding, General, Auto — each swaps system prompt, model, and enabled tools
- Auto mode uses a tiny router model to pick the best mode from your first message
- Workspace-scoped file, shell, and web tools with destructive-action confirmation
- Code-reuse scanner — searches existing LocalProjects via MemPalace, copies snippets into new projects with attribution
- Streams tokens live, ends every task with a clean stand-alone summary
- Runs entirely on local Ollama — no API keys, no cloud, no subscription
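The Auto-mode router described above can be as simple as asking the tiny model to answer with a single word and falling back to General on anything unexpected. A sketch under that assumption; the prompt wording and parsing are illustrative, not Anvil's actual router:

```python
# Hypothetical one-word classification prompt sent to the router model.
ROUTER_PROMPT = (
    "Classify the user's message into exactly one word: "
    "ideas, coding, or general.\n\nMessage: {message}\nMode:"
)

VALID_MODES = {"ideas", "coding", "general"}

def parse_route(reply: str) -> str:
    """Map the router model's raw reply to a mode name.

    Small models sometimes add punctuation or extra words, so take the
    first token, strip trailing punctuation, and default to 'general'.
    """
    stripped = reply.strip()
    if not stripped:
        return "general"
    word = stripped.lower().split()[0].strip(".,:;!")
    return word if word in VALID_MODES else "general"
```

Defaulting to General keeps a flaky router harmless: the worst case is a normal chat answer instead of a specialized one.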