Claude Code is now effectively free to run locally.
31-03-2026 · in development
With Ollama now supporting Anthropic's Messages API, you can use Claude Code with local open-source models: no API keys, no token costs, and no data leaving your machine.
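Under the hood, Claude Code simply speaks the Messages API over HTTP. Here is a minimal sketch of such a request; the endpoint path /v1/messages and port 11434 are Ollama's defaults assumed here, and the commented curl shows the live call against a running server:

```shell
# Request body in Anthropic's Messages API shape (a sketch, not the exact
# payload Claude Code sends).
BODY='{"model":"qwen2.5-coder","max_tokens":256,"messages":[{"role":"user","content":"Write a hello world in Go"}]}'

# Live call against a local Ollama server (assumed Anthropic-compatible path):
#   curl http://localhost:11434/v1/messages \
#     -H "content-type: application/json" \
#     -H "x-api-key: ollama" \
#     -H "anthropic-version: 2023-06-01" \
#     -d "$BODY"
echo "$BODY"
```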
That removes two big blockers at once:
- privacy concerns
- usage costs
The setup is surprisingly quick (under 5 minutes):
How to get started:
1. Install Ollama
2. Pull a model:
ollama pull qwen2.5-coder
3. Install Claude Code using Anthropic's official script
4. Point Claude Code to your local Ollama server:
export ANTHROPIC_AUTH_TOKEN=ollama
export ANTHROPIC_BASE_URL=http://localhost:11434
5. Run:
claude --model qwen2.5-coder
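Put together, the steps above fit in a tiny shell function (a sketch; the name claude_local and the default model are my own choices, not part of either tool):

```shell
# claude_local: hypothetical wrapper that runs Claude Code against a local
# Ollama server. Ollama ignores the auth token's value, but the variable
# must be set for Claude Code to skip its own login.
claude_local() {
  ANTHROPIC_AUTH_TOKEN=ollama \
  ANTHROPIC_BASE_URL=http://localhost:11434 \
  claude --model "${1:-qwen2.5-coder}"
}

# Usage: claude_local             # default model
#        claude_local llama3.1    # any tool-capable model you have pulled
```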
You're not limited to this model: any Ollama model with tool support can be used. It also works with Ollama Cloud if you don't want to run everything locally.
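To check whether a pulled model advertises tool support before pointing Claude Code at it, a small helper can grep the Capabilities section that recent Ollama versions print (the helper name is hypothetical, and the output format is an assumption about your Ollama version):

```shell
# supports_tools: hypothetical check that a model's `ollama show` output
# lists "tools" in its Capabilities section (recent Ollama versions).
supports_tools() {
  ollama show "$1" 2>/dev/null | grep -qi '^ *tools'
}

# Usage: supports_tools qwen2.5-coder && claude --model qwen2.5-coder
```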
This makes it much easier to explore Claude Code in isolation and test how well it performs with different models: privately, cheaply, and without friction.
Perfect playground for anyone experimenting with agentic workflows.