
🚀 Claude Code is now effectively free to run locally.

31-03-2026 · in development

With Ollama now supporting Anthropic's Messages API, you can use Claude Code with local open-source models: no API keys, no token costs, and no data leaving your machine.
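To see what that compatibility means in practice, here is a sketch of a raw Messages API request you could send to a local Ollama server. The `/v1/messages` path and header names follow Anthropic's published API layout; the exact endpoint Ollama exposes is an assumption, so check the Ollama release notes for your version.

```shell
# Build an Anthropic-style Messages API payload (no server needed for this part).
payload='{
  "model": "qwen2.5-coder",
  "max_tokens": 256,
  "messages": [{"role": "user", "content": "Write a hello-world in Go."}]
}'
echo "$payload"

# With Ollama running on its default port, you could send it like this:
# curl http://localhost:11434/v1/messages \
#   -H "content-type: application/json" \
#   -H "x-api-key: ollama" \
#   -d "$payload"
```

Claude Code speaks exactly this protocol, which is why pointing it at Ollama works without any adapter in between.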

That removes two big blockers at once:
👉 privacy concerns
👉 usage costs

The setup is surprisingly quick (under 5 minutes):

๐‡๐จ๐ฐ ๐ญ๐จ ๐ ๐ž๐ญ ๐ฌ๐ญ๐š๐ซ๐ญ๐ž๐:

1. Install Ollama
2. Pull a model:
ollama pull qwen2.5-coder
3. Install Claude Code using Anthropic's official script
4. Point Claude Code to your local Ollama server:
export ANTHROPIC_AUTH_TOKEN=ollama 
export ANTHROPIC_BASE_URL=http://localhost:11434 
5. Run:
claude --model qwen2.5-coder
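Steps 4 and 5 can be wrapped in a quick sanity check before launching. The `/v1/messages` path shown in the echo is an assumption based on Anthropic's API layout; the auth token just needs to be non-empty, since the local server doesn't validate it.

```shell
# Wire up the environment Claude Code reads, then confirm where requests will go.
export ANTHROPIC_AUTH_TOKEN=ollama               # any non-empty value works locally
export ANTHROPIC_BASE_URL=http://localhost:11434
echo "Requests will go to: ${ANTHROPIC_BASE_URL}/v1/messages"
```

Put the two exports in your shell profile if you want `claude` to default to the local server in every session.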

You're not limited to this model: any Ollama model with tool support can be used. It also works with Ollama Cloud if you don't want to run everything locally.
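Switching to Ollama Cloud is just a change of environment variables. The `https://ollama.com` base URL and the API-key placeholder below are assumptions; check your Ollama Cloud account settings for the exact endpoint and key.

```shell
# Point Claude Code at Ollama Cloud instead of a local server (sketch).
export ANTHROPIC_BASE_URL=https://ollama.com
export ANTHROPIC_AUTH_TOKEN=your_ollama_api_key   # hypothetical placeholder value
```

The rest of the workflow is unchanged: launch with `claude --model <cloud-model-name>` as before.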
This makes it much easier to explore Claude Code in isolation and test how well it performs with different models: privately, cheaply, and without friction.
A perfect playground for anyone experimenting with agentic workflows.