Run Claude Code with Ollama (Local, Cloud, or Any Model)
This guide shows how to run Claude Code using Ollama, allowing you to use local models, cloud models, or any Ollama-supported model directly from your terminal.

Prerequisites

Make sure the following tools are installed: Ollama and Claude Code. If either is missing, the sections below cover installation.

Install Ollama

If Ollama is not installed, you can install it using the commands below. You can also follow this guide: https://dev.to/sushan/how-to-connect-a-local-ai-model-to-vs-code-1g8d

Windows (PowerShell)

macOS / Linux

Install Claude Code

Windows (PowerShell)

macOS / Linux

Running Claude Code with Ollama

Once both tools are installed, you can start Claude Code through Ollama. The commands work the same on Windows, macOS, and Linux.

Option 1: Launch and Select a Model

This opens a model selection menu where you can choose a model using the arrow keys.

Option 2: Launch with a Specific Model

You can also specify the model directly. Replace the model name with any model available in your Ollama environment.

Grant Folder Access

When Claude Code starts, it asks for permission to access the current folder. Select Yes to allow Claude to read and modify files in the directory. Claude Code will then connect to the selected model, and you can start interacting with your codebase immediately.

Official Documentation

For more details, see the official docs: https://docs.ollama.com/integrations/claude-code
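A typical install command on Windows — the winget package ID `Ollama.Ollama` is an assumption worth verifying; you can also download the installer directly from https://ollama.com/download:

```powershell
# Install Ollama via winget (package ID is an assumption; verify with: winget search ollama)
winget install Ollama.Ollama
```

After installation, confirm it works with `ollama --version`.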
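On Linux, Ollama publishes a one-line install script; on macOS, the app download or Homebrew is common (the Homebrew formula name `ollama` is an assumption worth verifying):

```shell
# Linux: official install script from ollama.com
curl -fsSL https://ollama.com/install.sh | sh

# macOS: Homebrew alternative to the app download
brew install ollama
```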
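If you have Node.js installed, the npm package `@anthropic-ai/claude-code` works on Windows; Anthropic also documents a native PowerShell installer (the URL below is per their docs — treat it as an assumption and verify before piping to `iex`):

```powershell
# Via npm (requires Node.js)
npm install -g @anthropic-ai/claude-code

# Or the native installer described in Anthropic's docs
irm https://claude.ai/install.ps1 | iex
```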
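On macOS and Linux, the same npm package applies; Anthropic's docs also describe a shell installer (the URL is an assumption — verify before piping to `bash`):

```shell
# Via npm (requires Node.js)
npm install -g @anthropic-ai/claude-code

# Or the native installer described in Anthropic's docs
curl -fsSL https://claude.ai/install.sh | bash
```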
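Based on the Ollama integration docs linked at the end of this guide, launching is a single command (the `ollama launch` subcommand is an assumption and requires a recent Ollama version):

```shell
# Opens an interactive picker listing the models available in your Ollama environment
ollama launch claude
```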
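To skip the menu, pass the model explicitly — the `--model` flag and the model name below are assumptions/examples; substitute any model shown by `ollama ls`:

```shell
# Example model name; replace with any model available in your Ollama environment
ollama launch claude --model gpt-oss:20b
```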