Tools: Using Ollama with VS Code for Local AI-Assisted Development

Source: Dev.to

If you want an AI coding assistant without sending your code to the cloud, Ollama makes it easy to run an LLM locally, and it integrates with Visual Studio Code among other IDEs.

## 1. Install Ollama & Run A Model

Download and install Ollama from the official site. After installing, verify it works from your terminal:

```shell
ollama --version
```

Then select a model to run. You will need to choose a model, version, and parameter count based on your hardware (RAM/CPU/GPU specs). You may also need to limit the context length for the best performance. One of the models I tested for coding tasks was qwen3-coder:7b:

```shell
ollama run qwen3-coder
```

The model will download automatically the first time you run it. You can also select and test the model via the Ollama GUI.

## 2. Install A VS Code Extension

To use Ollama inside VS Code, install an extension that supports it; a popular option is Continue:

- Open VS Code
- Go to Extensions
- Search for Continue
- Click Install

## 3. Configure VS Code Extension to Use Ollama

Open the Continue config file, located in your profile at ~/.continue/config.json, and add your local Ollama model(s):

```json
{
  "models": [
    {
      "title": "My Llama Model",
      "provider": "ollama",
      "model": "qwen3-coder:7b"
    }
  ]
}
```

Restart VS Code after saving.

## 4. Start Using Local AI!

You can now ask questions or give instructions directly in your IDE, such as:

- "Explain this codebase"
- "Add the following feature [..]"
- "Write unit tests for the file @UserService.cs"

All requests are handled locally through Ollama, using the model you previously set up.

## Conclusion

Using Ollama with VS Code gives you a private, offline AI coding assistant that can act as an alternative to ChatGPT, Claude, and other popular AI tooling: useful if an AI provider has downtime, you run out of credits, or you simply want a privacy-focused, offline assistant. Ollama is simple to set up, free to run, and works with many open-source models.
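One way to limit the context length, as suggested in step 1, is to derive a custom model via an Ollama Modelfile. A minimal sketch, assuming the qwen3-coder model has already been pulled; the tag name `qwen3-coder-8k` and the 8192 value are arbitrary examples here, so tune them to your hardware:

```
# Modelfile: derive a variant of qwen3-coder with a smaller context window
FROM qwen3-coder
PARAMETER num_ctx 8192
```

Build it with `ollama create qwen3-coder-8k -f Modelfile`, then reference `qwen3-coder-8k` as the `"model"` value in your Continue config instead.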
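Under the hood, extensions like Continue talk to Ollama over its local HTTP API, which listens on localhost:11434 by default. The sketch below shows what such a request looks like, assuming that default port and the qwen3-coder:7b model name used above; the helper names `build_generate_request` and `generate` are my own, not part of any library:

```python
import json
import urllib.request

# Ollama's default local generation endpoint
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for a single, non-streaming generation."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """POST the payload to the local Ollama server and return the response text."""
    payload = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Requires a running Ollama server with the model pulled
    print(generate("qwen3-coder:7b", "Explain: def add(a, b): return a + b"))
```

Because everything stays on localhost, no code or prompt ever leaves your machine.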