Tools: Connecting Claude Code to local models through Ollama
Source: Dev.to
Ollama v0.14.0 and later provide compatibility with the Anthropic API. Code that originally talked to the Anthropic API can be switched directly to local or cloud models on Ollama, and the same goes for Claude Code.

To connect Claude Code to Ollama, only two environment variables need to be set. If the Ollama server runs on the same machine, either "localhost" or "127.0.0.1" works as the server address. If it runs on another machine, follow the approach in this article on using ollama remotely so that the server accepts remote connections. Taking my own environment as an example, the Ollama server sits on the LAN at 192.168.0.150, with the models listed below installed.

Besides local models, you can also use models on Ollama's cloud: first pull the metadata of the cloud model you need, then sign the Ollama server in to your account so it can access cloud models. The commands for each step follow.

For bash/zsh:
export ANTHROPIC_AUTH_TOKEN=ollama
export ANTHROPIC_BASE_URL=http://<your Ollama server address>:11434
The same in PowerShell:
$env:ANTHROPIC_AUTH_TOKEN="ollama"
$env:ANTHROPIC_BASE_URL="http://<your Ollama server address>:11434"
The models installed on my server:
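Because the compatibility sits at the API level, any Anthropic-API client can be pointed at the same server, not just Claude Code. Below is a minimal offline sketch of the request a client ends up sending; the /v1/messages path and headers follow the standard Anthropic Messages API, the server address and model are the ones from my example environment, and nothing is actually sent:

```python
# Mirrors of the two environment variables above (example LAN server).
base_url = "http://192.168.0.150:11434"
auth_token = "ollama"

# Clients append the standard Anthropic Messages path to the base URL,
# so Ollama serves the compatible endpoint at <base>/v1/messages.
url = f"{base_url}/v1/messages"

headers = {
    # ANTHROPIC_AUTH_TOKEN is sent as a bearer token.
    "Authorization": f"Bearer {auth_token}",
    "anthropic-version": "2023-06-01",
    "content-type": "application/json",
}

body = {
    "model": "gpt-oss:120b",  # any model name from `ollama list`
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "你好"}],
}

# A real call would POST the JSON-encoded body to `url` with these
# headers; it is omitted here so the sketch runs without a server.
print(url)
```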
➜ ollama list
NAME                 ID              SIZE     MODIFIED
gemma3:27b           a418f5838eaf    17 GB    3 weeks ago
gemma3:12b-it-qat    5d4fa005e7bb    8.9 GB   2 months ago
gpt-oss:120b         a951a23b46a1    65 GB    2 months ago
With the variables set, launch Claude Code and pick a model with --model:
➜ claude --model gpt-oss:120b
> 你好
● 你好!有什么我可以帮助你的吗?
✻ Crunched for 56s
> 你現在使用哪一種模型?
● 我目前使用的是 gpt-oss:120b 模型。
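If you don't want to export the variables in every shell, Claude Code can also read them from its settings file. A sketch assuming the documented env map in .claude/settings.json; adding ANTHROPIC_MODEL as well pins the default model, so a plain claude launch lands on the Ollama model without --model (values are from the example environment):

```json
{
  "env": {
    "ANTHROPIC_AUTH_TOKEN": "ollama",
    "ANTHROPIC_BASE_URL": "http://192.168.0.150:11434",
    "ANTHROPIC_MODEL": "gpt-oss:120b"
  }
}
```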
Inside the session, the /model selector also lists the custom model alongside the built-in ones:

Select model
Switch between Claude models. Applies to this session and future Claude Code sessions. For other/previous model names, specify with --model.
  1. Default (recommended)   Use the default model (currently Sonnet 4.5) · $3/$15 per Mtok
  2. Opus                    Opus 4.5 · Most capable for complex work · $5/$25 per Mtok
  3. Haiku                   Haiku 4.5 · Fastest for quick answers · $1/$5 per Mtok
> 4. gpt-oss:120b            √ Custom model
Enter to confirm · escape to exit
To use a cloud model, first pull its metadata:
➜ ollama pull minimax-m2.1:cloud
pulling manifest
pulling 692f7c2439cc: 100% ▕█████████████████████████████████████████████████████████▏ 378 B
verifying sha256 digest
writing manifest
success
Then sign the Ollama server in to your account:
➜ ollama signin
You need to be signed in to Ollama to run Cloud models.
To sign in, navigate to:
https://ollama.com/connect?name=ubuntu&key=c3NoLWVk...3NlYno
After signing in, launch Claude Code with the cloud model:
➜ claude --model minimax-m2.1:cloud
> 你好
✽ Beboppin'… (ctrl+c to interrupt)
● 你好!有什麼我可以幫你的嗎?
> 你使用哪一種模型?
● 我使用的是 minimax-m2.1:cloud 模型
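Note that nothing changes on the Claude Code side between local and cloud models: the same two variables apply, and only the model name differs. A quick offline recap of the values used in this post:

```shell
export ANTHROPIC_AUTH_TOKEN=ollama
export ANTHROPIC_BASE_URL=http://192.168.0.150:11434

# Local and cloud launches differ only in the model name:
echo "claude --model gpt-oss:120b       (local, via $ANTHROPIC_BASE_URL)"
echo "claude --model minimax-m2.1:cloud (cloud, via $ANTHROPIC_BASE_URL)"
```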