# Tools: How to Use ollama Remotely

2026-01-19 · admin
## Accepting remote connections

ollama can be used not only locally but also from remote machines. By default, however, ollama only accepts connections from the local machine and cannot be reached remotely. To make ollama accept remote connections, you must set the OLLAMA_HOST environment variable to 0.0.0.0 before starting the ollama service; otherwise it will only accept local connections. You can set it to just 0.0.0.0 to keep the default port 11434, or include a port as well, such as 0.0.0.0:11434. Changing the port is not recommended, to maintain maximum compatibility.

## Windows environment

After installing ollama, you can check which IPs ollama is listening on with:

```powershell
netstat -ano | findstr ":11434"
```

The output shows it is currently listening only on the loopback IP:

```
TCP    127.0.0.1:3923     127.0.0.1:11434    TIME_WAIT    0
TCP    127.0.0.1:11434    0.0.0.0:0          LISTENING    21544
```

You can also confirm this by testing a connection. First, get the machine's IP addresses:

```powershell
Get-NetIPAddress | Select-Object IPAddress, InterfaceAlias, AddressFamily | Where-Object {$_.AddressFamily -eq 'IPv4'}
```

```
IPAddress       InterfaceAlias                       AddressFamily
---------       --------------                       -------------
172.19.144.1    vEthernet (WSL (Hyper-V firewall))   IPv4
169.254.130.47  區域連線* 3                          IPv4
169.254.236.94  區域連線* 2                          IPv4
169.254.221.5   藍牙網路連線                         IPv4
192.168.0.71    Wi-Fi                                IPv4
127.0.0.1       Loopback Pseudo-Interface 1          IPv4
```

Connecting to the loopback address works:

```powershell
curl http://127.0.0.1:11434/
```

```
Ollama is running
```

But connecting to the machine's own external IP fails:

```powershell
curl http://192.168.0.71:11434/
```

```
curl: (7) Failed to connect to 192.168.0.71 port 11434 after 2032 ms: Could not connect to server
```

To let ollama accept connections beyond the loopback interface, add OLLAMA_HOST to the user or system environment variables; you can verify the value in PowerShell:

```powershell
$env:OLLAMA_HOST
0.0.0.0:11434
```

ollama is now listening on all IPs:

```powershell
netstat -ano | findstr ":11434"
```

```
TCP    0.0.0.0:11434    0.0.0.0:0    LISTENING    11832
TCP    [::]:11434       [::]:0       LISTENING    11832
```

And testing with curl succeeds:

```powershell
curl http://192.168.0.71:11434/
Ollama is running
```
## Starting manually on Linux/macOS

If you have not installed ollama as an auto-starting service, you only need to export the environment variable in your own .bashrc, .zshrc, or .profile. First, get the machine's IP address:

```shell
ip addr show eth0 | grep inet
```

The machine's IP is 172.19.149.141:

```
inet 172.19.149.141/20 brd 172.19.159.255 scope global eth0
inet6 fe80::215:5dff:fe40:a05c/64 scope link
```

As before, the local address works but the external IP does not:

```shell
curl http://localhost:11434/
Ollama is running
curl http://172.19.149.141:11434/
curl: (7) Failed to connect to 172.19.149.141 port 11434 after 0 ms: Couldn't connect to server
```

If you only need to accept non-local connections temporarily, you can also set the environment variable directly, for example:

```shell
pkill ollama
```

```shell
OLLAMA_HOST=0.0.0.0 ollama serve &
```

Now both local and non-local IPs connect normally:

```shell
curl http://localhost:11434/
Ollama is running
```

```shell
curl http://172.19.149.141:11434/
Ollama is running
```

If you want ollama to accept non-local connections every time it starts, add this to your shell configuration file (.bashrc, .profile, etc.) for the same effect:

```shell
export OLLAMA_HOST=0.0.0.0
```

After restarting ollama, I tested it from a Windows machine:

```powershell
(Invoke-WebRequest -Uri "http://172.19.149.141:11434/").Content
Ollama is running
```
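The reason a server bound to 127.0.0.1 rejects the machine's external IP is that loopback traffic never leaves the host, while 0.0.0.0 means "bind every interface". A short stdlib Python illustration of that distinction (a sketch, not part of ollama):

```python
# Sketch: which bind addresses can receive traffic from other machines.
import ipaddress

def reachable_from_lan(bind_ip: str) -> bool:
    """True if a socket bound to bind_ip can receive non-local traffic."""
    if bind_ip == "0.0.0.0":
        return True                      # wildcard: every interface
    return not ipaddress.ip_address(bind_ip).is_loopback

print(reachable_from_lan("127.0.0.1"))       # False: loopback only
print(reachable_from_lan("0.0.0.0"))         # True: all interfaces
print(reachable_from_lan("172.19.149.141"))  # True: a real interface
```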
## Services started with systemctl on Linux

If the service is started with systemctl, you must set the environment variable in the corresponding service file. You can find the ollama.service file under /etc/systemd/system or /usr/lib/systemd/system:

```shell
cat /etc/systemd/system/ollama.service
```

Below is my own example. In the [Service] section, find the Environment entry and add environment variables in "key=value" format. Here I added "OLLAMA_HOST=0.0.0.0:11434" so that ollama listens on all IPs:

```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=<the installing user's PATH setting>" "OLLAMA_HOST=0.0.0.0:11434"

[Install]
WantedBy=default.target
```

You can also split the Environment entry across multiple lines:

```ini
Environment="PATH=<the installing user's PATH setting>"
Environment="OLLAMA_HOST=0.0.0.0:11434"
```

Note: On Linux, reinstalling is the only way to update Ollama, but reinstalling overwrites the service file above. Remember to reapply these changes after updating, or remote access to Ollama will stop working.

After saving your changes, the service must be restarted for them to take effect. Before restarting:

```shell
sudo apt install net-tools
sudo netstat -anp | grep ollama
```

You will see it is listening only on the 127.0.0.1 address:

```
tcp    0    0 127.0.0.1:11434    0.0.0.0:*    LISTEN       120/ollama
unix   3    [ ]                  STREAM       CONNECTED    1342 120/ollama
```

Restart the service:

```shell
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

Then check again:

```shell
sudo netstat -anp | grep ollama
```

The output now shows :::11434, no longer restricted to the loopback IP:

```
tcp6    0    0 :::11434    :::*    LISTEN    1092/ollama
```

Now, besides connecting to ollama via localhost or 127.0.0.1, you can also use the machine's external IP address:

```shell
ip addr show eth0 | grep inet
```

```
inet 172.19.149.141/20 brd 172.19.159.255 scope global eth0
inet6 fe80::215:5dff:fead:ec41/64 scope link
```

```shell
curl http://172.19.149.141:11434/
```

```
Ollama is running
```

You can also test from Windows PowerShell:

```powershell
Invoke-WebRequest -Uri "http://172.19.149.141:11434/" | Select-Object -ExpandProperty Content
```

```
Ollama is running
```

Note: Ollama's user interface now has an "Expose OLLAMA to the network" option to toggle external access, so editing the service file is no longer required.
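To sanity-check a unit file after editing it, the Environment= lines can be parsed mechanically. A small sketch (a hypothetical helper, not part of systemd or ollama) that handles both the single-line and multi-line forms shown above:

```python
# Sketch: collect key=value pairs from every Environment= line of a
# systemd unit file and confirm OLLAMA_HOST is set.
import re

def environment_vars(unit_text: str) -> dict[str, str]:
    """Collect assignments from all Environment= lines of a unit file."""
    env = {}
    for line in unit_text.splitlines():
        line = line.strip()
        if line.startswith("Environment="):
            # Each quoted "KEY=value" token is one assignment.
            for token in re.findall(r'"([^"]*)"', line):
                key, _, value = token.partition("=")
                env[key] = value
    return env

unit = '''
[Service]
ExecStart=/usr/local/bin/ollama serve
Environment="PATH=/usr/local/bin" "OLLAMA_HOST=0.0.0.0:11434"
'''
print(environment_vars(unit).get("OLLAMA_HOST"))  # 0.0.0.0:11434
```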
The Mac is a special case. Environment variables set in .zshrc and other shell configuration files only affect the shell, but on the Mac, ollama is started after user login through the account's Login Items & Extensions settings, which launch the ollama desktop app (the small icon you see in the menu bar); the desktop app then runs the `ollama serve` command configured in its resource bundle to start the ollama service. Environment variables set in shell configuration files therefore have no effect at all.

To make the ollama service honor the environment variable and accept non-local connections, we must switch to starting it through the launchd mechanism. First, create a com.ollama.serve.plist service description file in ~/Library/LaunchAgents/:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.ollama.serve</string>
    <key>ProgramArguments</key>
    <array>
        <string>/Applications/Ollama.app/Contents/Resources/ollama</string>
        <string>serve</string>
    </array>
    <key>EnvironmentVariables</key>
    <dict>
        <key>OLLAMA_HOST</key>
        <string>0.0.0.0:11434</string>
    </dict>
    <key>RunAtLoad</key>
    <true/>
    <key>KeepAlive</key>
    <true/>
</dict>
</plist>
```

- Label key: the name of this service.
- ProgramArguments key: the command this service runs.
- EnvironmentVariables key: environment variables provided to the service when it runs.

This service starts when this account's user logs in. For it to take effect, first shut down the already-running ollama service:

```shell
killall ollama
```

Then load the configuration you just wrote with the launchctl command:

```shell
launchctl load ~/Library/LaunchAgents/com.ollama.serve.plist
```

Confirm that it works:

```shell
ipconfig getifaddr en0
192.168.0.37
```

```shell
curl http://192.168.0.37:11434
Ollama is running
```

There is still one crucial step: remove the setting that makes the ollama desktop app run automatically after login and start the ollama service, otherwise launchd will fail to start the service because an ollama instance is already listening on the loopback address. After removing the ollama login item, rebooting and logging back into this account will automatically start the service you just configured.

If you later modify the service description file, first unload the previously loaded configuration:

```shell
launchctl unload ~/Library/LaunchAgents/com.ollama.serve.plist
```
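Rather than writing the plist XML by hand, it can be generated with the stdlib's plistlib. A sketch (a hypothetical helper, not part of ollama):

```python
# Sketch: generate the launchd service description with plistlib.
import plistlib

service = {
    "Label": "com.ollama.serve",
    "ProgramArguments": [
        "/Applications/Ollama.app/Contents/Resources/ollama",
        "serve",
    ],
    "EnvironmentVariables": {"OLLAMA_HOST": "0.0.0.0:11434"},
    "RunAtLoad": True,
    "KeepAlive": True,
}
xml = plistlib.dumps(service, sort_keys=False).decode()
print(xml)  # save as ~/Library/LaunchAgents/com.ollama.serve.plist
```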
## Connecting to a remote ollama service with the ollama command

Just set the OLLAMA_HOST environment variable to the address of the remote ollama service and you can connect to it. For example, without OLLAMA_HOST set, these are the models installed in the local ollama:

```shell
ollama list
```

```
NAME                                                  ID            SIZE    MODIFIED
weitsung50110/llama-3-taiwan:8b-instruct-dpo-q4_K_M   7fcf06fa5eaa  4.9 GB  5 days ago
weitsung50110/multilingual-e5-large-instruct:f16      ad6d1b04ec00  1.1 GB  2 weeks ago
gemma3:latest                                         a2af6cc3eb7f  3.3 GB  2 weeks ago
```

We can set the environment variable first (shown here on Linux) and rerun the same command:

```shell
OLLAMA_HOST=192.168.0.37 ollama list
```

What we see now are the models installed in the ollama at the 192.168.0.37 address:

```
NAME                                                  ID            SIZE    MODIFIED
gemma3:4b                                             a2af6cc3eb7f  3.3 GB  38 minutes ago
weitsung50110/multilingual-e5-large-instruct:f16      ad6d1b04ec00  1.1 GB  3 hours ago
weitsung50110/llama-3-taiwan:8b-instruct-dpo-q4_K_M   7fcf06fa5eaa  4.9 GB  3 hours ago
gemma3:12b                                            f4031aab637d  8.1 GB  24 hours ago
gemma3:1b                                             8648f39daa8f  815 MB  24 hours ago
Darrrrr/mymodel:latest                                40a13e3b82e2  815 MB  12 days ago
deepseek-r1:14b                                       ea35dfe18182  9.0 GB  12 days ago
qwq:latest                                            009cb3f08d74  19 GB   2 weeks ago
gemma3:27b                                            a418f5838eaf  17 GB   2 weeks ago
deepseek-r1:70b                                       0c1615a8ca32  42 GB   2 weeks ago
deepseek-r1:32b                                       38056bbcbb2d  19 GB   2 weeks ago
```

This lets you flexibly choose which machine actually runs the ollama models; other machines can use the same set of models over a remote connection. If needed, you can also set the environment variable in your system or shell configuration file so you don't have to set it before every command.

## Using the API to connect to a remote ollama

To use ollama remotely through the API, just specify the remote IP in the API address parameter. In this example, the remote ollama service is at 192.168.0.37:

```shell
curl http://192.168.0.37:11434/api/chat -d '{
  "model": "gemma3:4b",
  "messages": [
    {
      "role": "user",
      "content": "你好"
    }
  ],
  "stream": false
}'
```

```
{"model":"gemma3:4b","created_at":"2025-04-30T09:48:53.946791Z","message":{"role":"assistant","content":"你好!很高兴和你聊天。有什么我可以帮助你的吗?\n"},"done_reason":"stop","done":true,"total_duration":361840209,"load_duration":57624667,"prompt_eval_count":10,"prompt_eval_duration":79502667,"eval_count":15,"eval_duration":224293500}
```
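The `ollama list` output shown earlier comes from the same HTTP API, via the `/api/tags` endpoint. A minimal sketch of querying it yourself; the helper names are mine, and the remote address is the one assumed in this article's examples:

```python
# Sketch: list models on a remote ollama the way `ollama list` does,
# by querying the /api/tags endpoint over HTTP.
import json
import os
import urllib.request

def tags_url(host: str) -> str:
    """URL of the endpoint that `ollama list` queries on a given host."""
    return f"http://{host}/api/tags"

def list_models(host: str) -> list[str]:
    with urllib.request.urlopen(tags_url(host)) as resp:
        return [m["name"] for m in json.load(resp)["models"]]

if __name__ == "__main__":
    # Points at the remote server from the examples; adjust to your network.
    print(list_models(os.environ.get("OLLAMA_HOST", "192.168.0.37:11434")))
```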
With the ollama Python package, specify the host parameter instead:

```python
import asyncio
from ollama import AsyncClient

async def main():
    client = AsyncClient(host='http://192.168.0.37:11434')
    response = await client.chat(
        model='gemma3:4b',
        messages=[{'role': 'user', 'content': '你是誰?'}]
    )
    print('Response:', response['message']['content'])

asyncio.run(main())
```

To use the OpenAI API, pass the remote ollama server's address together with the port (default 11434) to the base_url parameter:

```python
from openai import OpenAI

# Configure the remote Ollama API
OLLAMA_API_URL = "http://192.168.0.37:11434/v1"  # remote Ollama server
# OLLAMA_API_URL = "http://127.0.0.1:11434/v1"   # local Ollama server
MODEL_NAME = "gemma3:12b"

# Initialize the OpenAI client, pointed at Ollama's API
client = OpenAI(
    base_url=OLLAMA_API_URL,
    api_key="ollama",  # Ollama ignores the key, but the SDK requires one
)

# Send a chat completion request
response = client.chat.completions.create(
    model=MODEL_NAME,
    messages=[
        {
            "role": "user",
            "content": "你好"
        }
    ],
)

# Extract the response content
print(response.choices[0].message.content)
```
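The duration fields in ollama's API responses are in nanoseconds, which makes it easy to derive generation speed. A short sketch using the values from the sample /api/chat response above:

```python
# Sketch: derive tokens/second from ollama's nanosecond duration fields.

def tokens_per_second(eval_count: int, eval_duration_ns: int) -> float:
    """Generation speed from an ollama API response's eval fields."""
    return eval_count / (eval_duration_ns / 1_000_000_000)

# eval_count and eval_duration from the sample response in this article:
print(round(tokens_per_second(15, 224293500), 1))  # 66.9 tokens/s
```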
Tags: tools, utilities, security tools, ollama, windows, systemctl, ollama_host