
【Complete tutorial: deploying a local large language model with Ollama on Linux/Windows/macOS, with an OpenAI-compatible API and a llama3.2 example】
# Install the required packages
# Linux install
$ curl -fsSL https://ollama.com/install.sh | sh
# Windows: install the client
https://ollama.com/download/OllamaSetup.exe
# macOS: install the client
https://ollama.com/download/Ollama-darwin.zip
# Install the ollama Python package
$ pip install ollama
## If you run into network problems
If the download is slow or fails, you can change the download source.
Open ollama_install.sh and find the following two download URLs:
https://ollama.com/download/ollama-linux-${ARCH}${VER_PARAM}
https://ollama.com/download/ollama-linux-amd64-rocm.tgz${VER_PARAM}
We want to replace these two URLs with GitHub download links. Downloading directly from GitHub may still be slow, so a GitHub file-acceleration mirror is recommended.
Use the following script to change the download source:
#!/bin/bash
# Path to the install script
FILE="ollama_install.sh"
# Rewrite the download URLs
sed -i 's|https://ollama.com/download/ollama-linux-${ARCH}${VER_PARAM}|https://github.moeyy.xyz/https://github.com/ollama/ollama/releases/download/v0.3.4/ollama-linux-amd64|g' "$FILE"
sed -i 's|https://ollama.com/download/ollama-linux-amd64-rocm.tgz${VER_PARAM}|https://github.moeyy.xyz/https://github.com/ollama/ollama/releases/download/v0.3.4/ollama-linux-amd64-rocm.tgz|g' "$FILE"
After the change, the two URLs become:
https://github.moeyy.xyz/https://github.com/ollama/ollama/releases/download/v0.3.4/ollama-linux-amd64
https://github.moeyy.xyz/https://github.com/ollama/ollama/releases/download/v0.3.4/ollama-linux-amd64-rocm.tgz
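The two sed lines above only swap the host prefix. The same rewrite can be sketched in Python if you prefer to verify it programmatically; the mirror host and the pinned v0.3.4 release are the values used in this tutorial, not canonical Ollama URLs:

```python
# Sketch of the URL rewrite performed by the sed commands above.
# The mirror prefix and pinned v0.3.4 release come from this tutorial,
# not from Ollama itself -- adjust the version to the release you want.
MIRROR = "https://github.moeyy.xyz/https://github.com/ollama/ollama/releases/download/v0.3.4/"

REWRITES = {
    "https://ollama.com/download/ollama-linux-${ARCH}${VER_PARAM}":
        MIRROR + "ollama-linux-amd64",
    "https://ollama.com/download/ollama-linux-amd64-rocm.tgz${VER_PARAM}":
        MIRROR + "ollama-linux-amd64-rocm.tgz",
}

def rewrite_install_script(text: str) -> str:
    """Replace the official download URLs with the mirror URLs."""
    for old, new in REWRITES.items():
        text = text.replace(old, new)
    return text
```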
# Start the Ollama server
$ ollama serve
# Start the llama3.2 model (in a new terminal)
Model library 1: Ollama library
https://ollama.com/library
Model library 2: ModelScope models
https://www.modelscope.cn/models
# Enable network acceleration on AutoDL (skip this on other platforms)
$ source /etc/network_turbo
$ ollama run llama3.2
If you only want to download the model without starting it:
# Pull the model
$ ollama pull llama3.2
Once the model is running, you can chat with it directly in the terminal.

# Chat via the Python API
import ollama

response = ollama.chat(model='llama3.2', messages=[
    {
        'role': 'user',
        'content': 'Why is the sky blue?',
    },
])
print(response['message']['content'])

# Chat via the OpenAI-compatible API
from openai import OpenAI

client = OpenAI(
    base_url='http://localhost:11434/v1',
    api_key='ollama',  # required, but unused
)
response = client.chat.completions.create(
    model="llama3.2",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Who won the world series in 2020?"},
        {"role": "assistant", "content": "The LA Dodgers won in 2020."},
        {"role": "user", "content": "Where was it played?"}
    ]
)
print(response.choices[0].message.content)
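Multi-turn chat is just a matter of appending each reply back onto the messages list before the next request. A minimal sketch of that loop, with the transport factored out as a `send` callable (`chat_turn` and `send` are hypothetical helper names, not part of the OpenAI client; in practice `send` would wrap `client.chat.completions.create` from the snippet above):

```python
def chat_turn(history, user_text, send):
    """Append the user message, get a reply via `send`, and record it.

    `send` is any callable mapping a message list to a reply string, e.g.
    lambda msgs: client.chat.completions.create(
        model="llama3.2", messages=msgs).choices[0].message.content
    """
    history.append({"role": "user", "content": user_text})
    reply = send(history)
    history.append({"role": "assistant", "content": reply})
    return reply

# Demo with a stub transport (no server needed):
history = [{"role": "system", "content": "You are a helpful assistant."}]
echo = lambda msgs: "you said: " + msgs[-1]["content"]
print(chat_turn(history, "hello", echo))  # → you said: hello
```

Because `send` is injected, the same loop works unchanged against the `ollama` package or the OpenAI-compatible endpoint.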

# Streaming API via curl
curl -X POST http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?"
}'
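The /api/generate endpoint streams newline-delimited JSON objects, each carrying a `response` text fragment and a `done` flag. A small sketch of stitching such a stream back together (the sample lines below are illustrative, not captured server output):

```python
import json

def collect_stream(lines):
    """Concatenate the 'response' fragments from Ollama-style NDJSON lines."""
    parts = []
    for line in lines:
        obj = json.loads(line)
        parts.append(obj.get("response", ""))
        if obj.get("done"):  # final object in the stream
            break
    return "".join(parts)

# Illustrative sample of streamed lines:
sample = [
    '{"response": "The sky ", "done": false}',
    '{"response": "is blue.", "done": true}',
]
print(collect_stream(sample))  # → The sky is blue.
```

The same parsing applies when reading the HTTP response body line by line instead of from a list.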

# References
llama3.2 · Ollama library
https://ollama.com/library/llama3.2
OpenAI compatibility · Ollama Blog
https://ollama.com/blog/openai-compatibility