Official website: https://ollama.com/
GitHub: https://github.com/ollama/ollama
Configure the apt repository
curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey | sudo gpg --dearmor -o /usr/share/keyrings/nvidia-container-toolkit-keyring.gpg \
  && curl -s -L https://nvidia.github.io/libnvidia-container/stable/deb/nvidia-container-toolkit.list | \
    sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-container-toolkit-keyring.gpg] https://#g' | \
    sudo tee /etc/apt/sources.list.d/nvidia-container-toolkit.list
Update the package index
sudo apt-get update
Install the toolkit
sudo apt-get install -y nvidia-container-toolkit
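With the toolkit installed, Docker still has to be told to use the NVIDIA runtime before `--gpus` works. A minimal sketch following the NVIDIA Container Toolkit setup flow (assumes a working NVIDIA driver on the host; requires sudo):

```shell
# Register the NVIDIA runtime in Docker's config and restart the daemon
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker

# Sanity check: the toolkit mounts nvidia-smi into the container,
# so this should print the same GPU table as `nvidia-smi` on the host
docker run --rm --gpus all ubuntu nvidia-smi
```

If the last command fails, fix GPU passthrough here before moving on to Ollama.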
How to let external machines (or the Docker host) reach the Ollama API over IP on Linux
1. First, stop the ollama service:
systemctl stop ollama
2. Edit the ollama service file:
/etc/systemd/system/ollama.service
Add the following lines under the [Service] section (note that systemd unit files only allow comments on their own lines, not at the end of a directive):

# expose the GPUs
Environment="CUDA_VISIBLE_DEVICES=0,1"
# listen on all interfaces; adjust the port as needed
Environment="OLLAMA_HOST=0.0.0.0:8080"
3. Reload the systemd daemon
systemctl daemon-reload
4. Start the ollama service
systemctl start ollama
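After the restart you can confirm that ollama is actually listening on all interfaces and answer a request from outside. A quick check, assuming the port 8080 configured above (`<host-ip>` is a placeholder for the server's address):

```shell
# Confirm the listener is bound to 0.0.0.0:8080, not 127.0.0.1
ss -tlnp | grep 8080

# From any machine that can reach the host, list the installed models
curl http://<host-ip>:8080/api/tags
```

If `curl` times out even though the listener is correct, check the host firewall (e.g. ufw or firewalld) for the chosen port.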
Running Ollama with GPU support via Docker (if the GPU stops working after a reboot, try this)
docker run --gpus all -d -v /opt/ai/ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
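Once the container is up, the API is exposed on port 11434 (Ollama's default). A usage sketch; the model name here is only an example and must be pulled before it can answer requests:

```shell
# Pull a model inside the running container
docker exec -it ollama ollama pull llama3

# Generate a completion over the HTTP API
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Hello", "stream": false}'
```

The `-v /opt/ai/ollama:/root/.ollama` mount in the run command above keeps pulled models on the host, so they survive container re-creation.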