From 0534d7254fdeb2beae82a46dbea79b684e58d255 Mon Sep 17 00:00:00 2001
From: logicat <35831253+ca1ic0@users.noreply.github.com>
Date: Wed, 8 Jan 2025 09:56:56 +0800
Subject: [PATCH] Update docker_cpp_xpu_quickstart.md (#12667)

---
 docs/mddocs/DockerGuides/docker_cpp_xpu_quickstart.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/mddocs/DockerGuides/docker_cpp_xpu_quickstart.md b/docs/mddocs/DockerGuides/docker_cpp_xpu_quickstart.md
index e2bcde4e..d57ab55b 100644
--- a/docs/mddocs/DockerGuides/docker_cpp_xpu_quickstart.md
+++ b/docs/mddocs/DockerGuides/docker_cpp_xpu_quickstart.md
@@ -131,7 +131,7 @@ Please refer to this [documentation](../Quickstart/llama_cpp_quickstart.md) for
 Running the ollama on the background, you can see the ollama.log in `/root/ollama/ollama.log`
 ```bash
 cd /llm/scripts/
-# set the recommended Env
+# (optional) set the recommended environment; if this step causes an error, skip it and start ollama directly without setting the env
 source ipex-llm-init --gpu --device $DEVICE
 bash start-ollama.sh # ctrl+c to exit, and the ollama serve will run on the background
 ```