diff --git a/docs/mddocs/Quickstart/ollama_portablze_zip_quickstart.md b/docs/mddocs/Quickstart/ollama_portablze_zip_quickstart.md
index f34c974f..c31abbf9 100644
--- a/docs/mddocs/Quickstart/ollama_portablze_zip_quickstart.md
+++ b/docs/mddocs/Quickstart/ollama_portablze_zip_quickstart.md
@@ -79,4 +79,19 @@ To increase the context length, you could set environment variable `IPEX_LLM_NUM
 - Start Ollama serve through `start-ollama.bat`
 
 > [!TIP]
-> `IPEX_LLM_NUM_CTX` has a higher priority than the `num_ctx` settings in a models' `Modelfile`.
\ No newline at end of file
+> `IPEX_LLM_NUM_CTX` has a higher priority than the `num_ctx` settings in a model's `Modelfile`.
+
+### Additional models supported after Ollama v0.5.4
+
+The current Ollama Portable Zip is based on Ollama v0.5.4; in addition, the following new models are also supported in the Ollama Portable Zip:
+
+ | Model | Download | Model Link |
+ | - | - | - |
+ | DeepSeek-R1 | `ollama run deepseek-r1` | [deepseek-r1](https://ollama.com/library/deepseek-r1) |
+ | Openthinker | `ollama run openthinker` | [openthinker](https://ollama.com/library/openthinker) |
+ | DeepScaleR | `ollama run deepscaler` | [deepscaler](https://ollama.com/library/deepscaler) |
+ | Phi-4 | `ollama run phi4` | [phi4](https://ollama.com/library/phi4) |
+ | Dolphin 3.0 | `ollama run dolphin3` | [dolphin3](https://ollama.com/library/dolphin3) |
+ | Smallthinker | `ollama run smallthinker` | [smallthinker](https://ollama.com/library/smallthinker) |
+ | Granite3.1-Dense | `ollama run granite3-dense` | [granite3.1-dense](https://ollama.com/library/granite3.1-dense) |
+ | Granite3.1-Moe-3B | `ollama run granite3-moe` | [granite3.1-moe](https://ollama.com/library/granite3.1-moe) |
diff --git a/docs/mddocs/Quickstart/ollama_portablze_zip_quickstart.zh-CN.md b/docs/mddocs/Quickstart/ollama_portablze_zip_quickstart.zh-CN.md
index d8d5e49b..dab796f0 100644
--- a/docs/mddocs/Quickstart/ollama_portablze_zip_quickstart.zh-CN.md
+++ b/docs/mddocs/Quickstart/ollama_portablze_zip_quickstart.zh-CN.md
@@ -13,6 +13,7 @@
 - [Step 1: Download and Unzip](#步骤-1下载和解压)
 - [Step 2: Start Ollama Serve](#步骤-2启动-ollama-serve)
 - [Step 3: Run Ollama](#步骤-3运行-ollama)
+- [Tips and Troubleshooting](#提示和故障排除)
 
 ## Prerequisites
 
@@ -46,3 +47,51 @@
+
+### Troubleshooting
+#### 1. Unable to run the initialization script
+If you are unable to run `init-ollama.bat`, make sure you have installed `ipex-llm[cpp]` in your conda environment and that the correct conda environment is activated. On Windows, also make sure you run the script from a terminal with administrator privileges.
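The checks in this troubleshooting entry can be scripted. Below is a minimal pre-flight sketch; the conda environment name `llm-cpp` is an assumption, so substitute whichever environment you installed `ipex-llm[cpp]` into.

```shell
# Pre-flight checks before running init-ollama.bat.
# The env name `llm-cpp` in the comments below is an assumption.
check_cmd() {
  # Report whether a required tool is on PATH.
  command -v "$1" >/dev/null 2>&1 && echo "found: $1" || echo "missing: $1"
}
check_cmd conda
check_cmd pip
# Inside the activated environment, confirm the package is present:
#   conda activate llm-cpp
#   pip show ipex-llm   # prints package metadata if ipex-llm[cpp] is installed
```

If either tool is reported missing, fix your PATH or conda installation before retrying the script.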
diff --git a/docs/mddocs/Quickstart/ollama_quickstart.zh-CN.md b/docs/mddocs/Quickstart/ollama_quickstart.zh-CN.md
index 34b158c5..dd579ec7 100644
--- a/docs/mddocs/Quickstart/ollama_quickstart.zh-CN.md
+++ b/docs/mddocs/Quickstart/ollama_quickstart.zh-CN.md
@@ -12,7 +12,7 @@
> If you are installing on an Intel Arc B-series GPU (e.g. **B580**), please refer to this [guide](./bmg_quickstart.md).
> [!NOTE]
-> The latest version of `ipex-llm[cpp]` is consistent with the official ollama release [v0.5.1](https://github.com/ollama/ollama/releases/tag/v0.5.4).
+> The latest version of `ipex-llm[cpp]` is consistent with the official ollama release [v0.5.4](https://github.com/ollama/ollama/releases/tag/v0.5.4).
>
> `ipex-llm[cpp]==2.2.0b20250123` is consistent with the official ollama release [v0.5.1](https://github.com/ollama/ollama/releases/tag/v0.5.1).
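The version pinning described in the note above can be sketched in shell. The build numbers come from the note; the exact `pip` invocation is an assumption, so follow the install instructions for your setup.

```shell
# Sketch: choose the ipex-llm[cpp] build matching a target ollama release.
# Build numbers are taken from the compatibility note; the pip command shown
# is an assumption, not the authoritative install instruction.
OLLAMA_TARGET="v0.5.1"
if [ "$OLLAMA_TARGET" = "v0.5.1" ]; then
  SPEC="ipex-llm[cpp]==2.2.0b20250123"   # pinned build tracking ollama v0.5.1
else
  SPEC="ipex-llm[cpp]"                   # latest build tracks ollama v0.5.4
fi
echo "pip install --pre \"$SPEC\""       # run inside your conda environment
```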
@@ -28,9 +28,9 @@
> [!NOTE]
-> Starting from `ipex-llm[cpp]==2.2.0b20240912`, the oneAPI dependency of `ipex-llm[cpp]` on Windows has been updated from `2024.0.0` to `2024.2.1`.
+> Starting from `ipex-llm[cpp]==2.2.0b20250207`, the oneAPI dependency of `ipex-llm[cpp]` on Windows has been updated from `2024.2.1` to `2025.0.1`.
>
-> To upgrade `ipex-llm[cpp]` to `2.2.0b20240912` or later on Windows, you need to create a new, clean conda environment to install the new version. If you uninstall the old version and upgrade directly in the old conda environment, you may encounter a `sycl7.dll not found` error.
+> To upgrade `ipex-llm[cpp]` to `2.2.0b20250207` or later on Windows, you need to create a new, clean conda environment to install the new version. If you uninstall the old version and upgrade directly in the old conda environment, you may encounter a `sycl8.dll not found` error.
## Table of Contents
- [Install IPEX-LLM for Ollama](./ollama_quickstart.zh-CN.md#1-安装-ipex-llm-来使用-Ollama)
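The clean-upgrade path from the oneAPI note above can be sketched as follows. The environment name `llm-cpp-new` and the Python version are illustrative assumptions, and the `conda` call is guarded so the sketch degrades to a message when conda is absent.

```shell
# Clean-upgrade sketch for Windows (run in an Anaconda prompt).
# The env name `llm-cpp-new` is illustrative; a fresh env avoids the
# "sycl8.dll not found" error seen when upgrading in place.
ENV_NAME=llm-cpp-new
if command -v conda >/dev/null 2>&1; then
  conda create -n "$ENV_NAME" python=3.11 -y
  # Then, in the new environment:
  #   conda activate llm-cpp-new
  #   pip install --pre --upgrade ipex-llm[cpp]
else
  echo "conda not found on PATH: install Miniconda/Anaconda first"
fi
```

Creating a separate environment rather than upgrading in place keeps the old oneAPI runtime DLLs from shadowing the new ones.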