Update ollama-release doc into v0.6.2 (#13094)

* Update ollama-release doc into v0.6.2

* update

* revert signature changes
SONG Ge 2025-04-30 16:22:42 +08:00 committed by GitHub
parent 51b41faad7
commit 685a749adb
4 changed files with 13 additions and 10 deletions


@@ -28,7 +28,7 @@ This guide demonstrates how to use [Ollama portable zip](https://github.com/ipex
- [Increase context length in Ollama](#increase-context-length-in-ollama)
- [Select specific GPU(s) to run Ollama when multiple ones are available](#select-specific-gpus-to-run-ollama-when-multiple-ones-are-available)
- [Tune performance](#tune-performance)
-- [Additional models supported after Ollama v0.5.4](#additional-models-supported-after-ollama-v054)
+- [Additional models supported after Ollama v0.6.2](#additional-models-supported-after-ollama-v062)
- [Signature Verification](#signature-verification)
- [More details](ollama_quickstart.md)
@@ -74,8 +74,10 @@ Check your GPU driver version, and update it if needed; we recommend following [
### Step 1: Download and Extract
Download the IPEX-LLM Ollama portable tgz for Ubuntu from the [link](https://github.com/ipex-llm/ipex-llm/releases/tag/v2.3.0-nightly).
Then open a terminal and extract the tgz file to a folder:
```bash
# the archive name below is illustrative; use the actual release asset name
tar -xvf ollama-ipex-llm-*-ubuntu.tgz
```
@@ -206,9 +208,9 @@ To enable `SYCL_PI_LEVEL_ZERO_USE_IMMEDIATE_COMMANDLISTS`, set it **before start
> [!TIP]
> You can refer to [this page](https://www.intel.com/content/www/us/en/developer/articles/guide/level-zero-immediate-command-lists.html) for more information about Level Zero Immediate Command Lists.
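As a quick illustration of the tip above, the variable is set in the shell session that will launch Ollama; a minimal sketch, assuming `1` enables the behavior and that the Ollama binary is started from the same shell:

```shell
# enable Level Zero immediate command lists for the SYCL backend,
# in the same shell session that will start Ollama
export SYCL_PI_LEVEL_ZERO_USE_IMMEDIATE_COMMANDLISTS=1
# then launch Ollama from this shell, e.g.: ./ollama serve
```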
-### Additional models supported after Ollama v0.5.4
+### Additional models supported after Ollama v0.6.2
-The current Ollama Portable Zip is based on Ollama v0.5.4; in addition, the following new models are also supported in the Ollama Portable Zip:
+The current Ollama Portable Zip is based on Ollama v0.6.2; in addition, the following new models are also supported in the Ollama Portable Zip:
| Model | Download (Windows) | Download (Linux) | Model Link |
| - | - | - | - |


@@ -3,6 +3,7 @@
< <a href='./ollama_portable_zip_quickstart.md'>English</a> | <b>中文</b> >
</p>
This guide demonstrates how to use [Ollama portable zip](https://github.com/ipex-llm/ipex-llm/releases/tag/v2.3.0-nightly) to run Ollama directly on Intel GPU with `ipex-llm` (no installation required).
> [!NOTE]
@@ -28,7 +29,7 @@
- [Increase context length in Ollama](#在-ollama-中增加上下文长度)
- [Select specific GPU(s) to run Ollama when multiple ones are available](#在多块-gpu-可用时选择特定的-gpu-来运行-ollama)
- [Tune performance](#性能调优)
-- [Additional models supported after Ollama v0.5.4](#ollama-v054-之后新增模型支持)
+- [Additional models supported after Ollama v0.6.2](#ollama-v062-之后新增模型支持)
- [Signature Verification](#签名验证)
- [More details](ollama_quickstart.zh-CN.md)
@@ -205,9 +206,9 @@ By default, Ollama downloads models from the Ollama library. By setting, **before running Ollama**, the
> [!TIP]
> Refer to [the documentation here](https://www.intel.com/content/www/us/en/developer/articles/guide/level-zero-immediate-command-lists.html) for more information about Level Zero Immediate Command Lists.
-### Additional models supported after Ollama v0.5.4
+### Additional models supported after Ollama v0.6.2
-The current Ollama Portable Zip is based on Ollama v0.5.4; in addition, the following new models are also supported in the Ollama Portable Zip:
+The current Ollama Portable Zip is based on Ollama v0.6.2; in addition, the following new models are also supported in the Ollama Portable Zip:
| Model | Download (Windows) | Download (Linux) | Model Link |
| - | - | - | - |


@@ -12,9 +12,9 @@
> For installation on Intel Arc B-Series GPU (such as **B580**), please refer to this [guide](./bmg_quickstart.md).
> [!NOTE]
-> Our current version is consistent with [v0.5.4](https://github.com/ollama/ollama/releases/tag/v0.5.4) of ollama.
+> Our current version is consistent with [v0.6.2](https://github.com/ollama/ollama/releases/tag/v0.6.2) of ollama.
>
-> `ipex-llm[cpp]==2.2.0b20250123` is consistent with [v0.5.1](https://github.com/ollama/ollama/releases/tag/v0.5.1) of ollama.
+> `ipex-llm[cpp]==2.2.0b20250413` is consistent with [v0.5.4](https://github.com/ollama/ollama/releases/tag/v0.5.4) of ollama.
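The note above implies that pinning a specific `ipex-llm[cpp]` build selects the matching ollama release; a hedged sketch (pairing taken from the note, and any extra pip index URL the project requires is omitted here):

```shell
# pick the ipex-llm[cpp] build that pairs with the ollama release you need
# (pairing from the note above; pass any project-specific pip index URL as well)
PKG="ipex-llm[cpp]==2.2.0b20250413"   # consistent with ollama v0.5.4
echo "pip install $PKG"
```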
See the demo of running LLaMA2-7B on Intel Arc GPU below.


@@ -12,9 +12,9 @@
> For installation on Intel Arc B-series GPUs (such as **B580**), please refer to this [guide](./bmg_quickstart.md).
> [!NOTE]
-> The latest version of `ipex-llm[cpp]` is consistent with [v0.5.4](https://github.com/ollama/ollama/releases/tag/v0.5.4) of the official ollama.
+> The latest version of `ipex-llm[cpp]` is consistent with [v0.6.2](https://github.com/ollama/ollama/releases/tag/v0.6.2) of the official ollama.
>
-> `ipex-llm[cpp]==2.2.0b20250123` is consistent with [v0.5.1](https://github.com/ollama/ollama/releases/tag/v0.5.1) of the official ollama.
+> `ipex-llm[cpp]==2.2.0b20250413` is consistent with [v0.5.4](https://github.com/ollama/ollama/releases/tag/v0.5.4) of the official ollama.
See the demo of running LLaMA2-7B on an Intel Arc GPU below.