From 685a749adb8ac4355a813b678a6286e866162d87 Mon Sep 17 00:00:00 2001
From: SONG Ge <38711238+sgwhat@users.noreply.github.com>
Date: Wed, 30 Apr 2025 16:22:42 +0800
Subject: [PATCH] Update ollama-release doc into v0.6.2 (#13094)

* Update ollama-release doc into v0.6.2

* update

* revert signature changes
---
 docs/mddocs/Quickstart/ollama_portable_zip_quickstart.md | 8 +++++---
 .../Quickstart/ollama_portable_zip_quickstart.zh-CN.md   | 7 ++++---
 docs/mddocs/Quickstart/ollama_quickstart.md              | 4 ++--
 docs/mddocs/Quickstart/ollama_quickstart.zh-CN.md        | 4 ++--
 4 files changed, 13 insertions(+), 10 deletions(-)

diff --git a/docs/mddocs/Quickstart/ollama_portable_zip_quickstart.md b/docs/mddocs/Quickstart/ollama_portable_zip_quickstart.md
index c4d161cb..b98d4aff 100644
--- a/docs/mddocs/Quickstart/ollama_portable_zip_quickstart.md
+++ b/docs/mddocs/Quickstart/ollama_portable_zip_quickstart.md
@@ -28,7 +28,7 @@ This guide demonstrates how to use [Ollama portable zip](https://github.com/ipex
   - [Increase context length in Ollama](#increase-context-length-in-ollama)
   - [Select specific GPU(s) to run Ollama when multiple ones are available](#select-specific-gpus-to-run-ollama-when-multiple-ones-are-available)
   - [Tune performance](#tune-performance)
-  - [Additional models supported after Ollama v0.5.4](#additional-models-supported-after-ollama-v054)
+  - [Additional models supported after Ollama v0.6.2](#additional-models-supported-after-ollama-v062)
   - [Signature Verification](#signature-verification)
 - [More details](ollama_quickstart.md)
 
@@ -74,8 +74,10 @@ Check your GPU driver version, and update it if needed; we recommend following [
 
 ### Step 1: Download and Extract
 
+
 Download IPEX-LLM Ollama portable tgz for Ubuntu users from the [link](https://github.com/ipex-llm/ipex-llm/releases/tag/v2.3.0-nightly).
 
+
 Then open a terminal, extract the tgz file to a folder.
 
 ```bash
@@ -206,9 +208,9 @@ To enable `SYCL_PI_LEVEL_ZERO_USE_IMMEDIATE_COMMANDLISTS`, set it **before start
 > [!TIP]
 > You could refer to [here](https://www.intel.com/content/www/us/en/developer/articles/guide/level-zero-immediate-command-lists.html) regarding more information about Level Zero Immediate Command Lists.
 
-### Additional models supported after Ollama v0.5.4
+### Additional models supported after Ollama v0.6.2
 
-The currently Ollama Portable Zip is based on Ollama v0.5.4; in addition, the following new models have also been supported in the Ollama Portable Zip:
+The current Ollama Portable Zip is based on Ollama v0.6.2; in addition, the following new models are also supported in the Ollama Portable Zip:
 
 | Model  | Download (Windows) | Download (Linux) | Model Link |
 | - | - | - | - |
diff --git a/docs/mddocs/Quickstart/ollama_portable_zip_quickstart.zh-CN.md b/docs/mddocs/Quickstart/ollama_portable_zip_quickstart.zh-CN.md
index 5b3660ab..3e5db7dd 100644
--- a/docs/mddocs/Quickstart/ollama_portable_zip_quickstart.zh-CN.md
+++ b/docs/mddocs/Quickstart/ollama_portable_zip_quickstart.zh-CN.md
@@ -3,6 +3,7 @@
 < English | 中文 >
 
+
 本指南演示如何使用 [Ollama portable zip](https://github.com/ipex-llm/ipex-llm/releases/tag/v2.3.0-nightly) 通过 `ipex-llm` 在 Intel GPU 上直接免安装运行 Ollama。
 
 > [!NOTE]
@@ -28,7 +29,7 @@
   - [在 Ollama 中增加上下文长度](#在-ollama-中增加上下文长度)
   - [在多块 GPU 可用时选择特定的 GPU 来运行 Ollama](#在多块-gpu-可用时选择特定的-gpu-来运行-ollama)
   - [性能调优](#性能调优)
-  - [Ollama v0.5.4 之后新增模型支持](#ollama-v054-之后新增模型支持)
+  - [Ollama v0.6.2 之后新增模型支持](#ollama-v062-之后新增模型支持)
   - [签名验证](#签名验证)
 - [更多信息](ollama_quickstart.zh-CN.md)
 
@@ -205,9 +206,9 @@ Ollama 默认从 Ollama 库下载模型。通过在**运行 Ollama 之前**设
 > [!TIP]
 > 参考[此处文档](https://www.intel.com/content/www/us/en/developer/articles/guide/level-zero-immediate-command-lists.html)以获取更多 Level Zero Immediate Command Lists 相关信息。
 
-### Ollama v0.5.4 之后新增模型支持
+### Ollama v0.6.2 之后新增模型支持
 
-当前的 Ollama Portable Zip 基于 Ollama v0.5.4；此外，以下新模型也已在 Ollama Portable Zip 中得到支持：
+当前的 Ollama Portable Zip 基于 Ollama v0.6.2；此外，以下新模型也已在 Ollama Portable Zip 中得到支持：
 
 | 模型 | 下载（Windows）| 下载（Linux）| 模型链接 |
 | - | - | - | - |
diff --git a/docs/mddocs/Quickstart/ollama_quickstart.md b/docs/mddocs/Quickstart/ollama_quickstart.md
index 920769ff..17509f21 100644
--- a/docs/mddocs/Quickstart/ollama_quickstart.md
+++ b/docs/mddocs/Quickstart/ollama_quickstart.md
@@ -12,9 +12,9 @@
 > For installation on Intel Arc B-Series GPU (such as **B580**), please refer to this [guide](./bmg_quickstart.md).
 
 > [!NOTE]
-> Our current version is consistent with [v0.5.4](https://github.com/ollama/ollama/releases/tag/v0.5.4) of ollama.
+> Our current version is consistent with [v0.6.2](https://github.com/ollama/ollama/releases/tag/v0.6.2) of ollama.
 >
-> `ipex-llm[cpp]==2.2.0b20250123` is consistent with [v0.5.1](https://github.com/ollama/ollama/releases/tag/v0.5.1) of ollama.
+> `ipex-llm[cpp]==2.2.0b20250413` is consistent with [v0.5.4](https://github.com/ollama/ollama/releases/tag/v0.5.4) of ollama.
 
 See the demo of running LLaMA2-7B on Intel Arc GPU below.
 
diff --git a/docs/mddocs/Quickstart/ollama_quickstart.zh-CN.md b/docs/mddocs/Quickstart/ollama_quickstart.zh-CN.md
index e645e61f..13beca72 100644
--- a/docs/mddocs/Quickstart/ollama_quickstart.zh-CN.md
+++ b/docs/mddocs/Quickstart/ollama_quickstart.zh-CN.md
@@ -12,9 +12,9 @@
 > 如果是在 Intel Arc B 系列 GPU 上安装（例如 **B580**），请参阅本[指南](./bmg_quickstart.md)。
 
 > [!NOTE]
-> `ipex-llm[cpp]` 的最新版本与官方 ollama 的 [v0.5.4](https://github.com/ollama/ollama/releases/tag/v0.5.4) 版本保持一致。
+> `ipex-llm[cpp]` 的最新版本与官方 ollama 的 [v0.6.2](https://github.com/ollama/ollama/releases/tag/v0.6.2) 版本保持一致。
 >
-> `ipex-llm[cpp]==2.2.0b20250123` 与官方 ollama 的 [v0.5.1](https://github.com/ollama/ollama/releases/tag/v0.5.1) 版本保持一致。
+> `ipex-llm[cpp]==2.2.0b20250413` 与官方 ollama 的 [v0.5.4](https://github.com/ollama/ollama/releases/tag/v0.5.4) 版本保持一致。
 
 以下是在 Intel Arc GPU 上运行 LLaMA2-7B 的 DEMO 演示。
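
For context, a minimal sketch of the Linux flow the patched quickstart describes: extract the portable tgz, export `SYCL_PI_LEVEL_ZERO_USE_IMMEDIATE_COMMANDLISTS` before starting Ollama serve (as the tuning section touched by this patch requires), then launch the service. The archive and script names below are illustrative assumptions, not taken from the patch; check the actual asset names on the v2.3.0-nightly release page and the quickstart itself.

```bash
# Sketch only: archive and script names are assumptions, not from the patch.
tar -xzf ollama-ipex-llm-*-ubuntu.tgz   # extract the portable tgz to a folder
cd ollama-ipex-llm-*-ubuntu/            # enter the extracted folder

# Tuning step referenced in the patched doc: export this variable BEFORE starting
# Ollama serve (stop an already-running serve first).
export SYCL_PI_LEVEL_ZERO_USE_IMMEDIATE_COMMANDLISTS=1

./start-ollama.sh                       # hypothetical launch script name
```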