Upgrade oneAPI version for cpp Windows (#12063)

* update version

* update quickstart
Ruonan Wang 2024-09-11 20:12:12 -07:00 committed by GitHub
parent e78e45ee01
commit 48d9092b5a
3 changed files with 11 additions and 2 deletions

llama_cpp_quickstart.md

@@ -18,6 +18,9 @@ See the demo of running LLaMA2-7B on Intel Arc GPU below.
 >
 > Our latest version is consistent with [a1631e5](https://github.com/ggerganov/llama.cpp/commit/a1631e53f6763e17da522ba219b030d8932900bd) of llama.cpp.
+> [!NOTE]
+> Starting from `ipex-llm[cpp]==2.2.0b20240912`, oneAPI dependency of `ipex-llm[cpp]` on Windows will switch from `2024.0.0` to `2024.2.1`.
 ## Table of Contents
 - [Prerequisites](./llama_cpp_quickstart.md#0-prerequisites)
 - [Install IPEX-LLM for llama.cpp](./llama_cpp_quickstart.md#1-install-ipex-llm-for-llamacpp)

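The new note changes which oneAPI runtime wheels `pip` pulls in on Windows. As a quick sanity check after upgrading, a minimal sketch like the following prints the versions that actually landed in the environment; the wheel names `dpcpp-cpp-rt`, `mkl-dpcpp`, `onednn`, and `onednn-devel` are taken from the setup.py hunk later in this commit and are assumptions outside of that context.

```python
# Sanity-check sketch (not part of the repo): report the oneAPI runtime wheels
# installed alongside ipex-llm[cpp] on Windows after the upgrade.
from importlib.metadata import version, PackageNotFoundError

# Wheel names as pinned in the setup.py hunk of this commit.
ONEAPI_WHEELS = ["dpcpp-cpp-rt", "mkl-dpcpp", "onednn", "onednn-devel"]

for name in ONEAPI_WHEELS:
    try:
        # Expect 2024.2.x once ipex-llm[cpp]==2.2.0b20240912 or newer is installed.
        print(f"{name}: {version(name)}")
    except PackageNotFoundError:
        print(f"{name}: not installed")
```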
ollama_quickstart.md

@@ -18,6 +18,9 @@ See the demo of running LLaMA2-7B on Intel Arc GPU below.
 >
 > Our current version is consistent with [v0.3.6](https://github.com/ollama/ollama/releases/tag/v0.3.6) of ollama.
+> [!NOTE]
+> Starting from `ipex-llm[cpp]==2.2.0b20240912`, oneAPI dependency of `ipex-llm[cpp]` on Windows will switch from `2024.0.0` to `2024.2.1`.
 ## Table of Contents
 - [Install IPEX-LLM for Ollama](./ollama_quickstart.md#1-install-ipex-llm-for-ollama)
 - [Initialize Ollama](./ollama_quickstart.md#2-initialize-ollama)

setup.py

@@ -277,6 +277,9 @@ def setup_package():
     oneapi_2024_0_requires = ["dpcpp-cpp-rt==2024.0.2;platform_system=='Windows'",
                               "mkl-dpcpp==2024.0.0;platform_system=='Windows'",
                               "onednn==2024.0.0;platform_system=='Windows'"]
+    oneapi_2024_2_requires = ["dpcpp-cpp-rt==2024.2.1;platform_system=='Windows'",
+                              "mkl-dpcpp==2024.2.1;platform_system=='Windows'",
+                              "onednn==2024.2.1;platform_system=='Windows'"]
     # Linux install with --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
     xpu_21_requires = copy.deepcopy(all_requires)
     for exclude_require in cpu_torch_version:
@@ -294,8 +297,8 @@ def setup_package():
     cpp_requires = ["bigdl-core-cpp==" + CORE_XE_VERSION,
-                    "onednn-devel==2024.0.0;platform_system=='Windows'"]
-    cpp_requires += oneapi_2024_0_requires
+                    "onednn-devel==2024.2.1;platform_system=='Windows'"]
+    cpp_requires += oneapi_2024_2_requires
     serving_requires = ['py-cpuinfo']
     serving_requires += SERVING_DEP
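The `platform_system=='Windows'` suffix in these requirement strings is a standard PEP 508 environment marker, so the new `2024.2.1` oneAPI pins are only installed on Windows, while Linux installs fall through to the `--extra-index-url` path mentioned in the comment. A minimal sketch (not part of the repo, using the third-party `packaging` library) showing how such markers evaluate per platform:

```python
# Illustration only: evaluate the PEP 508 markers used in the setup.py requirement strings.
from packaging.requirements import Requirement

oneapi_2024_2_requires = ["dpcpp-cpp-rt==2024.2.1;platform_system=='Windows'",
                          "mkl-dpcpp==2024.2.1;platform_system=='Windows'",
                          "onednn==2024.2.1;platform_system=='Windows'"]

for spec in oneapi_2024_2_requires:
    req = Requirement(spec)
    for system in ("Windows", "Linux"):
        # Override platform_system to simulate installing on each OS.
        applies = req.marker.evaluate(environment={"platform_system": system})
        print(f"{req.name}{req.specifier} on {system}: {'installed' if applies else 'skipped'}")
```

On Windows every pin evaluates to true and pip installs the 2024.2.1 wheels; on Linux all three are skipped, which is why only the Windows dependency set needed to change in this commit.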