From 447c8ed324c3d3012e159abeac5353010d75438d Mon Sep 17 00:00:00 2001
From: Ch1y0q
Date: Thu, 15 Aug 2024 16:40:48 +0800
Subject: [PATCH] update transformers version for `replit-code-v1-3b`,
 `internlm2-chat-…` (#11811)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

* update transformers version for `replit-code-v1-3b`, `internlm2-chat-7b` and mistral

* remove for default transformers version
---
 python/llm/example/GPU/HuggingFace/LLM/internlm2/README.md | 6 ++++--
 python/llm/example/GPU/HuggingFace/LLM/mistral/README.md   | 7 -------
 python/llm/example/GPU/HuggingFace/LLM/replit/README.md    | 4 +++-
 3 files changed, 7 insertions(+), 10 deletions(-)

diff --git a/python/llm/example/GPU/HuggingFace/LLM/internlm2/README.md b/python/llm/example/GPU/HuggingFace/LLM/internlm2/README.md
index f8906fb2..5c4c1771 100644
--- a/python/llm/example/GPU/HuggingFace/LLM/internlm2/README.md
+++ b/python/llm/example/GPU/HuggingFace/LLM/internlm2/README.md
@@ -14,7 +14,8 @@ conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-pip install transformers==4.36.2
+pip install transformers==4.38.0
+pip install einops
 pip install huggingface_hub
 ```
 
@@ -26,7 +27,8 @@
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-pip install transformers==4.36.2
+pip install transformers==4.38.0
+pip install einops
 pip install huggingface_hub
 ```
diff --git a/python/llm/example/GPU/HuggingFace/LLM/mistral/README.md b/python/llm/example/GPU/HuggingFace/LLM/mistral/README.md
index 4de40cab..63542bcf 100644
--- a/python/llm/example/GPU/HuggingFace/LLM/mistral/README.md
+++ b/python/llm/example/GPU/HuggingFace/LLM/mistral/README.md
@@ -4,7 +4,6 @@ In this directory, you will find examples on how you could apply IPEX-LLM INT4 optimizations
 ## Requirements
 To run these examples with IPEX-LLM on Intel GPUs, we have some recommended requirements for your machine, please refer to [here](../../../README.md#requirements) for more information.
 
-**Important: According to [Mistral Troubleshooting](https://huggingface.co/mistralai/Mistral-7B-v0.1#troubleshooting), please make sure you have installed `transformers==4.34.0` to run the example.**
 
 ## Example: Predict Tokens using `generate()` API
 In the example [generate.py](./generate.py), we show a basic use case for a Mistral model to predict the next N tokens using `generate()` API, with IPEX-LLM INT4 optimizations on Intel GPUs.
@@ -16,9 +15,6 @@ conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-
-# Refer to https://huggingface.co/mistralai/Mistral-7B-v0.1#troubleshooting, please make sure you are using a stable version of Transformers, 4.34.0 or newer.
-pip install transformers==4.34.0
 ```
 
 #### 1.2 Installation on Windows
@@ -29,9 +25,6 @@
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-
-# Refer to https://huggingface.co/mistralai/Mistral-7B-v0.1#troubleshooting, please make sure you are using a stable version of Transformers, 4.34.0 or newer.
-pip install transformers==4.34.0
 ```
 
 ### 2. Configures OneAPI environment variables for Linux
diff --git a/python/llm/example/GPU/HuggingFace/LLM/replit/README.md b/python/llm/example/GPU/HuggingFace/LLM/replit/README.md
index 7c12b977..644de85a 100644
--- a/python/llm/example/GPU/HuggingFace/LLM/replit/README.md
+++ b/python/llm/example/GPU/HuggingFace/LLM/replit/README.md
@@ -15,7 +15,7 @@
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-pip install "transformers<4.35"
+pip install "transformers<=4.33.3"
 ```
 
 #### 1.2 Installation on Windows
@@ -26,6 +26,8 @@
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+
+pip install "transformers<=4.33.3"
 ```
 
 ### 2. Configures OneAPI environment variables for Linux
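A pitfall worth noting with the pinned-install commands in this patch: a pip requirement specifier containing `<` or `<=` must be quoted, or a POSIX shell parses the `<` as input redirection and the version constraint never reaches pip. A minimal sketch of the behavior, using a plain variable and `echo` rather than an actual `pip install` (no network or package assumptions):

```shell
#!/bin/sh
# A pip requirement specifier must reach pip as a single argument.
# Unquoted, `transformers<=4.33.3` is split at `<`: the shell tries to
# redirect stdin from a file literally named `=4.33.3`, and pip (if the
# command ran at all) would see only the bare name `transformers`.
spec="transformers<=4.33.3"

# Quoting preserves the full specifier through shell parsing:
echo "$spec"    # prints: transformers<=4.33.3
```

Exact-version pins such as `transformers==4.38.0` contain no shell metacharacters and work unquoted, which is why only the `<=` pin for `replit-code-v1-3b` needs the quotes.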