Limit trl version in example (#12332)

* Limit trl version in example
Jin, Qiao 2024-11-05 14:50:10 +08:00 committed by GitHub
parent 923d696854
commit 82a61b5cf3
15 changed files with 29 additions and 29 deletions
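The change is the same in every file: the open-ended `trl` dependency is pinned to `"trl<0.12.0"`, which accepts any trl release older than 0.12.0 (the quotes keep POSIX shells from treating `<` as input redirection). As a rough illustration of how such an upper bound resolves, here is a simplified sketch of version comparison; note this is not pip's actual specifier logic (`packaging.specifiers` additionally handles pre-releases, epochs, and local versions):

```python
def version_tuple(v: str) -> tuple:
    # "0.11.4" -> (0, 11, 4); tuples compare component-by-component,
    # which mirrors how plain release versions are ordered
    return tuple(int(part) for part in v.split("."))

# "trl<0.12.0" admits releases below 0.12.0 ...
assert version_tuple("0.11.4") < version_tuple("0.12.0")
# ... and rejects 0.12.0 and anything newer
assert not version_tuple("0.12.1") < version_tuple("0.12.0")
```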

@@ -19,7 +19,7 @@ conda activate llm
 # install ipex-llm with 'all' option
 pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
-pip install torchvision tiktoken transformers==4.42.4 trl
+pip install torchvision tiktoken transformers==4.42.4 "trl<0.12.0"
 ```
 On Windows:
@@ -30,7 +30,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[all]
-pip install torchvision tiktoken transformers==4.42.4 trl
+pip install torchvision tiktoken transformers==4.42.4 "trl<0.12.0"
 ```
 ### 2. Run

@@ -18,7 +18,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
 # install packages required for GLM-4
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
 ```
 On Windows:
@@ -29,7 +29,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[all]
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
 ```
 ## 2. Run

@@ -20,7 +20,7 @@ pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pyt
 # transformers>=4.43.1 is required for Llama3.1 with IPEX-LLM optimizations
 pip install transformers==4.43.1
-pip install trl
+pip install "trl<0.12.0"
 ```
 On Windows:
@@ -31,7 +31,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[all]
 pip install transformers==4.43.1
-pip install trl
+pip install "trl<0.12.0"
 ```
 ### 2. Run

@@ -18,7 +18,7 @@ conda activate llm
 # install ipex-llm with 'all' option
 pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
 pip install torchvision==0.16.2 --index-url https://download.pytorch.org/whl/cpu
-pip install transformers==4.40.0 trl
+pip install transformers==4.40.0 "trl<0.12.0"
 ```
 On Windows:
@@ -28,7 +28,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[all]
 pip install torchvision==0.16.2 --index-url https://download.pytorch.org/whl/cpu
-pip install transformers==4.41.0 trl
+pip install transformers==4.41.0 "trl<0.12.0"
 ```
 ### 2. Run

@@ -21,7 +21,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
 # install packages required for GLM-4
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
 ```
 On Windows:
@@ -32,7 +32,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[all]
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
 ```
 ### 2. Run

@@ -4,7 +4,7 @@ In this directory, you will find examples on how you could apply IPEX-LLM INT4 o
 ## Requirements
 To run these examples with IPEX-LLM on Intel GPUs, we have some recommended requirements for your machine, please refer to [here](../../../README.md#requirements) for more information.
-**Important: According to Gemma2's requirement, please make sure you have installed `transformers==4.43.1` and `trl` to run the example.**
+**Important: According to Gemma2's requirement, please make sure you have installed `transformers==4.43.1` and `trl<0.12.0` to run the example.**
 ## Example: Predict Tokens using `generate()` API
 In the example [generate.py](./generate.py), we show a basic use case for a Gemma2 model to predict the next N tokens using `generate()` API, with IPEX-LLM INT4 optimizations on Intel GPUs.
@@ -19,7 +19,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 # According to Gemma2's requirement, please make sure you are using a stable version of Transformers, 4.43.1 or newer.
 pip install "transformers>=4.43.1"
-pip install trl
+pip install "trl<0.12.0"
 ```
 #### 1.2 Installation on Windows
@@ -33,7 +33,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 # According to Gemma2's requirement, please make sure you are using a stable version of Transformers, 4.43.1 or newer.
 pip install "transformers>=4.43.1"
-pip install trl
+pip install "trl<0.12.0"
 ```
 ### 2. Configures OneAPI environment variables for Linux

@@ -14,7 +14,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
 # install packages required for GLM-4
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
 ```
 ### 1.2 Installation on Windows
@@ -27,7 +27,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
 # install packages required for GLM-4
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
 ```
 ## 2. Configures OneAPI environment variables for Linux

@@ -17,7 +17,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 # transformers>=4.43.1 is required for Llama3.1 with IPEX-LLM optimizations
 pip install transformers==4.43.1
-pip install trl
+pip install "trl<0.12.0"
 ```
 #### 1.2 Installation on Windows
@@ -31,7 +31,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 # transformers>=4.43.1 is required for Llama3.1 with IPEX-LLM optimizations
 pip install transformers==4.43.1
-pip install trl
+pip install "trl<0.12.0"
 ```
 ### 2. Configures OneAPI environment variables for Linux

@@ -17,7 +17,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 pip install transformers==4.45.0
 pip install accelerate==0.33.0
-pip install trl
+pip install "trl<0.12.0"
 ```
 #### 1.2 Installation on Windows
@@ -31,7 +31,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 pip install transformers==4.45.0
 pip install accelerate==0.33.0
-pip install trl
+pip install "trl<0.12.0"
 ```
 ### 2. Configures OneAPI environment variables for Linux

@@ -15,7 +15,7 @@ conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-pip install transformers==4.41.0 trl
+pip install transformers==4.41.0 "trl<0.12.0"
 ```
 #### 1.2 Installation on Windows
@@ -27,7 +27,7 @@ conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-pip install transformers==4.41.0 trl
+pip install transformers==4.41.0 "trl<0.12.0"
 ```
 ### 2. Configures OneAPI environment variables for Linux

@@ -15,7 +15,7 @@ conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-pip install transformers==4.40.0 trl
+pip install transformers==4.40.0 "trl<0.12.0"
 ```
 #### 1.2 Installation on Windows
@@ -27,7 +27,7 @@ conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-pip install transformers==4.40.0 trl
+pip install transformers==4.40.0 "trl<0.12.0"
 ```
 ### 2. Configures OneAPI environment variables for Linux

@@ -15,7 +15,7 @@ conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-pip install tiktoken transformers==4.42.4 trl
+pip install tiktoken transformers==4.42.4 "trl<0.12.0"
 ```
 #### 1.2 Installation on Windows
@@ -27,7 +27,7 @@ conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-pip install tiktoken transformers==4.42.4 trl
+pip install tiktoken transformers==4.42.4 "trl<0.12.0"
 ```
 ### 2. Configures OneAPI environment variables for Linux

@@ -19,7 +19,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
 pip install transformers==4.36.0 datasets
 pip install peft==0.10.0
-pip install bitsandbytes scipy trl
+pip install bitsandbytes scipy "trl<0.12.0"
 ```
 ### 2. Configures OneAPI environment variables

@@ -41,7 +41,7 @@ pip install gradio # for gradio web UI
 conda install -c conda-forge -y gperftools=2.10 # to enable tcmalloc
 # for glm-4v-9b
-pip install transformers==4.42.4 trl
+pip install transformers==4.42.4 "trl<0.12.0"
 # for internlm-xcomposer2-vl-7b
 pip install transformers==4.31.0

@@ -16,7 +16,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
 # install packages required for GLM-4
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
 ```
 #### 1.2 Installation on Windows
@@ -29,7 +29,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
 # install packages required for GLM-4
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
 ```
 ### 2. Configures OneAPI environment variables for Linux