diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phi-2/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phi-2/README.md
index 87efda38..0e47ebed 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phi-2/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phi-2/README.md
@@ -25,6 +25,7 @@ conda activate llm
 # install the latest ipex-llm nightly build with 'all' option
 pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
 pip install einops # additional package required for phi-2 to conduct generation
+pip install transformers==4.37.0
 ```
 
 On Windows:
@@ -34,6 +35,7 @@ conda activate llm
 
 pip install --pre --upgrade ipex-llm[all]
 pip install einops
+pip install transformers==4.37.0
 ```
 
 ### 2. Run
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/phi-2/README.md b/python/llm/example/CPU/PyTorch-Models/Model/phi-2/README.md
index 0ce86773..515dc5e8 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/phi-2/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/phi-2/README.md
@@ -20,6 +20,7 @@ conda activate llm
 # install the latest ipex-llm nightly build with 'all' option
 pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
 pip install einops
+pip install transformers==4.37.0
 ```
 
 On Windows:
@@ -30,6 +31,7 @@ conda activate llm
 
 pip install --pre --upgrade ipex-llm[all]
 pip install einops
+pip install transformers==4.37.0
 ```
 
 ### 2. Run
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/phi-2/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/phi-2/README.md
index d8c37adb..87789eb0 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/phi-2/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/phi-2/README.md
@@ -16,6 +16,7 @@ conda activate llm
 
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
 pip install einops # additional package required for phi-2 to conduct generation
+pip install transformers==4.37.0
 ```
 
 #### 1.2 Installation on Windows
@@ -28,6 +29,7 @@ conda activate llm
 
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
 pip install einops # additional package required for phi-2 to conduct generation
+pip install transformers==4.37.0
 ```
 
 ### 2. Configures OneAPI environment variables for Linux
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/phi-2/README.md b/python/llm/example/GPU/PyTorch-Models/Model/phi-2/README.md
index 0ae7e51b..bbd276b9 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/phi-2/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/phi-2/README.md
@@ -16,6 +16,7 @@ conda activate llm
 
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
 pip install einops # additional package required for phi-2 to conduct generation
+pip install transformers==4.37.0
 ```
 
 #### 1.2 Installation on Windows
@@ -26,6 +27,7 @@ conda activate llm
 
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+pip install transformers==4.37.0
 ```
 
 ### 2. Configures OneAPI environment variables for Linux
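Across all four READMEs the change is the same: pin `transformers` to 4.37.0 in the phi-2 install steps, alongside the existing `einops` install. A minimal sketch of how to confirm the pin took effect after following either the Linux or Windows steps, assuming the `llm` conda environment name used in the snippets above:

```bash
# activate the environment created in the README steps, then print the
# transformers version that is actually importable; it should read 4.37.0
conda activate llm
python -c "import transformers; print(transformers.__version__)"
```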