LLM: Fix speculative llama3 long input error (#10934)

Wang, Jian4 2024-05-07 09:25:20 +08:00 committed by GitHub
parent 49ab5a2b0e
commit 1de878bee1


@@ -18,7 +18,8 @@ We suggest using conda to manage environment:
 conda create -n llm python=3.11
 conda activate llm
 pip install --pre --upgrade ipex-llm[all]
-pip install intel_extension_for_pytorch==2.1.0
+# transformers>=4.33.0 is required for Llama3 with IPEX-LLM optimizations
+pip install transformers==4.37.0
 ```
 ### 2. Configures high-performing processor environment variables
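
For context, this README belongs to the speculative-decoding Llama 3 example that the commit fixes: the model is loaded through IPEX-LLM's transformers-style wrapper with speculative decoding enabled, which is why the pinned `transformers` version matters. The snippet below is only a minimal sketch of that setup, not the example's actual script: the model path is a placeholder, and keyword arguments such as `optimize_model=True` and `speculative_decoding=True` are assumptions drawn from IPEX-LLM's speculative-decoding examples, so verify them against the example code in this directory.

```python
# Hypothetical minimal sketch: load a Llama 3 model with IPEX-LLM optimizations
# and (assumed) self-speculative decoding enabled, then run a short generation.
import torch
from transformers import AutoTokenizer
from ipex_llm.transformers import AutoModelForCausalLM  # IPEX-LLM drop-in wrapper

model_path = "meta-llama/Meta-Llama-3-8B-Instruct"  # placeholder model path

# `optimize_model` and `speculative_decoding` are assumed from IPEX-LLM's
# speculative-decoding examples; check the real example script for the exact API.
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    optimize_model=True,
    torch_dtype=torch.bfloat16,
    speculative_decoding=True,
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

prompt = "Once upon a time"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids
with torch.inference_mode():
    output = model.generate(input_ids, max_new_tokens=32, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```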