Update llm gpu xpu default related info to PyTorch 2.1 (#9866)

Yuwen Hu, 2024-01-09 15:38:47 +08:00, committed by GitHub
parent a3725b0816
commit 23fc888abe
61 changed files with 67 additions and 140 deletions

@@ -13,8 +13,7 @@ To run this example with BigDL-LLM on Intel GPUs, we have some recommended requi
 conda create -n llm python=3.9
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
-# you can install specific ipex/torch version for your need
-pip install --pre --upgrade bigdl-llm[xpu_2.1] -f https://developer.intel.com/ipex-whl-stable-xpu
+pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install oneccl_bind_pt==2.1.100 -f https://developer.intel.com/ipex-whl-stable-xpu
 # configures OneAPI environment variables
 source /opt/intel/oneapi/setvars.sh
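The whole commit boils down to bumping the default IPEX build noted in these comments from 2.0.110+xpu to 2.1.10+xpu. A stdlib-only sketch (hypothetical helper, not part of BigDL-LLM) of how a setup script could check an installed build against the new floor, after stripping the `+xpu` local tag:

```python
# Hypothetical version check for the new default mentioned in the comments:
# intel_extension_for_pytorch reports versions like "2.1.10+xpu"; drop the
# "+xpu" local tag, then compare the numeric components as a tuple.
def version_tuple(version: str) -> tuple:
    public = version.split("+", 1)[0]            # "2.1.10+xpu" -> "2.1.10"
    return tuple(int(part) for part in public.split("."))

assert version_tuple("2.1.10+xpu") >= (2, 1, 10)   # new default meets the floor
assert version_tuple("2.0.110+xpu") < (2, 1, 0)    # old default does not
print("ipex version check ok")
```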

@@ -30,7 +30,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install transformers==4.35.0
 pip install autoawq==0.1.8 --no-deps

@@ -26,8 +26,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install transformers==4.34.0 # upgrade transformers
 ```

@@ -11,7 +11,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install transformers==4.34.0
 BUILD_CUDA_EXT=0 pip install git+https://github.com/PanQiWei/AutoGPTQ.git@1de9ab6

@@ -17,8 +17,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```
 ### 2. Configures OneAPI environment variables

@@ -17,8 +17,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```
 ### 2. Configures OneAPI environment variables

@@ -11,8 +11,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install transformers_stream_generator # additional package required for Baichuan-13B-Chat to conduct generation
 ```

@@ -11,8 +11,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install transformers_stream_generator # additional package required for Baichuan-7B-Chat to conduct generation
 ```

@@ -11,8 +11,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```

@@ -12,8 +12,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```
@@ -73,8 +72,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```

@@ -12,8 +12,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```
@@ -74,8 +73,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```

@@ -11,8 +11,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```
 ### 2. Configures OneAPI environment variables

@@ -11,8 +11,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install transformers==4.34.1 # CodeLlamaTokenizer is supported in higher version of transformers
 ```

@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install datasets soundfile librosa # required by audio processing
 ```

@@ -13,8 +13,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```
 ### 2. Configures OneAPI environment variables

@@ -11,8 +11,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```

@@ -12,9 +12,8 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
-pip install bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
+pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install einops # additional package required for falcon-7b-instruct to conduct generation
 ```

@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```

@@ -11,8 +11,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```
 ### 2. Configures OneAPI environment variables

@@ -11,8 +11,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```

@@ -11,8 +11,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```
 ### 2. Configures OneAPI environment variables

@@ -16,8 +16,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 # Refer to https://huggingface.co/mistralai/Mistral-7B-v0.1#troubleshooting, please make sure you are using a stable version of Transformers, 4.34.0 or newer.

@@ -16,8 +16,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 # Please make sure you are using a stable version of Transformers, 4.36.0 or newer.

@@ -11,8 +11,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install einops # additional package required for mpt-7b-chat and mpt-30b-chat to conduct generation
 ```

@@ -11,8 +11,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install einops # additional package required for phi-1_5 to conduct generation
 ```

@@ -13,8 +13,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 ```bash
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install accelerate tiktoken einops transformers_stream_generator==0.0.4 scipy torchvision pillow tensorboard matplotlib # additional package required for Qwen-VL-Chat to conduct generation
 ```

@@ -11,8 +11,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install tiktoken einops transformers_stream_generator # additional package required for Qwen-7B-Chat to conduct generation
 ```

@@ -13,8 +13,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```

@@ -11,8 +11,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install transformers==4.35.2 # required by SOLAR-10.7B
 ```

@@ -11,8 +11,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```

@@ -13,8 +13,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```
 ### 2. Configures OneAPI environment variables

@@ -13,8 +13,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install librosa soundfile datasets
 pip install accelerate

@@ -12,8 +12,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install datasets soundfile librosa # required by audio processing
 ```

@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install einops # additional package required for Yi-6B to conduct generation
 ```

@@ -8,8 +8,7 @@ We suggest using conda to manage environment:
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```

@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```

@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install transformers_stream_generator # additional package required for Baichuan-13B-Chat to conduct generation
 ```

@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install transformers_stream_generator # additional package required for Baichuan2-7B-Chat to conduct generation
 ```

@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```

@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```
@@ -73,8 +72,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```

View file

@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```
@@ -72,8 +71,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```


@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install transformers==4.34.1 # CodeLlamaTokenizer is supported in higher version of transformers
 ```


@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install datasets soundfile librosa # required by audio processing
 ```


@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```


@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```


@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```


@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```


@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 git clone -b v1.1.1 --depth=1 https://github.com/haotian-liu/LLaVA.git # clone the llava libary


@@ -16,8 +16,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 # Refer to https://huggingface.co/mistralai/Mistral-7B-v0.1#troubleshooting, please make sure you are using a stable version of Transformers, 4.34.0 or newer.


@@ -16,8 +16,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 # Please make sure you are using a stable version of Transformers, 4.36.0 or newer.


@@ -13,8 +13,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 ```bash
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install accelerate tiktoken einops transformers_stream_generator==0.0.4 scipy torchvision pillow tensorboard matplotlib # additional package required for Qwen-VL-Chat to conduct generation
 ```


@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```


@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install transformers==4.35.2 # required by SOLAR-10.7B
 ```


@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```


@@ -14,8 +14,7 @@ After installing conda, create a Python environment for BigDL-LLM:
 conda create -n llm python=3.9 # recommend to use Python 3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install einops # additional package required for Yi-6B to conduct generation
 ```


@@ -12,8 +12,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```


@@ -12,8 +12,7 @@ We suggest using conda to manage environment:
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 ```


@@ -14,8 +14,7 @@ This example is ported from [bnb-4bit-training](https://colab.research.google.co
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install datasets transformers==4.34.0
 pip install peft==0.5.0


@@ -10,12 +10,11 @@ To run this example with BigDL-LLM on Intel GPUs, we have some recommended requi
 ```bash
 conda create -n llm python=3.9
 conda activate llm
-# below command will install intel_extension_for_pytorch==2.0.110+xpu as default
-# you can install specific ipex/torch version for your need
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
 pip install datasets transformers==4.34.0
 pip install fire peft==0.5.0
-pip install oneccl_bind_pt==2.0.100 -f https://developer.intel.com/ipex-whl-stable-xpu # necessary to run distributed finetuning
+pip install oneccl_bind_pt==2.1.100 -f https://developer.intel.com/ipex-whl-stable-xpu # necessary to run distributed finetuning
 pip install accelerate==0.23.0
 pip install bitsandbytes scipy
 ```
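The hunk above bumps `oneccl_bind_pt` from 2.0.100 to 2.1.100 in lockstep with the move to IPEX/PyTorch 2.1, since `torch` and `oneccl_bind_pt` need matching minor versions for distributed finetuning to work. As a side note (not part of the original diff), a minimal sketch of a runtime sanity check; the helper names are made up for illustration, and the distribution-name spelling passed to `version()` is an assumption about the wheel's metadata:

```python
from importlib.metadata import PackageNotFoundError, version


def minor(ver: str) -> str:
    """Return the major.minor part of a version string, ignoring local tags like '+xpu'."""
    return ".".join(ver.split("+")[0].split(".")[:2])


def torch_ccl_versions_match() -> bool:
    """True when torch and oneccl_bind_pt share a minor version (e.g. both 2.1)."""
    try:
        return minor(version("torch")) == minor(version("oneccl-bind-pt"))
    except PackageNotFoundError:
        # At least one of the two packages is not installed at all.
        return False
```

Running such a check right after environment setup fails fast instead of surfacing an obscure oneCCL import error in the middle of a training run.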


@@ -19,17 +19,4 @@ This folder contains examples of running BigDL-LLM on Intel GPU:
 - Ubuntu 20.04 or later (Ubuntu 22.04 is preferred)
 ## Requirements
 To apply Intel GPU acceleration, there're several steps for tools installation and environment preparation. See the [GPU installation guide](https://bigdl.readthedocs.io/en/latest/doc/LLM/Overview/install_gpu.html) for mode details.
-Step 1, please refer to our [driver installation](https://dgpu-docs.intel.com/driver/installation.html) for general purpose GPU capabilities.
-> **Note**: IPEX 2.0.110+xpu requires Intel GPU Driver version is [Stable 647.21](https://dgpu-docs.intel.com/releases/stable_647_21_20230714.html).
-Step 2, you also need to download and install [Intel® oneAPI Base Toolkit](https://www.intel.com/content/www/us/en/developer/tools/oneapi/base-toolkit-download.html). OneMKL and DPC++ compiler are needed, others are optional.
-> **Note**: IPEX 2.0.110+xpu requires Intel® oneAPI Base Toolkit's version == 2023.2.0.
-## Best Known Configuration on Linux
-For better performance, it is recommended to set environment variables on Linux:
-```bash
-export USE_XETLA=OFF
-export SYCL_PI_LEVEL_ZERO_USE_IMMEDIATE_COMMANDLISTS=1
-```
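The deleted "Best Known Configuration" lines exported `USE_XETLA=OFF` and `SYCL_PI_LEVEL_ZERO_USE_IMMEDIATE_COMMANDLISTS=1` in the shell. The same defaults can also be applied from a launcher script, as long as this happens before `torch` is imported, since the SYCL runtime reads its configuration at import time. A minimal sketch; the helper and dictionary names are hypothetical, not part of the BigDL-LLM codebase:

```python
import os

# Variable names taken from the removed README lines; they must be set
# before torch / intel_extension_for_pytorch are imported.
XPU_ENV_DEFAULTS = {
    "USE_XETLA": "OFF",
    "SYCL_PI_LEVEL_ZERO_USE_IMMEDIATE_COMMANDLISTS": "1",
}


def apply_xpu_env(defaults: dict = XPU_ENV_DEFAULTS) -> dict:
    """Set each variable only if absent, returning the effective values."""
    applied = {}
    for key, value in defaults.items():
        # setdefault keeps any value the user already exported in the shell,
        # so an explicit override is never clobbered.
        applied[key] = os.environ.setdefault(key, value)
    return applied
```

Because `os.environ.setdefault` is used, running this from a wrapper is safe even when the user has already sourced their own configuration.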


@@ -38,6 +38,7 @@ pip3 install psutil
 pip3 install sentencepiece # Required for LLaMA tokenizer.
 pip3 install numpy
 pip3 install "transformers>=4.33.1" # Required for Code Llama.
+# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade "bigdl-llm[xpu]" -f https://developer.intel.com/ipex-whl-stable-xpu
 pip3 install fastapi
 pip3 install "uvicorn[standard]"