commit 842d6dfc2d (parent d830a63bb7)
15 changed files with 15 additions and 15 deletions

@@ -4,7 +4,7 @@ This folder contains examples showcasing how to use `langchain` with `ipex-llm`.

### Install IPEX-LLM

-Ensure `ipex-llm` is installed by following the [IPEX-LLM Installation Guide](https://github.com/intel-analytics/ipex-llm/tree/main/python/llm#install).
+Ensure `ipex-llm` is installed by following the [IPEX-LLM Installation Guide](https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Overview/install_cpu.html).

### Install Dependencies Required by the Examples

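A quick sanity check after this install step, offered as a hedged sketch rather than part of the diff (it only assumes the package's import name, `ipex_llm`):

```bash
# Confirm which ipex-llm build pip resolved, and that the module imports cleanly.
pip show ipex-llm
python -c "import ipex_llm"
```
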
@@ -18,7 +18,7 @@ This example is ported from [bnb-4bit-training](https://colab.research.google.co
```bash
conda create -n llm python=3.11
conda activate llm
-pip install --pre --upgrade ipex-llm[all]
+pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
pip install transformers==4.36.0
pip install peft==0.10.0
pip install datasets

@@ -27,6 +27,6 @@ This folder contains examples of running IPEX-LLM on Intel CPU:
## Best Known Configuration on Linux
For better performance, it is recommended to set environment variables on Linux with the help of IPEX-LLM:
```bash
-pip install ipex-llm
+pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
source ipex-llm-init
```

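For context, a rough usage sketch of the recommended configuration above; the exact variables `ipex-llm-init` exports vary by machine, and the environment name `llm` is assumed from the other examples in this changeset:

```bash
conda activate llm
source ipex-llm-init                      # exports performance-related settings (threading, allocator)
env | grep -iE 'omp|malloc|ld_preload'    # inspect what was actually set on this machine
```
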
@@ -9,7 +9,7 @@ You can use IPEX-LLM to run BF16 inference for any Huggingface Transformer model
To run these examples with IPEX-LLM, we have some recommended requirements for your machine; please refer to [here](../README.md#system-support) for more information. Make sure you have installed `ipex-llm` before:

```bash
-pip install --pre --upgrade ipex-llm[all]
+pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
```

Moreover, install IPEX 2.1.0, which can be done through `pip install intel_extension_for_pytorch==2.1.0`.

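The IPEX step can be verified afterwards with a one-liner; this only assumes the standard module name `intel_extension_for_pytorch`:

```bash
# Print the installed IPEX version; it should report 2.1.0 if the step above succeeded.
python -c "import intel_extension_for_pytorch as ipex; print(ipex.__version__)"
```
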
@@ -11,7 +11,7 @@ We suggest using conda to manage environment:
```bash
conda create -n llm python=3.11
conda activate llm
-pip install --pre --upgrade ipex-llm[all]
+pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
pip install intel_extension_for_pytorch==2.1.0
pip install transformers==4.31.0
```

@@ -9,7 +9,7 @@ We suggest using conda to manage environment:
```bash
conda create -n llm python=3.11
conda activate llm
-pip install --pre --upgrade ipex-llm[all]
+pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
```
### 2. Configure OneAPI environment variables
```bash

@@ -11,7 +11,7 @@ We suggest using conda to manage environment:
```bash
conda create -n llm python=3.11
conda activate llm
-pip install --pre --upgrade ipex-llm[all]
+pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
pip install intel_extension_for_pytorch==2.1.0
```
### 2. Configure high-performing processor environment variables

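A section with this heading typically sets thread and memory affinity in these CPU examples; the following is a heavily hedged sketch, with the core count and the `generate.py` script name used purely as placeholders rather than taken from this diff:

```bash
source ipex-llm-init                          # set IPEX-LLM performance environment variables
export OMP_NUM_THREADS=48                     # placeholder: number of physical cores on one socket
numactl -C 0-47 -m 0 python ./generate.py     # placeholder script; bind to socket 0 CPUs and memory
```
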
@@ -17,7 +17,7 @@ We suggest using conda to manage environment:
```bash
conda create -n llm python=3.11
conda activate llm
-pip install --pre --upgrade ipex-llm[all]
+pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
# transformers>=4.33.0 is required for Llama3 with IPEX-LLM optimizations
pip install transformers==4.37.0
```

@@ -11,7 +11,7 @@ We suggest using conda to manage environment:
```bash
conda create -n llm python=3.11
conda activate llm
-pip install --pre --upgrade ipex-llm[all]
+pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
pip install intel_extension_for_pytorch==2.1.0
pip install transformers==4.35.2
```

@@ -11,7 +11,7 @@ We suggest using conda to manage environment:
```bash
conda create -n llm python=3.11
conda activate llm
-pip install --pre --upgrade ipex-llm[all]
+pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
pip install transformers==4.36.0
```
### 2. Configure high-performing processor environment variables

@@ -10,7 +10,7 @@ We suggest using conda to manage environment:
```bash
conda create -n llm python=3.11
conda activate llm
-pip install --pre --upgrade ipex-llm[all]
+pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
pip install tiktoken einops transformers_stream_generator # additional packages required for Qwen to conduct generation
```
### 2. Configure environment variables

@@ -11,7 +11,7 @@ We suggest using conda to manage environment:
```bash
conda create -n llm python=3.11
conda activate llm
-pip install --pre --upgrade ipex-llm[all]
+pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
pip install intel_extension_for_pytorch==2.1.0
pip install transformers==4.31.0
```

@@ -11,7 +11,7 @@ We suggest using conda to manage environment:
```bash
conda create -n llm python=3.11
conda activate llm
-pip install --pre --upgrade ipex-llm[all]
+pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
pip install intel_extension_for_pytorch==2.1.0
```
### 2. Configure high-performing processor environment variables

@@ -11,7 +11,7 @@ We suggest using conda to manage environment:
```bash
conda create -n llm python=3.11
conda activate llm
-pip install --pre --upgrade ipex-llm[all]
+pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
pip install intel_extension_for_pytorch==2.1.0
pip install transformers==4.35.2
```

@@ -18,7 +18,7 @@ conda create -n ipex-vllm python=3.11
conda activate ipex-vllm
# Install dependencies
pip3 install numpy
-pip3 install --pre --upgrade ipex-llm[all]
+pip3 install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
pip3 install psutil
pip3 install sentencepiece # Required for LLaMA tokenizer.
pip3 install fastapi
