| Name | Last commit message | Last commit date |
|------|---------------------|------------------|
| aquila | Miniconda/Anaconda -> Miniforge update in examples (#11194) | 2024-06-04 10:14:02 +08:00 |
| aquila2 | Miniconda/Anaconda -> Miniforge update in examples (#11194) | 2024-06-04 10:14:02 +08:00 |
| baichuan | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| baichuan2 | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| bluelm | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| chatglm | Miniconda/Anaconda -> Miniforge update in examples (#11194) | 2024-06-04 10:14:02 +08:00 |
| chatglm2 | ChatGLM Examples Restructure regarding Installation Steps (#11285) | 2024-06-14 12:37:05 +08:00 |
| chatglm3 | ChatGLM Examples Restructure regarding Installation Steps (#11285) | 2024-06-14 12:37:05 +08:00 |
| codegeex2 | Fix codegeex2 transformers version (#11487) | 2024-07-02 15:09:28 +08:00 |
| codegemma | fix gemma for 4.41 (#11531) | 2024-07-18 15:02:50 -07:00 |
| codellama | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| codeshell | Miniconda/Anaconda -> Miniforge update in examples (#11194) | 2024-06-04 10:14:02 +08:00 |
| cohere | Fix cohere model on transformers>=4.41 (#11575) | 2024-07-17 17:18:59 -07:00 |
| deciLM-7b | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| deepseek | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| deepseek-moe | Miniconda/Anaconda -> Miniforge update in examples (#11194) | 2024-06-04 10:14:02 +08:00 |
| distil-whisper | Miniconda/Anaconda -> Miniforge update in examples (#11194) | 2024-06-04 10:14:02 +08:00 |
| dolly_v1 | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| dolly_v2 | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| falcon | refactor ot remove old rope usage (#12224) | 2024-10-17 17:06:09 +08:00 |
| flan-t5 | Miniconda/Anaconda -> Miniforge update in examples (#11194) | 2024-06-04 10:14:02 +08:00 |
| fuyu | Miniconda/Anaconda -> Miniforge update in examples (#11194) | 2024-06-04 10:14:02 +08:00 |
| gemma | fix gemma for 4.41 (#11531) | 2024-07-18 15:02:50 -07:00 |
| glm-4v | Limit trl version in example (#12332) | 2024-11-05 14:50:10 +08:00 |
| glm4 | Limit trl version in example (#12332) | 2024-11-05 14:50:10 +08:00 |
| internlm | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| internlm-xcomposer | Miniconda/Anaconda -> Miniforge update in examples (#11194) | 2024-06-04 10:14:02 +08:00 |
| internlm2 | fix 1482 (#11661) | 2024-07-26 12:39:09 -07:00 |
| llama2 | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| llama3 | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| llama3.1 | Limit trl version in example (#12332) | 2024-11-05 14:50:10 +08:00 |
| minicpm | fix minicpm for transformers>=4.39 (#11533) | 2024-07-18 15:01:57 -07:00 |
| minicpm-v-2 | added minicpm cpu examples (#12027) | 2024-09-11 15:51:21 +08:00 |
| minicpm-v-2_6 | Limit trl version in example (#12332) | 2024-11-05 14:50:10 +08:00 |
| mistral | Miniconda/Anaconda -> Miniforge update in examples (#11194) | 2024-06-04 10:14:02 +08:00 |
| mixtral | Miniconda/Anaconda -> Miniforge update in examples (#11194) | 2024-06-04 10:14:02 +08:00 |
| moss | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| mpt | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| phi-1_5 | phi model readme (#11595) | 2024-07-17 17:18:34 -07:00 |
| phi-2 | phi model readme (#11595) | 2024-07-17 17:18:34 -07:00 |
| phi-3 | phi-3 on "transformers>=4.37.0,<=4.42.3" (#11534) | 2024-07-17 17:19:57 -07:00 |
| phi-3-vision | phi-3 on "transformers>=4.37.0,<=4.42.3" (#11534) | 2024-07-17 17:19:57 -07:00 |
| phixtral | Miniconda/Anaconda -> Miniforge update in examples (#11194) | 2024-06-04 10:14:02 +08:00 |
| phoenix | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| qwen | Update ipex-llm default transformers version to 4.37.0 (#11859) | 2024-08-20 17:37:58 +08:00 |
| qwen-vl | Update ipex-llm default transformers version to 4.37.0 (#11859) | 2024-08-20 17:37:58 +08:00 |
| qwen1.5 | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| qwen2 | Update sample output for HF Qwen2 GPU and CPU (#11257) | 2024-06-07 11:36:22 +08:00 |
| redpajama | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| replit | Miniconda/Anaconda -> Miniforge update in examples (#11194) | 2024-06-04 10:14:02 +08:00 |
| skywork | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| solar | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| stablelm | Miniconda/Anaconda -> Miniforge update in examples (#11194) | 2024-06-04 10:14:02 +08:00 |
| starcoder | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| vicuna | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| whisper | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| wizardcoder-python | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |
| yi | Miniconda/Anaconda -> Miniforge update in examples (#11194) | 2024-06-04 10:14:02 +08:00 |
| yuan2 | Miniconda/Anaconda -> Miniforge update in examples (#11194) | 2024-06-04 10:14:02 +08:00 |
| ziya | Miniconda/Anaconda -> Miniforge update in examples (#11194) | 2024-06-04 10:14:02 +08:00 |
| README.md | LLM: Modify CPU Installation Command for most examples (#11049) | 2024-05-17 15:52:20 +08:00 |

# IPEX-LLM Transformers INT4 Optimization for Large Language Model

You can use IPEX-LLM to run any Hugging Face Transformers model with INT4 optimizations on either servers or laptops. This directory contains example scripts to help you quickly get started with IPEX-LLM on some popular open-source models in the community. Each model has its own dedicated folder, where you can find detailed instructions on how to install and run it.
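
The examples in this directory share a common pattern: load the model through IPEX-LLM's `transformers`-style API with INT4 optimization enabled, then generate as usual. The sketch below illustrates that pattern; the model path and prompt are placeholders, and the exact, tested script and arguments for each model are given in its own folder.

```python
# Minimal sketch of the common pattern used by the examples in this directory.
# The model path and prompt are placeholders; see each model folder's README
# for the exact, tested script and arguments.
import torch
from transformers import AutoTokenizer
from ipex_llm.transformers import AutoModelForCausalLM

model_path = "meta-llama/Llama-2-7b-chat-hf"  # placeholder: any supported Hugging Face model

# load_in_4bit=True applies IPEX-LLM INT4 optimization while the model is loaded
model = AutoModelForCausalLM.from_pretrained(model_path,
                                             load_in_4bit=True,
                                             trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

prompt = "What is AI?"
input_ids = tokenizer.encode(prompt, return_tensors="pt")
with torch.inference_mode():
    output = model.generate(input_ids, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```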

To run the examples, we recommend using Intel® Xeon® processors on servers, or 12th Gen (or later) Intel® Core™ processors on client machines.

IPEX-LLM supports Ubuntu 20.04 or later (glibc >= 2.17), CentOS 7 or later (glibc >= 2.17), and Windows 10/11.

## Best Known Configuration on Linux

For better performance, it is recommended to set environment variables on Linux using the `ipex-llm-init` script that ships with IPEX-LLM:

```bash
# install ipex-llm with the 'all' option
pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu

# configure the recommended environment variables
source ipex-llm-init
```
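
After installation and sourcing `ipex-llm-init`, pick a model folder from the table above and follow its README; the per-folder instructions also list any model-specific dependency pins (for example, particular `transformers` or `trl` versions) that the example requires.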