ipex-llm/python/llm/example/CPU/HF-Transformers-AutoModels/Model
Model example folders in this directory:

aquila, aquila2, baichuan, baichuan2, bluelm, chatglm, chatglm2, chatglm3, codegeex2, codegemma, codellama, codeshell, cohere, deciLM-7b, deepseek, deepseek-moe, distil-whisper, dolly_v1, dolly_v2, falcon, flan-t5, fuyu, gemma, glm-4v, glm4, internlm, internlm-xcomposer, internlm2, llama2, llama3, minicpm, mistral, mixtral, moss, mpt, phi-1_5, phi-2, phi-3, phi-3-vision, phixtral, phoenix, qwen, qwen-vl, qwen1.5, qwen2, redpajama, replit, skywork, solar, stablelm, starcoder, vicuna, whisper, wizardcoder-python, yi, yuan2, ziya

IPEX-LLM Transformers INT4 Optimization for Large Language Model

You can use IPEX-LLM to run any Hugging Face Transformers model with INT4 optimizations, on either servers or laptops. This directory contains example scripts to help you quickly get started with IPEX-LLM on some popular open-source models from the community. Each model has its own dedicated folder with detailed instructions on how to install and run it.
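Most of the text-generation examples in these folders share the same basic pattern: import the AutoModel class from ipex_llm.transformers instead of transformers, and pass load_in_4bit=True so the weights are converted to INT4 as they are loaded. The sketch below is only an illustration of that pattern, with a placeholder model ID and prompt; the exact script and arguments for each model are documented in its own folder.

import torch
from transformers import AutoTokenizer
from ipex_llm.transformers import AutoModelForCausalLM

model_path = "meta-llama/Llama-2-7b-chat-hf"  # placeholder model ID; each example folder targets its own model

# load_in_4bit=True applies IPEX-LLM's INT4 optimization while the weights are loaded
model = AutoModelForCausalLM.from_pretrained(model_path,
                                             load_in_4bit=True,
                                             trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

with torch.inference_mode():
    input_ids = tokenizer.encode("What is AI?", return_tensors="pt")
    output = model.generate(input_ids, max_new_tokens=32)
    print(tokenizer.decode(output[0], skip_special_tokens=True))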

To run these examples, we recommend using Intel® Xeon® processors on servers, or 12th Gen (or later) Intel® Core™ processors on client machines.

For OS, IPEX-LLM supports Ubuntu 20.04 or later (glibc>=2.17), CentOS 7 or later (glibc>=2.17), and Windows 10/11.

Best Known Configuration on Linux

For better performance, it is recommended to set a few environment variables on Linux using the ipex-llm-init script that ships with IPEX-LLM:

# install the latest pre-release ipex-llm with the 'all' option
pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
# export the recommended environment variables in the current shell
source ipex-llm-init
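After sourcing ipex-llm-init, run the chosen example from the same shell so the environment variables take effect. As a rough illustration (the exact script name and arguments vary per model folder; check each folder's README), a typical invocation looks like:

python ./generate.py --repo-id-or-model-path REPO_ID_OR_MODEL_PATH --prompt PROMPT --n-predict N_PREDICT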