ipex-llm/python/llm/example/CPU/HF-Transformers-AutoModels/Model
Model example folders: aquila, aquila2, baichuan, baichuan2, bluelm, chatglm, chatglm2, chatglm3, codellama, codeshell, deciLM-7b, deepseek, deepseek-moe, distil-whisper, dolly_v1, dolly_v2, falcon, flan-t5, fuyu, gemma, internlm, internlm-xcomposer, internlm2, llama2, mistral, mixtral, moss, mpt, phi-1_5, phi-2, phixtral, phoenix, qwen, qwen-vl, qwen1.5, redpajama, replit, skywork, solar, stablelm, starcoder, vicuna, whisper, wizardcoder-python, yi, yuan2, ziya

IPEX-LLM Transformers INT4 Optimization for Large Language Models

You can use IPEX-LLM to run any Hugging Face Transformers model with INT4 optimizations on either servers or laptops. This directory contains example scripts to help you quickly get started using IPEX-LLM to run some popular open-source models from the community. Each model has its own dedicated folder, where you can find detailed instructions on how to install and run it.
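The example scripts in each folder follow a common pattern: swap the standard Transformers model class for IPEX-LLM's drop-in replacement and pass load_in_4bit=True, which applies the INT4 optimization at load time. A minimal sketch of that pattern is below; the model id is a placeholder (each folder's README names the model it targets), and it assumes ipex-llm and its dependencies are already installed and the model weights are available locally or from the Hugging Face Hub.

```python
import torch
from transformers import AutoTokenizer
# IPEX-LLM's drop-in replacement for transformers.AutoModelForCausalLM
from ipex_llm.transformers import AutoModelForCausalLM

# Placeholder model id for illustration; see each folder's README for
# the model that example actually targets.
model_path = "meta-llama/Llama-2-7b-chat-hf"

# load_in_4bit=True applies IPEX-LLM's INT4 optimization while loading
model = AutoModelForCausalLM.from_pretrained(model_path,
                                             load_in_4bit=True,
                                             trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_path,
                                          trust_remote_code=True)

prompt = "What is AI?"
input_ids = tokenizer.encode(prompt, return_tensors="pt")
with torch.inference_mode():
    output = model.generate(input_ids, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

From here, the per-model examples differ mainly in prompt formatting and in which tokenizer/model classes they use (e.g. seq2seq models such as flan-t5 use the corresponding Seq2Seq classes instead).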

To run the examples, we recommend using Intel® Xeon® processors (server) or 12th Gen or later Intel® Core™ processors (client).

For operating systems, IPEX-LLM supports Ubuntu 20.04 or later (glibc>=2.17), CentOS 7 or later (glibc>=2.17), and Windows 10/11.

Best Known Configuration on Linux

For better performance, it is recommended to configure environment variables on Linux using the ipex-llm-init script that ships with IPEX-LLM:

pip install ipex-llm
source ipex-llm-init