# IPEX-LLM INT4 Optimization for Large Language Model

You can use the `optimize_model` API to accelerate general PyTorch models on Intel servers and PCs. This directory contains example scripts to help you quickly get started with IPEX-LLM on some popular open-source models in the community. Each model has its own dedicated folder, where you can find detailed instructions on how to install and run it.
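The basic pattern shared by these examples can be sketched as follows: load a model with Hugging Face `transformers`, pass it through `optimize_model`, and run inference as usual. This is a minimal sketch, not one of the bundled scripts; the model path and prompt are illustrative placeholders.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from ipex_llm import optimize_model

# Hypothetical model path -- substitute any supported model from this directory.
model_path = "meta-llama/Llama-2-7b-chat-hf"

# Load the model with transformers, then apply IPEX-LLM low-bit optimization.
model = AutoModelForCausalLM.from_pretrained(model_path,
                                             trust_remote_code=True)
model = optimize_model(model)  # applies the default low-bit (INT4) optimization

tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

with torch.inference_mode():
    inputs = tokenizer("What is AI?", return_tensors="pt")
    output = model.generate(inputs.input_ids, max_new_tokens=32)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Each model folder's README covers model-specific details (prompt formats, extra dependencies) that this sketch omits.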

To run the examples, we recommend using Intel® Xeon® processors (server) or 12th Gen and later Intel® Core™ processors (client).

For operating systems, IPEX-LLM supports Ubuntu 20.04 or later, CentOS 7 or later, and Windows 10/11.

## Best Known Configuration on Linux

For better performance, it is recommended to set the environment variables provided by IPEX-LLM on Linux:

```bash
pip install ipex-llm
source ipex-llm-init
```