| Model |
|---|
| aquila |
| aquila2 |
| baichuan |
| baichuan2 |
| bluelm |
| chatglm |
| chatglm2 |
| chatglm3 |
| codellama |
| codeshell |
| deciLM-7b |
| deepseek |
| deepseek-moe |
| distil-whisper |
| dolly_v1 |
| dolly_v2 |
| falcon |
| flan-t5 |
| fuyu |
| gemma |
| internlm |
| internlm-xcomposer |
| internlm2 |
| llama2 |
| llama3 |
| mistral |
| mixtral |
| moss |
| mpt |
| phi-1_5 |
| phi-2 |
| phixtral |
| phoenix |
| qwen |
| qwen-vl |
| qwen1.5 |
| redpajama |
| replit |
| skywork |
| solar |
| stablelm |
| starcoder |
| vicuna |
| whisper |
| wizardcoder-python |
| yi |
| yuan2 |
| ziya |
IPEX-LLM Transformers INT4 Optimization for Large Language Models
You can use IPEX-LLM to run any Hugging Face Transformers model with INT4 optimizations on either servers or laptops. This directory contains example scripts to help you quickly get started using IPEX-LLM with popular open-source models from the community. Each model has its own dedicated folder, where you can find detailed instructions on how to install the dependencies and run it.
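For reference, the INT4 optimization is applied through a drop-in replacement of the Transformers AutoModel classes. The snippet below is a minimal sketch rather than one of the example scripts; the model ID is illustrative, and `trust_remote_code` is only needed for models that ship custom code:

```python
# Minimal sketch: load a Hugging Face model with IPEX-LLM INT4 optimization on CPU.
# The model ID below is illustrative; see each model folder for the verified invocation.
from transformers import AutoTokenizer
from ipex_llm.transformers import AutoModelForCausalLM  # drop-in for transformers' AutoModelForCausalLM

model_path = "meta-llama/Llama-2-7b-chat-hf"  # any Hugging Face repo id or local path

# load_in_4bit=True applies the INT4 optimization when the weights are loaded
model = AutoModelForCausalLM.from_pretrained(model_path,
                                             load_in_4bit=True,
                                             trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
```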
Recommended Requirements
To run the examples, we recommend using Intel® Xeon® processors (server) or a 12th Gen or later Intel® Core™ processor (client).
For the operating system, IPEX-LLM supports Ubuntu 20.04 or later (glibc >= 2.17), CentOS 7 or later (glibc >= 2.17), and Windows 10/11.
Best Known Configuration on Linux
For better performance, it is recommended to set environment variables on Linux using the ipex-llm-init script shipped with IPEX-LLM:
pip install ipex-llm
source ipex-llm-init
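Once the environment is prepared, generation follows the standard Transformers flow. The sketch below continues the loading snippet above; the prompt and `max_new_tokens` value are illustrative, and each model folder's script handles model-specific prompt formatting and arguments:

```python
import torch

# Continues the loading sketch above (assumes `model` and `tokenizer` are already created).
prompt = "What is AI?"  # illustrative; the example scripts format prompts for each model

with torch.inference_mode():
    input_ids = tokenizer.encode(prompt, return_tensors="pt")
    output = model.generate(input_ids, max_new_tokens=32)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```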