ipex-llm/python/llm/example/GPU/HuggingFace/LLM
| Name | Last commit message | Last commit date |
|------|---------------------|------------------|
| aquila | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| aquila2 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| baichuan | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| baichuan2 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| bluelm | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| chatglm2 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| chatglm3 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| chinese-llama2 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| codegeex2 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| codegemma | fix gemma for 4.41 (#11531) | 2024-07-18 15:02:50 -07:00 |
| codellama | Reorganize MiniCPM-V-2_6 example & update others MiniCPM-V-2 exmaples (#11815) | 2024-08-16 14:48:56 +08:00 |
| codeshell | fix: add run oneAPI instruction for the example of codeshell (#11828) | 2024-08-16 14:29:06 +08:00 |
| cohere | Fix cohere model on transformers>=4.41 (#11575) | 2024-07-17 17:18:59 -07:00 |
| deciLM-7b | Reorganize MiniCPM-V-2_6 example & update others MiniCPM-V-2 exmaples (#11815) | 2024-08-16 14:48:56 +08:00 |
| deepseek | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| dolly-v1 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| dolly-v2 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| falcon | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| flan-t5 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| gemma | fix gemma for 4.41 (#11531) | 2024-07-18 15:02:50 -07:00 |
| gemma2 | add gemma2 example (#11724) | 2024-08-06 21:17:50 +08:00 |
| glm4 | Upgrade glm-4 example transformers version (#11659) | 2024-07-31 10:24:50 +08:00 |
| gpt-j | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| internlm | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| internlm2 | Reorganize MiniCPM-V-2_6 example & update others MiniCPM-V-2 exmaples (#11815) | 2024-08-16 14:48:56 +08:00 |
| llama2 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| llama3 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| llama3.1 | Add Llama3.1 example (#11689) | 2024-07-31 10:53:30 +08:00 |
| minicpm | fix minicpm for transformers>=4.39 (#11533) | 2024-07-18 15:01:57 -07:00 |
| mistral | update transformers version for replit-code-v1-3b, `internlm2-chat-… (#11811) | 2024-08-15 16:40:48 +08:00 |
| mixtral | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| mpt | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| phi-1_5 | phi model readme (#11595) | 2024-07-17 17:18:34 -07:00 |
| phi-2 | phi model readme (#11595) | 2024-07-17 17:18:34 -07:00 |
| phi-3 | phi-3 on "transformers>=4.37.0,<=4.42.3" (#11534) | 2024-07-17 17:19:57 -07:00 |
| phixtral | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| qwen | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| qwen1.5 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| qwen2 | optimize lookahead init time (#11769) | 2024-08-12 17:19:12 +08:00 |
| redpajama | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| replit | update transformers version for replit-code-v1-3b, `internlm2-chat-… (#11811) | 2024-08-15 16:40:48 +08:00 |
| rwkv4 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| rwkv5 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| solar | deletion of specification of transformers version (#11808) | 2024-08-15 15:23:32 +08:00 |
| stablelm | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| starcoder | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| vicuna | transformers==4.37, yi & yuan2 & vicuna (#11805) | 2024-08-15 15:39:24 +08:00 |
| yi | transformers==4.37, yi & yuan2 & vicuna (#11805) | 2024-08-15 15:39:24 +08:00 |
| yuan2 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| README.md | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |

# IPEX-LLM Transformers INT4 Optimization for Large Language Models on Intel GPUs

You can use IPEX-LLM to run almost any Hugging Face Transformers model with INT4 optimizations on a laptop or workstation equipped with Intel GPUs. This directory contains example scripts to help you quickly get started with IPEX-LLM on popular open-source models from the community. Each model has its own dedicated folder with detailed instructions on how to install the dependencies and run the example.
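
The per-model examples share a common pattern: load the checkpoint through `ipex_llm.transformers` with `load_in_4bit=True`, move the model to the Intel GPU (`xpu`) device, and generate with the standard Hugging Face API. The snippet below is a minimal sketch of that flow, assuming a hypothetical Llama 2 checkpoint path and default generation settings; the scripts inside each model's folder are the validated versions and take precedence over this sketch.

```python
# Minimal sketch (not a drop-in replacement for any specific example script):
# load a Hugging Face model with IPEX-LLM INT4 optimizations and run
# generation on an Intel GPU ("xpu" device).
import torch
from transformers import AutoTokenizer
from ipex_llm.transformers import AutoModelForCausalLM

# Hypothetical checkpoint path; substitute any supported model from the table above.
model_path = "meta-llama/Llama-2-7b-chat-hf"

# load_in_4bit=True applies IPEX-LLM's INT4 optimization while loading.
model = AutoModelForCausalLM.from_pretrained(model_path,
                                             load_in_4bit=True,
                                             optimize_model=True,
                                             trust_remote_code=True,
                                             use_cache=True)
model = model.half().to("xpu")  # move the optimized model to the Intel GPU

tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

with torch.inference_mode():
    input_ids = tokenizer.encode("What is AI?", return_tensors="pt").to("xpu")
    output = model.generate(input_ids, max_new_tokens=32)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Each model folder typically ships a ready-to-run `generate.py` along with instructions for setting up the conda environment and sourcing the oneAPI toolkit before running on the GPU; follow those folder-specific instructions for the exact requirements of each model.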