| Directory | Latest commit | Last updated |
|---|---|---|
| aquila | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| aquila2 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| baichuan | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| baichuan2 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| bluelm | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| chatglm2 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| chatglm3 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| chinese-llama2 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| codegeex2 | Codegeex2 tokenization fix (#11831) | 2024-08-16 15:48:47 +08:00 |
| codegemma | fix gemma for 4.41 (#11531) | 2024-07-18 15:02:50 -07:00 |
| codellama | change 5 pytorch/huggingface models to fp16 (#11894) | 2024-08-22 16:12:09 +08:00 |
| codeshell | fix: add run oneAPI instruction for the example of codeshell (#11828) | 2024-08-16 14:29:06 +08:00 |
| cohere | Fix cohere model on transformers>=4.41 (#11575) | 2024-07-17 17:18:59 -07:00 |
| deciLM-7b | Reorganize MiniCPM-V-2_6 example & update other MiniCPM-V-2 examples (#11815) | 2024-08-16 14:48:56 +08:00 |
| deepseek | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| dolly-v1 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| dolly-v2 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| falcon | refactor to remove old rope usage (#12224) | 2024-10-17 17:06:09 +08:00 |
| flan-t5 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| gemma | fix gemma for 4.41 (#11531) | 2024-07-18 15:02:50 -07:00 |
| gemma2 | add gemma2 example (#11724) | 2024-08-06 21:17:50 +08:00 |
| glm4 | Upgrade glm-4 example transformers version (#11659) | 2024-07-31 10:24:50 +08:00 |
| gpt-j | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| internlm | change 5 pytorch/huggingface models to fp16 (#11894) | 2024-08-22 16:12:09 +08:00 |
| internlm2 | delete transformers version requirement (#11845) | 2024-08-19 17:53:02 +08:00 |
| llama2 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| llama3 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| llama3.1 | add llama3.2 GPU example (#12137) | 2024-09-29 14:41:54 +08:00 |
| llama3.2 | add llama3.2 GPU example (#12137) | 2024-09-29 14:41:54 +08:00 |
| minicpm | fix minicpm for transformers>=4.39 (#11533) | 2024-07-18 15:01:57 -07:00 |
| minicpm3 | Add minicpm3 gpu example (#12114) | 2024-09-26 13:51:37 +08:00 |
| mistral | update transformers version for replit-code-v1-3b, `internlm2-chat-… (#11811) | 2024-08-15 16:40:48 +08:00 |
| mixtral | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| mpt | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| phi-1_5 | phi model readme (#11595) | 2024-07-17 17:18:34 -07:00 |
| phi-2 | phi model readme (#11595) | 2024-07-17 17:18:34 -07:00 |
| phi-3 | phi-3 on "transformers>=4.37.0,<=4.42.3" (#11534) | 2024-07-17 17:19:57 -07:00 |
| phixtral | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| qwen | Update ipex-llm default transformers version to 4.37.0 (#11859) | 2024-08-20 17:37:58 +08:00 |
| qwen1.5 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| qwen2 | Add Qwen2.5 GPU example (#12101) | 2024-09-20 15:55:57 +08:00 |
| qwen2.5 | Add Qwen2.5 GPU example (#12101) | 2024-09-20 15:55:57 +08:00 |
| redpajama | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| replit | update transformers version for replit-code-v1-3b, `internlm2-chat-… (#11811) | 2024-08-15 16:40:48 +08:00 |
| rwkv4 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| rwkv5 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| solar | change 5 pytorch/huggingface models to fp16 (#11894) | 2024-08-22 16:12:09 +08:00 |
| stablelm | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| starcoder | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| vicuna | transformers==4.37, yi & yuan2 & vicuna (#11805) | 2024-08-15 15:39:24 +08:00 |
| yi | Pytorch models transformers version update (#11860) | 2024-08-20 18:01:42 +08:00 |
| yuan2 | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| README.md | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |