ipex-llm/python/llm
Latest commit: ChatGLM3, Baichuan2 and Qwen1.5 QLoRA example (#11078)
Qiyuan Gong, 2024-05-21 15:29:43 +08:00
* Add chatglm3, qwen15-7b and baichuan-7b QLoRA alpaca example
* Remove unnecessary tokenization setting.
A sketch of this kind of QLoRA setup appears after the directory listing below.
| Name | Last commit | Last commit date |
| --- | --- | --- |
| dev | Fix tgi_api_server error file name (#11075) | 2024-05-20 16:48:40 +08:00 |
| example | ChatGLM3, Baichuan2 and Qwen1.5 QLoRA example (#11078) | 2024-05-21 15:29:43 +08:00 |
| portable-zip | Fix baichuan-13b issue on portable zip under transformers 4.36 (#10746) | 2024-04-12 16:27:01 -07:00 |
| scripts | Add driver related packages version check in env script (#10977) | 2024-05-10 15:02:58 +08:00 |
| src/ipex_llm | refactor qwen (#11074) | 2024-05-20 18:08:37 +08:00 |
| test | refactor qwen (#11074) | 2024-05-20 18:08:37 +08:00 |
| .gitignore | [LLM] add chatglm pybinding binary file release (#8677) | 2023-08-04 11:45:27 +08:00 |
| setup.py | LLM: Install CPU version torch with extras [all] (#10868) | 2024-05-16 10:39:55 +08:00 |
| version.txt | Update setup.py and add new actions and add compatible mode (#25) | 2024-03-22 15:44:59 +08:00 |
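
For context on the QLoRA example commit above, here is a minimal, hedged sketch of what a QLoRA fine-tuning setup with ipex-llm typically looks like. It assumes an Intel GPU ("xpu") device and the ipex-llm QLoRA helpers used by the existing alpaca examples; the model name, LoRA hyperparameters and target modules are illustrative assumptions, not the exact code added in #11078.

```python
# Minimal QLoRA sketch in the spirit of the examples under example/.
# Assumptions: an Intel GPU ("xpu"), ipex-llm installed with XPU support,
# and illustrative model/LoRA settings (not the exact example code).
import torch
from transformers import AutoTokenizer
from peft import LoraConfig
from ipex_llm.transformers import AutoModelForCausalLM
from ipex_llm.transformers.qlora import get_peft_model, prepare_model_for_kbit_training

# Illustrative choice; the commit also adds ChatGLM3 and Baichuan2 variants.
base_model = "Qwen/Qwen1.5-7B-Chat"

tokenizer = AutoTokenizer.from_pretrained(base_model, trust_remote_code=True)

# Load the base model with 4-bit NF4 weights so LoRA adapters can be trained on top.
model = AutoModelForCausalLM.from_pretrained(
    base_model,
    load_in_low_bit="nf4",
    optimize_model=False,
    torch_dtype=torch.bfloat16,
    modules_to_not_convert=["lm_head"],
    trust_remote_code=True,
)
model = model.to("xpu")
model = prepare_model_for_kbit_training(model)

# Attach LoRA adapters; target modules differ per architecture, so treat these as placeholders.
lora_config = LoraConfig(
    r=8,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj"],
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# From here, a transformers Trainer (or the alpaca fine-tuning script in example/)
# would run the actual training loop on an instruction dataset.
```

The NF4 load path is what makes this QLoRA rather than plain LoRA: the base weights stay in 4-bit form while only the adapter matrices are trained in bfloat16.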