ipex-llm/python/llm
Latest commit: 120a0035ac by Qiyuan Gong
Fix type mismatch in eval for Baichuan2 QLora example (#11117)
* During the evaluation stage, Baichuan2 raises a type-mismatch error when trained with bfloat16. Fix this issue by modifying modeling_baichuan.py, and add documentation describing how to modify this file.
2024-05-24 14:14:30 +08:00
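
The fix described above is essentially a dtype alignment in the model's evaluation path. Below is a minimal illustrative sketch of that kind of change, assuming the error comes from float32 activations meeting bfloat16 weights during eval; the class and function names are hypothetical stand-ins, not the actual contents of modeling_baichuan.py or the patch in #11117.

```python
# Illustrative sketch only -- NOT the actual patch from #11117.
# General shape of a dtype-mismatch fix for bfloat16 evaluation: cast the
# incoming activations to the layer's weight dtype before the matmul.
# "NormHeadSketch" is a hypothetical stand-in, not Baichuan2's real class.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NormHeadSketch(nn.Module):
    """Hypothetical output head with a normalized weight matrix."""

    def __init__(self, hidden_size: int, vocab_size: int):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(vocab_size, hidden_size))

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # The fix: align the activation dtype with the weight dtype
        # (e.g. bfloat16) so eval does not fail with a Float/BFloat16 mismatch.
        hidden_states = hidden_states.to(self.weight.dtype)
        norm_weight = F.normalize(self.weight)
        return F.linear(hidden_states, norm_weight)


if __name__ == "__main__":
    head = NormHeadSketch(hidden_size=8, vocab_size=16).to(torch.bfloat16)
    logits = head(torch.randn(2, 8))  # float32 activations, bfloat16 weights
    print(logits.dtype)               # torch.bfloat16
```
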
| Name         | Last commit message                                                     | Last commit date           |
|--------------|-------------------------------------------------------------------------|----------------------------|
| dev          | Update linux igpu run script (#11098)                                   | 2024-05-22 17:18:07 +08:00 |
| example      | Fix type mismatch in eval for Baichuan2 QLora example (#11117)          | 2024-05-24 14:14:30 +08:00 |
| portable-zip | Fix baichuan-13b issue on portable zip under transformers 4.36 (#10746) | 2024-04-12 16:27:01 -07:00 |
| scripts      | Add driver related packages version check in env script (#10977)        | 2024-05-10 15:02:58 +08:00 |
| src/ipex_llm | optimize internlm2 xcomposer again (#11124)                             | 2024-05-24 13:44:52 +08:00 |
| test         | Update tests for transformers 4.36 (#10858)                             | 2024-05-24 10:26:38 +08:00 |
| .gitignore   | [LLM] add chatglm pybinding binary file release (#8677)                 | 2023-08-04 11:45:27 +08:00 |
| setup.py     | Update tests for transformers 4.36 (#10858)                             | 2024-05-24 10:26:38 +08:00 |
| version.txt  | Update setup.py and add new actions and add compatible mode (#25)       | 2024-03-22 15:44:59 +08:00 |