ipex-llm/python/llm
Latest commit b38fb67bec by Yina Chen: [NPU] lm head to cpu (#11943)
* lm head to cpu
* qwen2
* move logic and add a parameter to disable cpu_lm_head
* use an env var and move the lm_head optimization to the mp file
* fix
* update
* remove print
2024-08-28 16:34:07 +08:00
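The commit above runs the final lm_head projection on the CPU while the rest of the model stays on the NPU, with an env-var switch to disable the behavior. A minimal sketch of that idea, assuming a Hugging Face-style model whose last layer is an `lm_head` Linear module; the helper name and the `IPEX_LLM_CPU_LM_HEAD` variable are illustrative assumptions, not ipex-llm's actual API:

```python
import os
import torch

def maybe_move_lm_head_to_cpu(model):
    """Sketch of the [NPU] lm-head-to-CPU optimization (names are assumptions).

    Moves the large-vocab output projection to the CPU and wraps its
    forward so incoming hidden states are copied to the CPU first.
    Set IPEX_LLM_CPU_LM_HEAD=0 (hypothetical env var) to opt out.
    """
    if os.environ.get("IPEX_LLM_CPU_LM_HEAD", "1") == "0":
        return model

    lm_head = getattr(model, "lm_head", None)
    if isinstance(lm_head, torch.nn.Linear):
        lm_head.to("cpu")  # nn.Module.to moves parameters in place

        orig_forward = lm_head.forward

        def forward_on_cpu(hidden_states):
            # Copy hidden states to CPU before the projection; a no-op
            # when they are already there.
            return orig_forward(hidden_states.to("cpu"))

        # Shadow the bound forward so module __call__ uses the wrapper.
        lm_head.forward = forward_on_cpu
    return model
```

The win comes from the lm_head weight being vocab-size wide (often the single largest matmul in decoding), so offloading just that layer frees NPU memory without touching the attention/MLP blocks.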
Name          Last commit                                              Date
dev           Quick fix benchmark script (#11938)                      2024-08-27 15:29:27 +08:00
example       Update llamaindex examples (#11940)                      2024-08-28 14:03:44 +08:00
portable-zip  Fix null pointer dereferences error. (#11125)            2024-05-30 16:16:10 +08:00
scripts       fix typo in python/llm/scripts/README.md (#11536)        2024-07-09 09:53:14 +08:00
src/ipex_llm  [NPU] lm head to cpu (#11943)                            2024-08-28 16:34:07 +08:00
test          update mlp of llama (#11897)                             2024-08-22 20:34:53 +08:00
tpp           OSPDT: add tpp licenses (#11165)                         2024-06-06 10:59:06 +08:00
.gitignore    [LLM] add chatglm pybinding binary file release (#8677)  2023-08-04 11:45:27 +08:00
setup.py      update CORE_XE_VERSION to 2.6.0 (#11929)                 2024-08-27 13:10:13 +08:00
version.txt   Update pypi tag to 2.2.0.dev0 (#11895)                   2024-08-22 16:48:09 +08:00