ipex-llm/python/llm

Latest commit: 0819fad34e by Ruonan Wang
support Llama2-7B / Llama3-8B for NPU C++ (#12431)
2024-11-22 18:47:19 +08:00

Commit message body:
* support llama2
* update
* support fused_layers=4 for Llama2-7B
Name          Last commit                                                       Date
------------  ----------------------------------------------------------------  --------------------------
dev           Support vpm and resampler module of minicpm-v on NPU (#12375)     2024-11-12 15:59:55 +08:00
example       support Llama2-7B / Llama3-8B for NPU C++ (#12431)                2024-11-22 18:47:19 +08:00
portable-zip  Fix null pointer dereference error. (#11125)                      2024-05-30 16:16:10 +08:00
scripts       fix typo in python/llm/scripts/README.md (#11536)                 2024-07-09 09:53:14 +08:00
src/ipex_llm  support Llama2-7B / Llama3-8B for NPU C++ (#12431)                2024-11-22 18:47:19 +08:00
test          Add MiniCPM-V-2_6 to arc perf test (#12349)                       2024-11-06 16:32:28 +08:00
tpp           OSPDT: add tpp licenses (#11165)                                  2024-06-06 10:59:06 +08:00
.gitignore    [LLM] add chatglm pybinding binary file release (#8677)           2023-08-04 11:45:27 +08:00
setup.py      Upgrade dependency for xpu_lnl and xpu_arl option (#12424)        2024-11-21 18:37:15 +08:00
version.txt   Update pypi tag to 2.2.0.dev0 (#11895)                            2024-08-22 16:48:09 +08:00