ipex-llm/python/llm
Jinhe d0c89fb715
updated llama.cpp and ollama quickstart (#11732)
* updated llama.cpp and ollama quickstart.md

* added qwen2-1.5B sample output

* revision on quickstart updates

* revision on quickstart updates

* revision on qwen2 readme

* added 2 troubleshoots

* troubleshoot revision
2024-08-08 11:04:01 +08:00
dev Add benchmark util for transformers 4.42 (#11725) 2024-08-07 08:48:07 +08:00
example updated llama.cpp and ollama quickstart (#11732) 2024-08-08 11:04:01 +08:00
portable-zip Fix null pointer dereferences error. (#11125) 2024-05-30 16:16:10 +08:00
scripts fix typo in python/llm/scripts/README.md (#11536) 2024-07-09 09:53:14 +08:00
src/ipex_llm support and optimize minicpm-v-2_6 (#11738) 2024-08-07 18:21:16 +08:00
test Use merge_qkv to replace fused_qkv for llama2 (#11727) 2024-08-07 18:04:01 +08:00
tpp OSPDT: add tpp licenses (#11165) 2024-06-06 10:59:06 +08:00
.gitignore [LLM] add chatglm pybinding binary file release (#8677) 2023-08-04 11:45:27 +08:00
setup.py update doc/setup to use onednn gemm for cpp (#11598) 2024-07-18 13:04:38 +08:00
version.txt Update setup.py and add new actions and add compatible mode (#25) 2024-03-22 15:44:59 +08:00