ipex-llm/python/llm
Latest commit: cfed76b2ed by Cengguang Zhang, 2024-05-10 16:40:15 +08:00
LLM: add long-context support for Qwen1.5-7B/Baichuan2-7B/Mistral-7B. (#10937)
* LLM: add split tensor support for baichuan2-7b and qwen1.5-7b.
* fix style.
* fix style.
* fix style.
* add support for mistral and fix condition threshold.
* fix style.
* fix comments.
Name | Last commit | Last commit date
dev | update (#10944) | 2024-05-08 14:28:05 +08:00
example | Add cohere example (#10954) | 2024-05-08 17:19:59 +08:00
portable-zip | Fix baichuan-13b issue on portable zip under transformers 4.36 (#10746) | 2024-04-12 16:27:01 -07:00
scripts | Add driver related packages version check in env script (#10977) | 2024-05-10 15:02:58 +08:00
src/ipex_llm | LLM: add long-context support for Qwen1.5-7B/Baichuan2-7B/Mistral-7B. (#10937) | 2024-05-10 16:40:15 +08:00
test | Update igpu perf internlm (#10958) | 2024-05-08 14:16:43 +08:00
.gitignore | [LLM] add chatglm pybinding binary file release (#8677) | 2023-08-04 11:45:27 +08:00
setup.py | Support llama-index install option for upstreaming purposes (#10866) | 2024-04-23 19:08:29 +08:00
version.txt | Update setup.py and add new actions and add compatible mode (#25) | 2024-03-22 15:44:59 +08:00