ipex-llm/python/llm
Latest commit: Cengguang Zhang 58b57177e3
LLM: support bigdl quantize kv cache env and add warning. (#10623)
* LLM: support bigdl quantize kv cache env and add warning.
* fix style.
* fix comments.
2024-04-02 15:41:08 +08:00
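For context, commit #10623 is about reading a quantized-KV-cache toggle from the environment and warning when the legacy BigDL-LLM variable is used. Below is a minimal sketch of what such env handling could look like; the variable names IPEX_LLM_QUANTIZE_KV_CACHE and BIGDL_QUANTIZE_KV_CACHE and the helper use_quantize_kv_cache are assumptions inferred from the commit title, not verified against the code under src/ipex_llm.

```python
import os
import warnings


def use_quantize_kv_cache() -> bool:
    """Return True if quantized KV cache is enabled via an environment variable.

    Hypothetical sketch: variable names are assumptions based on the commit
    title "support bigdl quantize kv cache env and add warning" (#10623).
    """
    # Prefer the newer IPEX-LLM variable if it is set.
    if "IPEX_LLM_QUANTIZE_KV_CACHE" in os.environ:
        return os.environ["IPEX_LLM_QUANTIZE_KV_CACHE"] == "1"
    # Fall back to the legacy BigDL-LLM variable, but warn users to migrate.
    if "BIGDL_QUANTIZE_KV_CACHE" in os.environ:
        warnings.warn(
            "BIGDL_QUANTIZE_KV_CACHE is deprecated; "
            "please use IPEX_LLM_QUANTIZE_KV_CACHE instead."
        )
        return os.environ["BIGDL_QUANTIZE_KV_CACHE"] == "1"
    return False
```

Under these assumptions, running something like `BIGDL_QUANTIZE_KV_CACHE=1 python generate.py` would still enable the feature while emitting the migration warning.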
Name | Last commit message | Last commit date
dev | LLM: remove ipex.optimize for gpt-j (#10606) | 2024-04-01 12:21:49 +08:00
example | LLM: remove ipex.optimize for gpt-j (#10606) | 2024-04-01 12:21:49 +08:00
portable-zip | Update_document by heyang (#30) | 2024-03-25 10:06:02 +08:00
scripts | LLM: check user env (#10580) | 2024-03-29 17:19:34 +08:00
src/ipex_llm | LLM: support bigdl quantize kv cache env and add warning. (#10623) | 2024-04-02 15:41:08 +08:00
test | Modify the link in Langchain-upstream ut (#10608) | 2024-04-01 17:03:40 +08:00
.gitignore | [LLM] add chatglm pybinding binary file release (#8677) | 2023-08-04 11:45:27 +08:00
setup.py | Update pip install to use --extra-index-url for ipex package (#10557) | 2024-03-28 09:56:23 +08:00
version.txt | Update setup.py and add new actions and add compatible mode (#25) | 2024-03-22 15:44:59 +08:00