ipex-llm/python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations (at commit 6584539c91)
Latest commit: 7ed9538b9f by Wang, Jian4, "LLM: support gguf mpt (#9773)" (body: add gguf mpt; update), 2023-12-28 09:22:39 +08:00
AWQ   Support for Mixtral AWQ (#9775)                         2023-12-25 16:08:09 +08:00
GGUF  LLM: support gguf mpt (#9773)                           2023-12-28 09:22:39 +08:00
GPTQ  Uing bigdl-llm-init instead of bigdl-nano-init (#9558)  2023-11-30 10:10:29 +08:00