ipex-llm/python/llm/example/GPU/HF-Transformers-AutoModels/Advanced-Quantizations
Latest commit b8437a1c1e by Wang, Jian4: LLM: Add gguf mistral model support (#9691)
* add mistral support
* need to upgrade transformers version
* update
2023-12-15 13:37:39 +08:00
AWQ     [LLM]Add Yi-34B-AWQ to verified AWQ model. (#9676)              2023-12-14 09:55:47 +08:00
GGUF    LLM: Add gguf mistral model support (#9691)                      2023-12-15 13:37:39 +08:00
GPTQ    Support directly loading gptq models from huggingface (#9391)    2023-11-13 20:48:12 -08:00
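
The AWQ, GGUF, and GPTQ folders each contain a GPU generation example built on the Hugging Face AutoModel interface. As a rough illustration of what the GGUF example (added in #9691) exercises, the sketch below assumes the from_gguf loader exposed by bigdl-llm/ipex-llm around this commit; the exact parameter names, return values, and the model path are assumptions rather than the repository's verbatim example, so consult the GGUF folder's own script for the authoritative usage.

# Hedged sketch: loading a GGUF file for generation on an Intel GPU.
# Assumptions: the bigdl.llm package name at this commit, that from_gguf
# returns (model, tokenizer), and the placeholder GGUF path below.
import torch
import intel_extension_for_pytorch as ipex  # registers the "xpu" device
from bigdl.llm.transformers import AutoModelForCausalLM

GGUF_PATH = "./mistral-7b-instruct-v0.1.Q4_K_M.gguf"  # placeholder path

# The tokenizer is assumed to be rebuilt from metadata embedded in the GGUF file.
model, tokenizer = AutoModelForCausalLM.from_gguf(GGUF_PATH)
model = model.to("xpu")  # move the low-bit model to the Intel GPU

prompt = "What is AI?"
input_ids = tokenizer.encode(prompt, return_tensors="pt").to("xpu")
with torch.inference_mode():
    output = model.generate(input_ids, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))

The AWQ and GPTQ examples follow the same pattern, except that the model is loaded with from_pretrained on a quantized Hugging Face checkpoint instead of a local GGUF file.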