ipex-llm / python / llm / example / GPU / HF-Transformers-AutoModels / Advanced-Quantizations (at 4ceefc9b18)
Latest commit: a54cd767b1 by Wang, Jian4, "LLM: Add gguf falcon (#9801)" (init falcon; update convert.py; update style), 2024-01-03 14:49:02 +08:00
Directory   Latest commit                                                                   Date
AWQ         Support for Mixtral AWQ (#9775)                                                 2023-12-25 16:08:09 +08:00
GGUF        LLM: Add gguf falcon (#9801)                                                    2024-01-03 14:49:02 +08:00
GPTQ        Revert "[LLM] IPEX auto importer turn on by default for XPU (#9730)" (#9759)    2023-12-22 16:38:24 +08:00