ipex-llm / python / llm / example / GPU / HF-Transformers-AutoModels / Advanced-Quantizations (browsing commit f9a199900d)
Latest commit ab9f7f3ac5 by Shaojun Liu (2024-06-26 15:36:22 +08:00):
FIX: Qwen1.5-GPTQ-Int4 inference error (#11432)
* merge_qkv if quant_method is 'gptq'
* fix python style checks
* refactor
* update GPU example
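The fix above fuses the attention's Q/K/V projections only when the checkpoint's quantization method is GPTQ. A minimal sketch of that kind of gating is shown below; the function and attribute names (maybe_merge_qkv, quantization_config.quant_method, q_proj/k_proj/v_proj, qkv_proj) are illustrative assumptions, not the exact ipex-llm internals.

```python
# Illustrative sketch only: gate QKV fusion on the checkpoint's quantization method.
# maybe_merge_qkv and the attribute names are hypothetical stand-ins for the actual
# ipex-llm optimization hooks referenced by the commit message.
import torch
import torch.nn as nn


def maybe_merge_qkv(attn: nn.Module, config) -> None:
    """Fuse separate q/k/v projections into one linear, but only for GPTQ checkpoints."""
    quant_cfg = getattr(config, "quantization_config", None)
    quant_method = getattr(quant_cfg, "quant_method", None)
    if quant_method != "gptq":
        return  # leave non-GPTQ models untouched

    q, k, v = attn.q_proj, attn.k_proj, attn.v_proj
    merged = nn.Linear(
        q.in_features,
        q.out_features + k.out_features + v.out_features,
        bias=q.bias is not None,
    )
    with torch.no_grad():
        merged.weight.copy_(torch.cat([q.weight, k.weight, v.weight], dim=0))
        if q.bias is not None:
            merged.bias.copy_(torch.cat([q.bias, k.bias, v.bias], dim=0))
    # Downstream attention code would then call the fused projection once.
    attn.qkv_proj = merged
```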
Directory   Last commit                                                    Last updated
AWQ         Upgrade to python 3.11 (#10711)                                2024-04-09 17:41:17 +08:00
GGUF        Miniconda/Anaconda -> Miniforge update in examples (#11194)    2024-06-04 10:14:02 +08:00
GGUF-IQ2    Upgrade to python 3.11 (#10711)                                2024-04-09 17:41:17 +08:00
GPTQ        FIX: Qwen1.5-GPTQ-Int4 inference error (#11432)                2024-06-26 15:36:22 +08:00
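Each subdirectory holds a runnable GPU example for one advanced quantization format (AWQ, GGUF, GGUF-IQ2, GPTQ). As rough orientation, the GPTQ example follows the usual ipex-llm loading pattern sketched below; the model id and generation settings are placeholders, the snippet assumes an XPU-enabled PyTorch environment, and the exact arguments in the shipped example may differ.

```python
# Minimal sketch of loading a GPTQ-quantized model with ipex-llm on an Intel GPU (XPU).
# The model id and generation settings are placeholders; consult the GPTQ/ example in
# this directory for the exact, supported invocation.
import torch
from transformers import AutoTokenizer
from ipex_llm.transformers import AutoModelForCausalLM

model_path = "Qwen/Qwen1.5-7B-Chat-GPTQ-Int4"  # placeholder GPTQ checkpoint

# Load the GPTQ checkpoint and let ipex-llm keep it in low-bit form.
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    load_in_4bit=True,
    torch_dtype=torch.float,
    trust_remote_code=True,
)
model = model.to("xpu")  # move the optimized model to the Intel GPU

tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
inputs = tokenizer("What is AI?", return_tensors="pt").to("xpu")
with torch.inference_mode():
    output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```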