ipex-llm/python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations
Latest commit: FIX: Qwen1.5-GPTQ-Int4 inference error (#11432)
Author: Shaojun Liu (ab9f7f3ac5), 2024-06-26 15:36:22 +08:00

* merge_qkv if quant_method is 'gptq'
* fix python style checks
* refactor
* update GPU example
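The commit body above only states the idea of the fix: the q/k/v projections are merged only when the checkpoint's quantization method is GPTQ. A minimal sketch of that idea follows; it is not ipex-llm's actual code, and the helper names (is_gptq_model, merge_qkv) and the q_proj/k_proj/v_proj attribute layout are illustrative assumptions. The quant_method check mirrors the "quant_method is 'gptq'" condition named in the commit.

```python
# Hypothetical sketch (not ipex-llm's actual implementation) of
# "merge_qkv if quant_method is 'gptq'": fuse the separate q/k/v
# projections only when the model was quantized with GPTQ.
import torch
import torch.nn as nn


def is_gptq_model(config) -> bool:
    """Return True when the HF config carries a GPTQ quantization_config."""
    quant_cfg = getattr(config, "quantization_config", None)
    if quant_cfg is None:
        return False
    # quantization_config may be a plain dict or a GPTQConfig-like object.
    if isinstance(quant_cfg, dict):
        method = quant_cfg.get("quant_method")
    else:
        method = getattr(quant_cfg, "quant_method", None)
    return str(method).lower() == "gptq"


def merge_qkv(attn: nn.Module) -> None:
    """Fuse attn.q_proj / k_proj / v_proj (assumed nn.Linear) into one qkv_proj."""
    q, k, v = attn.q_proj, attn.k_proj, attn.v_proj
    out_features = q.out_features + k.out_features + v.out_features
    qkv = nn.Linear(q.in_features, out_features, bias=q.bias is not None)
    with torch.no_grad():
        # Linear weights have shape [out_features, in_features]; stack along dim 0.
        qkv.weight.copy_(torch.cat([q.weight, k.weight, v.weight], dim=0))
        if q.bias is not None:
            qkv.bias.copy_(torch.cat([q.bias, k.bias, v.bias], dim=0))
    attn.qkv_proj = qkv
    del attn.q_proj, attn.k_proj, attn.v_proj
```

Under these assumptions the guard would be applied per attention layer, e.g. `if is_gptq_model(model.config): merge_qkv(layer.self_attn)`, so non-GPTQ checkpoints are left untouched.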
Name   Latest commit                                                     Date
AWQ    LLM: Modify CPU Installation Command for most examples (#11049)   2024-05-17 15:52:20 +08:00
GGUF   Miniconda/Anaconda -> Miniforge update in examples (#11194)       2024-06-04 10:14:02 +08:00
GPTQ   FIX: Qwen1.5-GPTQ-Int4 inference error (#11432)                   2024-06-26 15:36:22 +08:00