ipex-llm/python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations
Latest commit: 66e286a73d by Heyang Sun, "Support for Mixtral AWQ" (#9775), 2023-12-25 16:08:09 +08:00
AWQ Support for Mixtral AWQ (#9775) 2023-12-25 16:08:09 +08:00
GGUF LLM: Add bloom gguf support (#9734) 2023-12-21 14:06:25 +08:00
GPTQ Using bigdl-llm-init instead of bigdl-nano-init (#9558) 2023-11-30 10:10:29 +08:00
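Each subdirectory (AWQ, GGUF, GPTQ) contains a CPU example for loading a pre-quantized checkpoint through bigdl-llm's transformers-style API. As a minimal sketch only, not taken verbatim from these examples, loading an AWQ checkpoint typically follows the pattern below; the model id, prompt, and generation settings are illustrative assumptions.

```python
# Minimal sketch: load an AWQ-quantized checkpoint with bigdl-llm's
# transformers-style API and run a short generation on CPU.
# The model id and prompt are illustrative, not from the example itself.
from transformers import AutoTokenizer
from bigdl.llm.transformers import AutoModelForCausalLM

model_path = "TheBloke/Llama-2-7B-Chat-AWQ"  # assumed example AWQ checkpoint

# load_in_4bit=True converts the weights to bigdl-llm's 4-bit format at load time
model = AutoModelForCausalLM.from_pretrained(model_path,
                                             load_in_4bit=True,
                                             trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

prompt = "What is AWQ quantization?"
input_ids = tokenizer.encode(prompt, return_tensors="pt")
output = model.generate(input_ids, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```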