ipex-llm/python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations
Subdirectories:
  AWQ    latest commit: "LLM: support Mistral AWQ models" (#9520), 2023-11-24 16:20:22 +08:00
  GPTQ   latest commit: "Support directly loading gptq models from huggingface" (#9391), 2023-11-13 20:48:12 -08:00