ipex-llm/python/llm/example/GPU/HF-Transformers-AutoModels/Advanced-Quantizations
| Directory | Last commit | Date |
| --- | --- | --- |
| AWQ | verfiy codeLlama (#9668) | 2023-12-13 15:39:31 +08:00 |
| GGUF | Update GGUF readme (#9611) | 2023-12-06 18:21:54 +08:00 |
| GPTQ | Support directly loading gptq models from huggingface (#9391) | 2023-11-13 20:48:12 -08:00 |
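These subdirectories hold the Advanced-Quantizations examples for running AWQ, GGUF, and GPTQ models with ipex-llm's Hugging Face `transformers`-style API on Intel GPUs. As a rough illustration only (the checkpoint id, keyword arguments, prompt, and generation settings below are assumptions, not taken from this listing), directly loading a quantized checkpoint from Hugging Face typically looks like this:

```python
# A minimal sketch, assuming ipex-llm's Hugging Face-style AutoModel API;
# the model id and all settings here are illustrative, not from the repo.
import torch
import intel_extension_for_pytorch as ipex  # imported for its side effect of enabling the "xpu" device
from ipex_llm.transformers import AutoModelForCausalLM
from transformers import AutoTokenizer

model_path = "TheBloke/Llama-2-7B-GPTQ"  # hypothetical GPTQ checkpoint on Hugging Face

# ipex-llm converts the quantized weights to its own low-bit format while loading
model = AutoModelForCausalLM.from_pretrained(model_path,
                                             load_in_4bit=True,
                                             torch_dtype=torch.float,
                                             trust_remote_code=True).to("xpu")
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

with torch.inference_mode():
    input_ids = tokenizer("What is AI?", return_tensors="pt").input_ids.to("xpu")
    output = model.generate(input_ids, max_new_tokens=32)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The per-directory READMEs document the exact models, install steps, and run commands for each quantization format.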