ipex-llm/python/llm/example
Qiyuan Gong f2e923b3ca
Axolotl v0.4.0 support (#10773)
* Add Axolotl 0.4.0; remove legacy 0.3.0 support.
* Replace is_torch_bf16_gpu_available.
* Add HF_HUB_OFFLINE=1.
* Move transformers out of requirements.
* Refine README and qlora.yml.
2024-04-17 09:49:11 +08:00
CPU    fix internlm-chat-7b-8k repo name in examples (#10747)    2024-04-12 10:15:48 -07:00
GPU    Axolotl v0.4.0 support (#10773)                           2024-04-17 09:49:11 +08:00