ipex-llm/python/llm/example/NPU/HF-Transformers-AutoModels

| Name       | Last commit                                                              | Last updated               |
|------------|--------------------------------------------------------------------------|----------------------------|
| LLM        | Support qwen2-1.5b with fused decoderlayer optimization on NPU (#11888)  | 2024-08-22 11:09:12 +08:00 |
| Multimodal | Update npu multimodal example (#11773)                                   | 2024-08-13 14:14:59 +08:00 |