ipex-llm/python/llm/example
[NPU] Support l0 Llama groupwise (#12276)
Yina Chen, 4467645088, 2024-10-28 17:06:55 +08:00

* except lm_head
* remove
* support gw lm_head
* update
* fix
* remove run.bat
* fix style
* support llama3
CPU                              refactor to remove old rope usage (#12224)   2024-10-17 17:06:09 +08:00
GPU                              refactor to remove old rope usage (#12224)   2024-10-17 17:06:09 +08:00
NPU/HF-Transformers-AutoModels   [NPU] Support l0 Llama groupwise (#12276)    2024-10-28 17:06:55 +08:00