ipex-llm/python
Yang Wang, fcb1c618a0: using bigdl-llm fused rope for llama (#9066)
* optimize llama xpu rope

* fix bug

* fix style

* refine append cache

* remove check

* do not cache cos sin

* remove unnecessary changes

* clean up

* fix style

* check for training
Committed: 2023-10-06 09:57:29 -07:00
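
This commit swaps LLaMA's element-wise rotary position embedding (RoPE) on XPU for BigDL-LLM's fused kernel. As context, below is a minimal, PyTorch-only sketch of the unfused RoPE math that such a kernel replaces, with on-the-fly cos/sin computation (cf. "do not cache cos sin") and an inference-only dispatch (cf. "check for training"). All function names here are illustrative assumptions, not BigDL-LLM/IPEX-LLM APIs.

```python
# Hedged sketch: reference (unfused) rotary position embedding in plain PyTorch.
# None of these names come from bigdl-llm; they only illustrate the math a fused
# XPU kernel would compute in one pass over q and k.
import torch


def rotate_half(x: torch.Tensor) -> torch.Tensor:
    """Swap and negate the two halves of the last dimension: (x1, x2) -> (-x2, x1)."""
    x1, x2 = x.chunk(2, dim=-1)
    return torch.cat((-x2, x1), dim=-1)


def build_cos_sin(seq_len: int, head_dim: int, device, dtype, base: float = 10000.0):
    """Compute cos/sin tables on the fly instead of caching them."""
    inv_freq = 1.0 / (base ** (torch.arange(0, head_dim, 2, device=device).float() / head_dim))
    positions = torch.arange(seq_len, device=device).float()
    freqs = torch.outer(positions, inv_freq)          # (seq_len, head_dim // 2)
    emb = torch.cat((freqs, freqs), dim=-1)           # (seq_len, head_dim)
    return emb.cos().to(dtype), emb.sin().to(dtype)


def apply_rope_reference(q, k, cos, sin):
    """Unfused reference: q/k are (batch, heads, seq, head_dim); cos/sin broadcast over them."""
    q_rot = (q * cos) + (rotate_half(q) * sin)
    k_rot = (k * cos) + (rotate_half(k) * sin)
    return q_rot, k_rot


def apply_rope(q, k, cos, sin, training: bool):
    """Dispatch: use a fused kernel only at inference time, otherwise fall back."""
    if not training:
        # Hypothetical fused path, e.g. a single kernel rotating q and k in place;
        # the real bigdl-llm call is not reproduced here.
        pass
    return apply_rope_reference(q, k, cos, sin)


if __name__ == "__main__":
    q = torch.randn(1, 8, 16, 64)
    k = torch.randn(1, 8, 16, 64)
    cos, sin = build_cos_sin(seq_len=16, head_dim=64, device=q.device, dtype=q.dtype)
    q_rot, k_rot = apply_rope(q, k, cos, sin, training=False)
    print(q_rot.shape, k_rot.shape)  # torch.Size([1, 8, 16, 64]) for both
```

The fused variant fuses the cos/sin lookup and the rotate-and-add into one kernel launch per attention layer, which is why the reference math above can stay as the training-time fallback.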
Directory contents:
  llm/    using bigdl-llm fused rope for llama (#9066)    2023-10-06 09:57:29 -07:00