* update llama
* support llama 4.41
* fix style
* support minicpm
* support qwen2
* support minicpm & update
* support chatglm4
* support chatglm
* remove print
* add DynamicCompressFp8Cache & support qwen
* support llama
* support minicpm phi3
* update chatglm2/4
* small fix & support qwen 4.42
* remove print
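The `DynamicCompressFp8Cache` entry in the log above suggests a KV cache that stores keys/values in a compressed 8-bit form and dequantizes them on read. The sketch below illustrates that general idea only; the class name, the per-row absmax scaling scheme, and the int8 storage are assumptions for illustration, not ipex-llm's actual implementation.

```python
import numpy as np

class CompressedKVCache:
    """Hypothetical sketch of an 8-bit compressed KV cache.

    Stores each appended block as int8 values plus a per-row scale,
    mimicking an fp8-style cache; NOT ipex-llm's DynamicCompressFp8Cache.
    """

    def __init__(self):
        self.quantized = []  # list of (int8 block, float32 scale) pairs

    def append(self, kv: np.ndarray) -> None:
        # Per-row absmax scaling into the signed 8-bit range [-127, 127].
        scale = np.abs(kv).max(axis=-1, keepdims=True) / 127.0
        scale = np.where(scale == 0, 1.0, scale)  # avoid divide-by-zero
        q = np.round(kv / scale).astype(np.int8)
        self.quantized.append((q, scale))

    def materialize(self) -> np.ndarray:
        # Dequantize all stored blocks back to float32 for attention.
        return np.concatenate(
            [q.astype(np.float32) * s for q, s in self.quantized], axis=0
        )

np.random.seed(0)
cache = CompressedKVCache()
kv = np.random.randn(4, 64).astype(np.float32)  # toy key/value block
cache.append(kv)
restored = cache.materialize()
err = np.abs(restored - kv).max()  # small quantization error
```

Storage drops from 4 bytes to roughly 1 byte per element (plus one scale per row), at the cost of a small round-trip error bounded by half a quantization step per row.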
Repository contents:

- dev
- example
- portable-zip
- scripts
- src/ipex_llm
- test
- tpp
- .gitignore
- setup.py
- version.txt