Lilac09
24146d108f
add bigdl-llm-init (#9468)
2023-11-15 14:55:33 +08:00
Lilac09
b2b085550b
Remove bigdl-nano and add ipex into inference-cpu image (#9452)
* remove bigdl-nano and add ipex into inference-cpu image
* remove bigdl-nano in docker
2023-11-14 10:50:52 +08:00
Shaojun Liu
0e5ab5ebfc
update docker tag to 2.5.0-SNAPSHOT (#9443)
2023-11-13 16:53:40 +08:00
Lilac09
74a8ad32dc
Add entry point to llm-serving-xpu (#9339)
* add entry point to llm-serving-xpu
* manually build
2023-11-02 16:31:07 +08:00
Lilac09
2c2bc959ad
add tools into previously built images (#9317)
* modify Dockerfile
* manually build
* add chat.py into inference-xpu
* add benchmark into inference-cpu
* change ADD to COPY in dockerfile
* fix dependency issue
* temporarily remove run-spr in llm-cpu
2023-10-31 16:35:18 +08:00
Guancheng Fu
7f66bc5c14
Fix bigdl-llm-serving-cpu Dockerfile (#9247)
2023-10-23 16:51:30 +08:00
Shaojun Liu
9dc76f19c0
fix hadolint error (#9223)
2023-10-19 16:22:32 +08:00
Guancheng Fu
df8df751c4
Modify readme for bigdl-llm-serving-cpu (#9105)
2023-10-09 09:56:09 +08:00
ZehuaCao
b773d67dd4
Add Kubernetes support for BigDL-LLM-serving CPU (#9071)
2023-10-07 09:37:48 +08:00
Guancheng Fu
cc84ed70b3
Create serving images (#9048)
* Finished & Tested
* Install latest pip from base images
* Add blank line
* Delete unused comment
* fix typos
2023-09-25 15:51:45 +08:00