ipex-llm/docs/mddocs/DockerGuides
Latest commit: acd77d9e87 by Chu,Youcheng (2024-11-27 11:16:36 +08:00)

Remove env variable BIGDL_LLM_XMX_DISABLED in documentation (#12445)

* fix: remove BIGDL_LLM_XMX_DISABLED in mddocs
* fix: remove set SYCL_CACHE_PERSISTENT=1 in example
* fix: remove set BIGDL_LLM_XMX_DISABLED=1 in example
* fix: remove BIGDL_LLM_XMX_DISABLED in workflows
* fix: merge igpu and A-series Graphics
* fix: textual adjustment
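For context, here is a minimal sketch of the kind of environment-variable lines this change removes from the GPU documentation examples, assuming the Linux `export` form (the Windows cmd examples use `set VAR=1`); it is an illustration, not the actual documentation diff, and the exact lines differ across the affected docs and workflows.

```bash
# Illustrative only -- not the actual diff applied by commit acd77d9e87.

# Previously suggested in some GPU examples, removed by this change:
export BIGDL_LLM_XMX_DISABLED=1     # dropped from mddocs, examples, and workflows
export SYCL_CACHE_PERSISTENT=1      # dropped from the one example named in the commit body

# After the change, the affected examples no longer ask readers to disable XMX.
```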
| File | Last commit message | Last commit date |
| --- | --- | --- |
| docker_cpp_xpu_quickstart.md | Decouple the openwebui and the ollama. in inference-cpp-xpu dockerfile (#12382) | 2024-11-12 20:15:23 +08:00 |
| docker_pytorch_inference_gpu.md | Remove env variable BIGDL_LLM_XMX_DISABLED in documentation (#12445) | 2024-11-27 11:16:36 +08:00 |
| docker_run_pytorch_inference_in_vscode.md | Update GPU HF-Transformers example structure (#11526) | 2024-07-08 17:58:06 +08:00 |
| docker_windows_gpu.md | Further mddocs fixes (#11386) | 2024-06-21 13:27:43 +08:00 |
| fastchat_docker_quickstart.md | Update mddocs for DockerGuides (#11380) | 2024-06-21 12:10:35 +08:00 |
| README.md | Add index page for API doc & links update in mddocs (#11393) | 2024-06-21 17:34:34 +08:00 |
| vllm_cpu_docker_quickstart.md | Add missing ragflow quickstart in mddocs and update legecy contents (#11385) | 2024-06-21 12:28:26 +08:00 |
| vllm_docker_quickstart.md | update vllm-docker-quick-start for vllm0.6.2 (#12392) | 2024-11-27 08:47:03 +08:00 |