IPEX-LLM Docker Container User Guides
=====================================

In this section, you will find guides on using IPEX-LLM with Docker, covering the following topics:

* `Overview of IPEX-LLM Containers <./docker_windows_gpu.html>`_

* Inference in Python/C++

  * `GPU Inference in Python with IPEX-LLM <./docker_pytorch_inference_gpu.html>`_
  * `VSCode LLM Development with IPEX-LLM on Intel GPU <./docker_run_pytorch_inference_in_vscode.html>`_
  * `llama.cpp/Ollama/Open-WebUI with IPEX-LLM on Intel GPU <./docker_cpp_xpu_quickstart.html>`_

* Serving

  * `FastChat with IPEX-LLM on Intel GPU <./fastchat_docker_quickstart.html>`_
  * `vLLM with IPEX-LLM on Intel GPU <./vllm_docker_quickstart.html>`_
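
To give a rough idea of what these guides walk through, the sketch below shows how an IPEX-LLM GPU inference container is commonly started on a Linux host. The image tag, container name, and mount paths here are illustrative placeholders rather than the exact values used in the guides; follow the individual quickstarts above for the correct image tags and launch options.

.. code-block:: bash

   # Illustrative sketch only: the image tag and paths are placeholders;
   # check the linked guides for the exact values to use.
   export DOCKER_IMAGE=intelanalytics/ipex-llm-xpu:latest   # assumed image name

   docker run -itd \
       --net=host \
       --device=/dev/dri \
       --name=ipex-llm-container \
       --shm-size=16g \
       -v /path/to/local/models:/llm/models \
       $DOCKER_IMAGE

The key detail common to the GPU guides is ``--device=/dev/dri``, which exposes the Intel GPU to the container; the other options (host networking, shared memory size, model volume mount) vary by guide and workload.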