# IPEX-LLM Docker Container User Guides
In this section, you will find guides related to using IPEX-LLM with Docker, covering how to:
- [Overview of IPEX-LLM Containers](./docker_windows_gpu.md)
- Inference in Python/C++
  - [GPU Inference in Python with IPEX-LLM](./docker_pytorch_inference_gpu.md)
  - [VSCode LLM Development with IPEX-LLM on Intel GPU](./docker_run_pytorch_inference_in_vscode.md)
  - [llama.cpp/Ollama/Open-WebUI with IPEX-LLM on Intel GPU](./docker_cpp_xpu_quickstart.md)
- Serving
  - [FastChat with IPEX-LLM on Intel GPU](./fastchat_docker_quickstart.md)
  - [vLLM with IPEX-LLM on Intel GPU](./vllm_docker_quickstart.md)
  - [vLLM with IPEX-LLM on Intel CPU](./vllm_cpu_docker_quickstart.md)
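Each guide above gives the exact image name, tag, and launch flags for its scenario. As a rough illustration of the shared pattern, here is a minimal sketch of pulling an image and starting a container with the Intel GPU mapped in; the image name `intelanalytics/ipex-llm-xpu:latest` and the model path are assumptions for illustration only, so follow the specific guide for the real values.

```bash
# Pull an IPEX-LLM image (image name/tag is an assumption; see the linked guide for the exact one)
docker pull intelanalytics/ipex-llm-xpu:latest

# Start an interactive container with the Intel GPU exposed via /dev/dri
# and a host model directory mounted into the container
docker run -it --rm \
  --device=/dev/dri \
  -v /path/to/models:/models \
  intelanalytics/ipex-llm-xpu:latest \
  /bin/bash
```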