IPEX-LLM Docker Container User Guides
=====================================

In this section, you will find guides related to using IPEX-LLM with Docker, covering how to:

* `Overview of IPEX-LLM Containers <./docker_windows_gpu.html>`_

* Inference in Python/C++

  * `GPU Inference in Python with IPEX-LLM <./docker_pytorch_inference_gpu.html>`_
  * `VSCode LLM Development with IPEX-LLM on Intel GPU <./docker_pytorch_inference_gpu.html>`_
  * `llama.cpp/Ollama/Open-WebUI with IPEX-LLM on Intel GPU <./docker_cpp_xpu_quickstart.html>`_

* Serving

  * `FastChat with IPEX-LLM on Intel GPU <./fastchat_docker_quickstart.html>`_
  * `vLLM with IPEX-LLM on Intel GPU <./vllm_docker_quickstart.html>`_
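Each of the guides above walks through its own pull and run commands. For a rough feel of the common workflow they share, the sketch below shows the general shape of launching a container with Intel GPU access; the image name, tag, and model path are placeholders, so use the exact image and options given in the quickstart you are following.

.. code-block:: bash

   # Pull an IPEX-LLM image (placeholder name/tag; see the linked quickstart
   # for the exact image that matches your use case).
   docker pull intelanalytics/ipex-llm-serving-xpu:latest

   # Start an interactive container with Intel GPU access.
   # --device=/dev/dri exposes the GPU render nodes to the container,
   # and -v mounts a local model directory (placeholder path) inside it.
   docker run -it --rm \
       --net=host \
       --device=/dev/dri \
       -v /path/to/models:/models \
       intelanalytics/ipex-llm-serving-xpu:latest /bin/bash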