fix doc (#9599)
parent f211f136b6
commit 8b00653039

2 changed files with 2 additions and 2 deletions
@@ -42,4 +42,4 @@ root@arda-arc12:/# sycl-ls
 ```
 
 
-To run inference using `BigDL-LLM` using xpu, you could refer to this [documentation](https://github.com/intel-analytics/BigDL/tree/main/python/llm/example/gpu).
+To run inference using `BigDL-LLM` using xpu, you could refer to this [documentation](https://github.com/intel-analytics/BigDL/tree/main/python/llm/example/GPU).
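
The linked GPU directory is where BigDL-LLM's XPU inference examples live. As a rough, illustrative sketch of what those examples walk through (the model id and prompt below are placeholders, not taken from this commit):

```python
# Minimal sketch of BigDL-LLM inference on an Intel XPU (illustrative only;
# see the linked GPU examples for the authoritative walkthrough).
import torch
import intel_extension_for_pytorch as ipex  # registers the 'xpu' device with PyTorch
from bigdl.llm.transformers import AutoModelForCausalLM
from transformers import AutoTokenizer

model_path = "meta-llama/Llama-2-7b-chat-hf"  # placeholder model id

# load_in_4bit=True applies BigDL-LLM's low-bit optimization at load time
model = AutoModelForCausalLM.from_pretrained(model_path,
                                             load_in_4bit=True,
                                             trust_remote_code=True)
model = model.to("xpu")

tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
inputs = tokenizer("What is SYCL?", return_tensors="pt").to("xpu")

with torch.inference_mode():
    output = model.generate(inputs.input_ids, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The `intel_extension_for_pytorch` import is what makes the `xpu` device available to PyTorch, and `load_in_4bit=True` is BigDL-LLM's low-bit loading path; both details come from the library's GPU examples rather than from this commit.
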
@@ -38,7 +38,7 @@ pip3 install psutil
 pip3 install sentencepiece  # Required for LLaMA tokenizer.
 pip3 install numpy
 pip3 install "transformers>=4.33.1"  # Required for Code Llama.
-pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
+pip install --pre --upgrade "bigdl-llm[xpu]" -f https://developer.intel.com/ipex-whl-stable-xpu
 pip3 install fastapi
 pip3 install "uvicorn[standard]"
 pip3 install "pydantic<2"  # Required for OpenAI server.
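
A likely motivation for the quoting change in the second hunk: in shells such as zsh, an unquoted `bigdl-llm[xpu]` is treated as a glob pattern and the command typically fails with a "no matches found" error, whereas the quotes pass the extras specifier through to pip unchanged.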