add instruction for chat.py

commit 3814abf95a
parent a42c25436e

2 changed files with 12 additions and 1 deletion
@@ -22,7 +22,8 @@ RUN env DEBIAN_FRONTEND=noninteractive apt-get update && \
     pip install --pre --upgrade bigdl-llm[all] && \
     pip install --pre --upgrade bigdl-nano && \
 # Download chat.py script
-    wget -P /root https://raw.githubusercontent.com/intel-analytics/BigDL/main/python/llm/portable-executable/chat.py && \
+    pip install --upgrade colorama && \
+    wget -P /root https://raw.githubusercontent.com/intel-analytics/BigDL/main/python/llm/portable-zip/chat.py && \
     export PYTHONUNBUFFERED=1
 
 ENTRYPOINT ["/bin/bash"]
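The hunk above adds `colorama` alongside the `chat.py` download; presumably the script uses it for colored terminal prompts (an assumption, not confirmed by this diff). A dependency-free sketch of that effect, using the raw ANSI escapes colorama normalizes:

```shell
# Sketch only: emulates a colored chat prompt with raw ANSI escapes.
# (colorama's main job is making such escapes also work on Windows.)
GREEN="$(printf '\033[32m')"
CYAN="$(printf '\033[36m')"
RESET="$(printf '\033[0m')"

# render_prompt SPEAKER TEXT -> prints "SPEAKER: TEXT" with SPEAKER colored
render_prompt() {
  if [ "$1" = "user" ]; then color="$GREEN"; else color="$CYAN"; fi
  printf '%s%s:%s %s\n' "$color" "$1" "$RESET" "$2"
}

render_prompt user "hello"
```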
@@ -32,3 +32,13 @@ sudo docker run -itd \
 After the container is booted, you could get into the container through `docker exec`.
 
 To run inference using `BigDL-LLM` using cpu, you could refer to this [documentation](https://github.com/intel-analytics/BigDL/tree/main/python/llm#cpu-int4).
+
+### Use chat.py
+
+chat.py can be used to initiate a conversation with a specified model. The file is under directory '/root'.
+
+To run chat.py:
+```
+cd /root
+python chat.py --model-path YOUR_MODEL_PATH
+```
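The README hunk above describes the end-to-end flow: exec into the running container, then launch `chat.py`. A minimal sketch of that flow, assuming a hypothetical container name and model path (neither appears in this diff; `DRY_RUN` is added here so the snippet can print the commands instead of requiring a live Docker daemon):

```shell
# Workflow sketch for the new "Use chat.py" README section.
CONTAINER=bigdl-llm-cpu          # hypothetical name given to `docker run --name`
MODEL_PATH=/models/your-model    # placeholder; point at a real local model

EXEC_CMD="sudo docker exec -it $CONTAINER /bin/bash"
CHAT_CMD="cd /root && python chat.py --model-path $MODEL_PATH"

if [ "${DRY_RUN:-1}" = "1" ]; then
  # Dry run: show the two commands the README asks you to run.
  echo "$EXEC_CMD"
  echo "$CHAT_CMD"
else
  # Real run: execute chat.py inside the container in one step.
  sudo docker exec -it "$CONTAINER" /bin/bash -c "$CHAT_CMD"
fi
```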