add instruction for chat.py

This commit is contained in:
Wang 2023-10-09 12:57:28 +08:00
parent a42c25436e
commit 3814abf95a
2 changed files with 12 additions and 1 deletion

@@ -22,7 +22,8 @@ RUN env DEBIAN_FRONTEND=noninteractive apt-get update && \
     pip install --pre --upgrade bigdl-llm[all] && \
     pip install --pre --upgrade bigdl-nano && \
     # Download chat.py script
-    wget -P /root https://raw.githubusercontent.com/intel-analytics/BigDL/main/python/llm/portable-executable/chat.py && \
+    pip install --upgrade colorama && \
+    wget -P /root https://raw.githubusercontent.com/intel-analytics/BigDL/main/python/llm/portable-zip/chat.py && \
     export PYTHONUNBUFFERED=1
 ENTRYPOINT ["/bin/bash"]
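The `colorama` dependency added above hints that chat.py colorizes its terminal output. A rough, stdlib-only sketch of that kind of prompt loop follows; it is hypothetical (raw ANSI escapes stand in for colorama, and `generate_response` is a stub, not the actual BigDL-LLM inference call):

```python
# Hypothetical sketch of a colorized chat REPL; NOT the real chat.py.
RESET = "\033[0m"
GREEN = "\033[32m"

def colorize(text, color=GREEN):
    """Wrap text in an ANSI color escape, as colorama's Fore styles would."""
    return f"{color}{text}{RESET}"

def generate_response(prompt):
    # Placeholder for the actual model inference call.
    return f"[reply to: {prompt}]"

def chat_loop(read_line, write_line):
    """Print colorized model replies until the user types 'exit' or 'quit'."""
    while True:
        prompt = read_line()
        if not prompt or prompt.strip().lower() in ("exit", "quit"):
            break
        write_line(colorize(generate_response(prompt)))
```

Taking input and output as callables keeps the loop testable without a terminal; the real script wires them to `input()` and `print()`.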

@@ -32,3 +32,13 @@ sudo docker run -itd \
 After the container is booted, you could get into the container through `docker exec`.
 To run inference with `BigDL-LLM` on CPU, you could refer to this [documentation](https://github.com/intel-analytics/BigDL/tree/main/python/llm#cpu-int4).
+
+### Use chat.py
+
+`chat.py` can be used to start a conversation with a specified model. The script is located in the `/root` directory.
+
+To run chat.py:
+```
+cd /root
+python chat.py --model-path YOUR_MODEL_PATH
+```
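Putting the steps above together, a hypothetical host-side invocation might look like the following; the container name and model path are placeholders, not values from the repository:

```shell
# Placeholders: adjust to your own container name and model location.
CONTAINER=bigdl-llm-container
MODEL_PATH=/models/your-model

# Compose the command to run inside the already-booted container.
CMD="docker exec -it $CONTAINER bash -c 'cd /root && python chat.py --model-path $MODEL_PATH'"
echo "$CMD"   # run this once the container started with `docker run -itd` is up
```

`docker exec` reuses the running container from the `docker run -itd` step above, so the model files only need to be mounted or copied in once.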