add chat.py dependency in Dockerfile (#9699)
parent c00a9144c4
commit a5c481fedd
2 changed files with 4 additions and 4 deletions
````diff
@@ -73,7 +73,7 @@ You can download models and bind the model directory from host machine to container
 After entering the container through `docker exec`, you can run chat.py by:
 ```bash
-cd /llm
+cd /llm/portable-zip
 python chat.py --model-path YOUR_MODEL_PATH
 ```
 If your model is chatglm-6b and mounted on /llm/models, you can execute:
````
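The run sequence documented above can be sketched as a self-contained shell session. The `/tmp/llm-demo` tree and the one-line placeholder `chat.py` below are stand-ins for the real container layout and script (assumptions for illustration, not part of the image):

```shell
# Mimic the container layout after this change: chat.py now lives in a
# portable-zip subdirectory instead of sitting directly under /llm.
mkdir -p /tmp/llm-demo/llm/portable-zip
printf 'print("chat.py started")\n' > /tmp/llm-demo/llm/portable-zip/chat.py

# Mirrors the documented sequence: cd into the script's directory, then run it.
cd /tmp/llm-demo/llm/portable-zip
python3 chat.py
```

In the real container the equivalent steps are `cd /llm/portable-zip` followed by `python chat.py --model-path YOUR_MODEL_PATH`.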
```diff
@@ -30,12 +30,12 @@ RUN env DEBIAN_FRONTEND=noninteractive apt-get update && \
 pip install --upgrade jupyterlab && \
 git clone https://github.com/intel-analytics/bigdl-llm-tutorial && \
 chmod +x /llm/start-notebook.sh && \
-# Download chat.py script
-pip install --upgrade colorama && \
-wget -P /llm https://raw.githubusercontent.com/intel-analytics/BigDL/main/python/llm/portable-zip/chat.py && \
 # Download all-in-one benchmark
 git clone https://github.com/intel-analytics/BigDL && \
 cp -r ./BigDL/python/llm/dev/benchmark/ ./benchmark && \
+# Copy chat.py script
+pip install --upgrade colorama && \
+cp -r ./BigDL/python/llm/portable-zip/ ./portable-zip && \
 # Install all-in-one dependencies
 apt-get install -y numactl && \
 pip install --upgrade omegaconf && \
```
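The key change in this hunk replaces a `wget` of the single `chat.py` file with a `cp -r` from the BigDL checkout the Dockerfile already clones, which avoids a second network fetch and keeps `chat.py` at the same revision as the benchmark code. The copy step can be exercised outside Docker; the `/tmp/bigdl-stub` tree and stub script below are illustrative only:

```shell
# Stub out the cloned BigDL tree (illustrative layout, matching the
# python/llm/portable-zip path used in the Dockerfile).
mkdir -p /tmp/bigdl-stub/BigDL/python/llm/portable-zip
printf 'print("ok")\n' > /tmp/bigdl-stub/BigDL/python/llm/portable-zip/chat.py
cd /tmp/bigdl-stub

# Same copy step as in the Dockerfile: pull the whole portable-zip
# directory out of the checkout instead of wget-ing chat.py alone.
cp -r ./BigDL/python/llm/portable-zip/ ./portable-zip

# chat.py travels together with the rest of its directory.
ls portable-zip/chat.py
```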