Update README.md (#10833)
parent 5f95054f97
commit ae3b577537
1 changed file with 3 additions and 1 deletion
@@ -136,6 +136,8 @@ Currently, for vLLM-v2, we support the following models:
Install the dependencies for vLLM-v2 as follows:
```bash
# This directory may change depending on where you install oneAPI-basekit
source /opt/intel/oneapi/setvars.sh
# First create a conda environment
conda create -n ipex-vllm python=3.11
conda activate ipex-vllm
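# Optional sanity check (a minimal sketch, not part of the diff above; it assumes
# setvars.sh put the oneAPI tools, including sycl-ls, on PATH and that the
# ipex-vllm environment created above is currently active)
sycl-ls           # should list the available SYCL / Intel GPU devices
python --version  # should report Python 3.11.x from the ipex-vllm environment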
@@ -200,4 +202,4 @@ Then you can access the api server as follows:
"max_tokens": 128,
"temperature": 0
}' &
```
```
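
The hunk above shows only the tail of the request body. For context, a minimal sketch of the full call, assuming the server exposes an OpenAI-compatible `/v1/completions` endpoint on `localhost:8000`; the model name and prompt below are placeholders to adjust for your setup:

```bash
# A minimal sketch, assuming an OpenAI-compatible /v1/completions endpoint on
# localhost:8000; "YOUR_MODEL" and the prompt are placeholders.
curl http://localhost:8000/v1/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "YOUR_MODEL",
        "prompt": "San Francisco is a",
        "max_tokens": 128,
        "temperature": 0
      }' &
```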