update troubleshooting (#11960)
Parent: 882f4a5ff7
Commit: 5f7ff76ea5
1 changed file with 4 additions and 0 deletions
@@ -126,6 +126,7 @@ Arguments info:
### Troubleshooting
#### Output Problem
If you encounter output problems, please try disabling the optimization of transposing the value cache with the following command:
```bash
# to run Llama-2-7b-chat-hf
@@ -144,6 +145,9 @@ python minicpm.py --disable-transpose-value-cache
python minicpm.py --repo-id-or-model-path openbmb/MiniCPM-2B-sft-bf16 --disable-transpose-value-cache
```
#### High CPU Utilization
You can reduce CPU utilization by setting an environment variable: `set IPEX_LLM_CPU_LM_HEAD=0`.
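How the variable is set depends on the shell; a minimal sketch (the `set` form comes from the source and targets Windows `cmd`, while the `export` form is an assumed POSIX-shell equivalent not shown in the source):

```shell
# Windows (cmd), as shown above:
set IPEX_LLM_CPU_LM_HEAD=0

# Linux/macOS (bash/zsh) equivalent (assumption; the source only shows the cmd form):
export IPEX_LLM_CPU_LM_HEAD=0
```

The variable must be set in the same shell session before launching the script so the process inherits it.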
### Sample Output
#### [meta-llama/Llama-2-7b-chat-hf](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf)