fix ollama quickstart (#10846)
This commit is contained in:
parent
fe5a082b84
commit
2ec45c49d3
1 changed file with 1 addition and 0 deletions
@@ -81,6 +81,7 @@ You may launch the Ollama service as below:

 Please set environment variable ``OLLAMA_NUM_GPU`` to ``999`` to make sure all layers of your model are running on Intel GPU, otherwise, some layers may run on CPU.

 ```

 ```eval_rst
 .. note::

+   To allow the service to accept connections from all IP addresses, use `OLLAMA_HOST=0.0.0.0 ./ollama serve` instead of just `./ollama serve`.
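For reference, the launch sequence this change documents can be sketched as a short shell snippet. This is a minimal sketch, assuming the ipex-llm-built `ollama` binary sits in the current working directory:

```shell
# Offload all model layers to the Intel GPU, per the quickstart.
export OLLAMA_NUM_GPU=999
# Listen on all interfaces, not just localhost, so other machines can connect.
export OLLAMA_HOST=0.0.0.0
# Start the service (guarded, since the binary location is an assumption).
if [ -x ./ollama ]; then
    ./ollama serve
fi
```

Setting `OLLAMA_HOST=0.0.0.0` inline on the command itself, as the note in the diff shows, is equivalent to exporting it beforehand.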