update oneapi usage in cpp quickstart (#10836)

* update oneapi usage
* update
* small fix

This commit is contained in:
parent ae3b577537
commit c6e868f7ad

3 changed files with 12 additions and 16 deletions
@@ -29,13 +29,7 @@ Suppose you have downloaded a [Meta-Llama-3-8B-Instruct-Q4_K_M.gguf](https://hug
 #### 1.3 Run Llama3 on Intel GPU using llama.cpp
 
-##### Set Environment Variables(optional)
-
-```eval_rst
-.. note::
-
-   This is a required step on for APT or offline installed oneAPI. Skip this step for PIP-installed oneAPI.
-```
+##### Set Environment Variables
 
 Configure oneAPI variables by running the following command:
 
@@ -49,9 +43,14 @@ Configure oneAPI variables by running the following command:
 
    .. tab:: Windows
 
+      .. note::
+
+         This is a required step for APT or offline installed oneAPI. Skip this step for PIP-installed oneAPI.
+
       .. code-block:: bash
 
          call "C:\Program Files (x86)\Intel\oneAPI\setvars.bat"
 
 ```
 
 ##### Run llama3
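The note added in this hunk distinguishes APT/offline oneAPI installs (which need the `setvars` script sourced) from pip-installed oneAPI (which does not). A minimal sketch of that decision in shell, assuming the standard `/opt/intel/oneapi/setvars.sh` location from the docs; the helper name is illustrative, not part of oneAPI:

```shell
# Source the oneAPI environment only when setvars.sh exists, i.e. for
# APT or offline-installer layouts; pip-installed oneAPI needs no sourcing.
maybe_source_oneapi() {
    setvars="${1:-/opt/intel/oneapi/setvars.sh}"
    if [ -f "$setvars" ]; then
        . "$setvars"        # APT or offline install: load compiler/runtime vars
        echo "sourced"
    else
        echo "skipped"      # pip install: environment is already set up
    fi
}
```

On a machine without the APT/offline layout, `maybe_source_oneapi` simply prints `skipped` and leaves the environment untouched.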
@@ -126,7 +125,6 @@ Launch the Ollama service:
   export ZES_ENABLE_SYSMAN=1
   export SYCL_PI_LEVEL_ZERO_USE_IMMEDIATE_COMMANDLISTS=1
   export OLLAMA_NUM_GPU=999
-  # Below is a required step for APT or offline installed oneAPI. Skip below step for PIP-installed oneAPI.
   source /opt/intel/oneapi/setvars.sh
 
   ./ollama serve
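The Linux launch sequence in this hunk can be sketched end to end, with comments on what each variable is generally understood to do (the `ollama` binary itself is not invoked here):

```shell
# Environment for serving Ollama on an Intel GPU, per the quickstart.
export ZES_ENABLE_SYSMAN=1                              # enable Level Zero sysman so GPU device info is reported
export SYCL_PI_LEVEL_ZERO_USE_IMMEDIATE_COMMANDLISTS=1  # use immediate command lists for lower submit latency
export OLLAMA_NUM_GPU=999                               # offload all model layers to the GPU
# For APT or offline-installed oneAPI only (skip for pip installs):
#   source /opt/intel/oneapi/setvars.sh
# ./ollama serve
```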
@@ -82,13 +82,7 @@ Then you can use following command to initialize `llama.cpp` with IPEX-LLM:
 
 Here we provide a simple example to show how to run a community GGUF model with IPEX-LLM.
 
-#### Set Environment Variables(optional)
-
-```eval_rst
-.. note::
-
-   This is a required step on for APT or offline installed oneAPI. Skip this step for PIP-installed oneAPI.
-```
+#### Set Environment Variables
 
 Configure oneAPI variables by running the following command:
 
@@ -102,9 +96,14 @@ Configure oneAPI variables by running the following command:
 
    .. tab:: Windows
 
+      .. note::
+
+         This is a required step for APT or offline installed oneAPI. Skip this step for PIP-installed oneAPI.
+
       .. code-block:: bash
 
          call "C:\Program Files (x86)\Intel\oneAPI\setvars.bat"
 
 ```
 
 #### Model Download
@@ -55,7 +55,6 @@ You may launch the Ollama service as below:
   export OLLAMA_NUM_GPU=999
   export no_proxy=localhost,127.0.0.1
   export ZES_ENABLE_SYSMAN=1
-  # Below is a required step for APT or offline installed oneAPI. Skip below step for PIP-installed oneAPI.
   source /opt/intel/oneapi/setvars.sh
 
   ./ollama serve