diff --git a/docs/readthedocs/source/doc/LLM/Overview/install_cpu.md b/docs/readthedocs/source/doc/LLM/Overview/install_cpu.md index bb2b952c..53342b77 100644 --- a/docs/readthedocs/source/doc/LLM/Overview/install_cpu.md +++ b/docs/readthedocs/source/doc/LLM/Overview/install_cpu.md @@ -17,7 +17,7 @@ Please refer to [Environment Setup](#environment-setup) for more information. .. important:: - ``ipex-llm`` is tested with Python 3.9, 3.10 and 3.11; Python 3.9 is recommended for best practices. + ``ipex-llm`` is tested with Python 3.9, 3.10 and 3.11; Python 3.11 is recommended for best practices. ``` ## Recommended Requirements @@ -39,10 +39,10 @@ Here list the recommended hardware and OS for smooth IPEX-LLM optimization exper For optimal performance with LLM models using IPEX-LLM optimizations on Intel CPUs, here are some best practices for setting up environment: -First we recommend using [Conda](https://docs.conda.io/en/latest/miniconda.html) to create a python 3.9 enviroment: +First we recommend using [Conda](https://docs.conda.io/en/latest/miniconda.html) to create a Python 3.11 environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/docs/readthedocs/source/doc/LLM/Overview/install_gpu.md b/docs/readthedocs/source/doc/LLM/Overview/install_gpu.md index 22f49e1f..aead0851 100644 --- a/docs/readthedocs/source/doc/LLM/Overview/install_gpu.md +++ b/docs/readthedocs/source/doc/LLM/Overview/install_gpu.md @@ -22,10 +22,10 @@ To apply Intel GPU acceleration, there're several prerequisite steps for tools i * Step 4: Install Intel® oneAPI Base Toolkit 2024.0: - First, Create a Python 3.9 enviroment and activate it. In Anaconda Prompt: + First, create a Python 3.11 environment and activate it. 
In Anaconda Prompt: ```cmd - conda create -n llm python=3.9 libuv + conda create -n llm python=3.11 libuv conda activate llm ``` @@ -33,7 +33,7 @@ To apply Intel GPU acceleration, there're several prerequisite steps for tools i ```eval_rst .. important:: - ``ipex-llm`` is tested with Python 3.9, 3.10 and 3.11. Python 3.9 is recommended for best practices. + ``ipex-llm`` is tested with Python 3.9, 3.10 and 3.11. Python 3.11 is recommended for best practices. ``` Then, use `pip` to install the Intel oneAPI Base Toolkit 2024.0: @@ -111,7 +111,7 @@ pip install --pre --upgrade ipex-llm[xpu] ```eval_rst .. note:: - All the wheel packages mentioned here are for Python 3.9. If you would like to use Python 3.10 or 3.11, you should modify the wheel names for ``torch``, ``torchvision``, and ``intel_extension_for_pytorch`` by replacing ``cp39`` with ``cp310`` or ``cp311``, respectively. + All the wheel packages mentioned here are for Python 3.11. If you would like to use Python 3.9 or 3.10, you should modify the wheel names for ``torch``, ``torchvision``, and ``intel_extension_for_pytorch`` by replacing ``cp311`` with ``cp39`` or ``cp310``, respectively. ``` ### Runtime Configuration @@ -164,7 +164,7 @@ If you met error when importing `intel_extension_for_pytorch`, please ensure tha * Ensure that `libuv` is installed in your conda environment. This can be done during the creation of the environment with the command: ```cmd - conda create -n llm python=3.9 libuv + conda create -n llm python=3.11 libuv ``` If you missed `libuv`, you can add it to your existing environment through ```cmd @@ -399,12 +399,12 @@ IPEX-LLM GPU support on Linux has been verified on: ### Install IPEX-LLM #### Install IPEX-LLM From PyPI -We recommend using [miniconda](https://docs.conda.io/en/latest/miniconda.html) to create a python 3.9 enviroment: +We recommend using [miniconda](https://docs.conda.io/en/latest/miniconda.html) to create a Python 3.11 environment: ```eval_rst .. 
important:: - ``ipex-llm`` is tested with Python 3.9, 3.10 and 3.11. Python 3.9 is recommended for best practices. + ``ipex-llm`` is tested with Python 3.9, 3.10 and 3.11. Python 3.11 is recommended for best practices. ``` ```eval_rst @@ -422,7 +422,7 @@ We recommend using [miniconda](https://docs.conda.io/en/latest/miniconda.html) t .. code-block:: bash - conda create -n llm python=3.9 + conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ @@ -439,7 +439,7 @@ We recommend using [miniconda](https://docs.conda.io/en/latest/miniconda.html) t .. code-block:: bash - conda create -n llm python=3.9 + conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/ @@ -461,7 +461,7 @@ We recommend using [miniconda](https://docs.conda.io/en/latest/miniconda.html) t .. code-block:: bash - conda create -n llm python=3.9 + conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[xpu_2.0] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ @@ -470,7 +470,7 @@ We recommend using [miniconda](https://docs.conda.io/en/latest/miniconda.html) t .. code-block:: bash - conda create -n llm python=3.9 + conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[xpu_2.0] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/ @@ -530,7 +530,7 @@ If you encounter network issues when installing IPEX, you can also install IPEX- ```eval_rst .. note:: - All the wheel packages mentioned here are for Python 3.9. If you would like to use Python 3.10 or 3.11, you should modify the wheel names for ``torch``, ``torchvision``, and ``intel_extension_for_pytorch`` by replacing ``cp39`` with ``cp310`` or ``cp311``, respectively. 
+ All the wheel packages mentioned here are for Python 3.11. If you would like to use Python 3.9 or 3.10, you should modify the wheel names for ``torch``, ``torchvision``, and ``intel_extension_for_pytorch`` by replacing ``cp311`` with ``cp39`` or ``cp310``, respectively. ``` ### Runtime Configuration diff --git a/docs/readthedocs/source/doc/LLM/Quickstart/continue_quickstart.md b/docs/readthedocs/source/doc/LLM/Quickstart/continue_quickstart.md index 1f692f5b..1370a233 100644 --- a/docs/readthedocs/source/doc/LLM/Quickstart/continue_quickstart.md +++ b/docs/readthedocs/source/doc/LLM/Quickstart/continue_quickstart.md @@ -28,7 +28,7 @@ This guide walks you through setting up and running **Continue** within _Visual Visit [Run Text Generation WebUI Quickstart Guide](webui_quickstart.html), and follow the steps 1) [Install IPEX-LLM](https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Quickstart/webui_quickstart.html#install-ipex-llm), 2) [Install WebUI](https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Quickstart/webui_quickstart.html#install-the-webui) and 3) [Start the Server](https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Quickstart/webui_quickstart.html#start-the-webui-server) to install and start the Text Generation WebUI API Service. **Please pay attention to below items during installation:** -- The Text Generation WebUI API service requires Python version 3.10 or higher. But [IPEX-LLM installation instructions](https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Quickstart/webui_quickstart.html#install-ipex-llm) used ``python=3.9`` as default for creating the conda environment. We recommend changing it to ``3.11``, using below command: +- The Text Generation WebUI API service requires Python version 3.10 or higher. The [IPEX-LLM installation instructions](https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Quickstart/webui_quickstart.html#install-ipex-llm) use ``python=3.11`` as default for creating the conda environment. 
This already satisfies the requirement; the environment can be created with: ```bash conda create -n llm python=3.11 libuv ``` diff --git a/docs/readthedocs/source/doc/LLM/Quickstart/install_linux_gpu.md b/docs/readthedocs/source/doc/LLM/Quickstart/install_linux_gpu.md index 157e03f4..efcf95b1 100644 --- a/docs/readthedocs/source/doc/LLM/Quickstart/install_linux_gpu.md +++ b/docs/readthedocs/source/doc/LLM/Quickstart/install_linux_gpu.md @@ -144,7 +144,7 @@ You can use `conda --version` to verify you conda installation. After installation, create a new python environment `llm`: ```cmd -conda create -n llm python=3.9 +conda create -n llm python=3.11 ``` Activate the newly created environment `llm`: ```cmd diff --git a/docs/readthedocs/source/doc/LLM/Quickstart/install_windows_gpu.md b/docs/readthedocs/source/doc/LLM/Quickstart/install_windows_gpu.md index 14439e70..6a0c2e78 100644 --- a/docs/readthedocs/source/doc/LLM/Quickstart/install_windows_gpu.md +++ b/docs/readthedocs/source/doc/LLM/Quickstart/install_windows_gpu.md @@ -57,7 +57,7 @@ Visit [Miniconda installation page](https://docs.anaconda.com/free/miniconda/), Open the **Anaconda Prompt**. Then create a new python environment `llm` and activate it: ```cmd -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm ``` diff --git a/docs/readthedocs/source/doc/LLM/Quickstart/llama_cpp_quickstart.md b/docs/readthedocs/source/doc/LLM/Quickstart/llama_cpp_quickstart.md index 6c0c6784..4736b6dc 100644 --- a/docs/readthedocs/source/doc/LLM/Quickstart/llama_cpp_quickstart.md +++ b/docs/readthedocs/source/doc/LLM/Quickstart/llama_cpp_quickstart.md @@ -26,7 +26,7 @@ Visit the [Install IPEX-LLM on Windows with Intel GPU Guide](https://ipex-llm.re To use `llama.cpp` with IPEX-LLM, first ensure that `ipex-llm[cpp]` is installed. 
```cmd -conda create -n llm-cpp python=3.9 +conda create -n llm-cpp python=3.11 conda activate llm-cpp pip install --pre --upgrade ipex-llm[cpp] ``` diff --git a/python/llm/example/CPU/Applications/autogen/README.md b/python/llm/example/CPU/Applications/autogen/README.md index ceb9fd7a..de045510 100644 --- a/python/llm/example/CPU/Applications/autogen/README.md +++ b/python/llm/example/CPU/Applications/autogen/README.md @@ -11,7 +11,7 @@ mkdir autogen cd autogen # create respective conda environment -conda create -n autogen python=3.9 +conda create -n autogen python=3.11 conda activate autogen # install fastchat-adapted ipex-llm diff --git a/python/llm/example/CPU/Applications/hf-agent/README.md b/python/llm/example/CPU/Applications/hf-agent/README.md index edbae072..455f10ed 100644 --- a/python/llm/example/CPU/Applications/hf-agent/README.md +++ b/python/llm/example/CPU/Applications/hf-agent/README.md @@ -10,7 +10,7 @@ To run this example with IPEX-LLM, we have some recommended requirements for you ### 1. 
Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install ipex-llm[all] # install ipex-llm with 'all' option diff --git a/python/llm/example/CPU/Applications/streaming-llm/README.md b/python/llm/example/CPU/Applications/streaming-llm/README.md index a008b1d2..571f51a3 100644 --- a/python/llm/example/CPU/Applications/streaming-llm/README.md +++ b/python/llm/example/CPU/Applications/streaming-llm/README.md @@ -10,7 +10,7 @@ model = AutoModelForCausalLM.from_pretrained(model_name_or_path, load_in_4bit=Tr ## Prepare Environment We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] diff --git a/python/llm/example/CPU/Deepspeed-AutoTP/README.md b/python/llm/example/CPU/Deepspeed-AutoTP/README.md index ed738567..45256563 100644 --- a/python/llm/example/CPU/Deepspeed-AutoTP/README.md +++ b/python/llm/example/CPU/Deepspeed-AutoTP/README.md @@ -2,7 +2,7 @@ #### 1. 
Install Dependencies -Install necessary packages (here Python 3.9 is our test environment): +Install necessary packages (here Python 3.11 is our test environment): ```bash bash install.sh diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations/AWQ/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations/AWQ/README.md index cecbe84a..b3078cbd 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations/AWQ/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations/AWQ/README.md @@ -34,7 +34,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a AWQ We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install autoawq==0.1.8 --no-deps diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations/GGUF/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations/GGUF/README.md index 33c28850..4741e604 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations/GGUF/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations/GGUF/README.md @@ -25,7 +25,7 @@ We suggest using conda to manage the Python environment. 
For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations/GPTQ/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations/GPTQ/README.md index d91f997e..139fa014 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations/GPTQ/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations/GPTQ/README.md @@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Llam ### 1. Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install ipex-llm[all] # install ipex-llm with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/aquila/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/aquila/README.md index 63468b19..8b3cfbf3 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/aquila/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/aquila/README.md @@ -16,7 +16,7 @@ We suggest using conda to manage the Python environment. 
For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/aquila2/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/aquila2/README.md index 50e7b83d..fd06613c 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/aquila2/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/aquila2/README.md @@ -16,7 +16,7 @@ We suggest using conda to manage the Python environment. For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/baichuan/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/baichuan/README.md index b7ed859e..6b8d421d 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/baichuan/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/baichuan/README.md @@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Baic ### 1. 
Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install ipex-llm[all] # install ipex-llm with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/baichuan2/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/baichuan2/README.md index e5d9a1aa..e9e28200 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/baichuan2/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/baichuan2/README.md @@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Baic ### 1. Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install ipex-llm[all] # install ipex-llm with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/bluelm/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/bluelm/README.md index addec52f..328a86b7 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/bluelm/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/bluelm/README.md @@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Blue ### 1. 
Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/chatglm/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/chatglm/README.md index d56d070e..09172bcb 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/chatglm/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/chatglm/README.md @@ -16,7 +16,7 @@ We suggest using conda to manage the Python environment. For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/chatglm2/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/chatglm2/README.md index 54acc3b6..8a99eebe 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/chatglm2/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/chatglm2/README.md @@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Chat ### 1. Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install ipex-llm[all] # install ipex-llm with 'all' option @@ -80,7 +80,7 @@ In the example [streamchat.py](./streamchat.py), we show a basic use case for a ### 1. 
Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install ipex-llm[all] # install ipex-llm with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/chatglm3/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/chatglm3/README.md index 966f0894..4b5f2174 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/chatglm3/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/chatglm3/README.md @@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Chat ### 1. Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install ipex-llm with 'all' option @@ -81,7 +81,7 @@ In the example [streamchat.py](./streamchat.py), we show a basic use case for a ### 1. Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install ipex-llm with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/codellama/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/codellama/README.md index be3687cf..10035051 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/codellama/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/codellama/README.md @@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Code ### 1. 
Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install ipex-llm[all] # install ipex-llm with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/codeshell/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/codeshell/README.md index 59c935c5..a3399ab8 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/codeshell/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/codeshell/README.md @@ -16,7 +16,7 @@ We suggest using conda to manage the Python environment. For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/deciLM-7b/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/deciLM-7b/README.md index 420627c5..ac818695 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/deciLM-7b/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/deciLM-7b/README.md @@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Deci ### 1. 
Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/deepseek-moe/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/deepseek-moe/README.md index ece21c6f..3fd87ae7 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/deepseek-moe/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/deepseek-moe/README.md @@ -16,7 +16,7 @@ We suggest using conda to manage the Python environment. For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/deepseek/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/deepseek/README.md index 232ca8be..e38600b7 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/deepseek/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/deepseek/README.md @@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Deep ### 1. 
Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install ipex-llm[all] # install ipex-llm with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/distil-whisper/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/distil-whisper/README.md index 882671c6..92d863b1 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/distil-whisper/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/distil-whisper/README.md @@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/dolly_v1/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/dolly_v1/README.md index d59677ba..1e599b4e 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/dolly_v1/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/dolly_v1/README.md @@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Doll ### 1. 
Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install ipex-llm[all] # install ipex-llm with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/dolly_v2/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/dolly_v2/README.md index 219e13ee..b06f61cc 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/dolly_v2/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/dolly_v2/README.md @@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Doll ### 1. Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install ipex-llm[all] # install ipex-llm with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/falcon/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/falcon/README.md index 20a19a76..ca7b5f45 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/falcon/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/falcon/README.md @@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Falc ### 1. Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install ipex-llm[all] # install ipex-llm with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/flan-t5/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/flan-t5/README.md index 2d102180..2daa684f 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/flan-t5/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/flan-t5/README.md @@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. 
For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/fuyu/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/fuyu/README.md index e54a8546..8bf15bd1 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/fuyu/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/fuyu/README.md @@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/gemma/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/gemma/README.md index 548529c8..c8572e04 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/gemma/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/gemma/README.md @@ -14,7 +14,7 @@ We suggest using conda to manage the Python environment. 
For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default @@ -27,7 +27,7 @@ pip install transformers==4.38.1 #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/internlm-xcomposer/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/internlm-xcomposer/README.md index cb898b32..97235dd6 100644 --- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/internlm-xcomposer/README.md +++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/internlm-xcomposer/README.md @@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. 
For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/internlm/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/internlm/README.md
index b37e342c..e994db9e 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/internlm/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/internlm/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Inte
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install ipex-llm[all] # install ipex-llm with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/internlm2/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/internlm2/README.md
index c7d8022a..01f399b9 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/internlm2/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/internlm2/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Inte
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install ipex-llm with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/llama2/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/llama2/README.md
index 191102eb..68415979 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/llama2/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/llama2/README.md
@@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Llam
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install ipex-llm[all] # install ipex-llm with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/mistral/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/mistral/README.md
index 40fbd43d..d27fc1e7 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/mistral/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/mistral/README.md
@@ -13,7 +13,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/mixtral/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/mixtral/README.md
index edd46b62..0f9ce865 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/mixtral/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/mixtral/README.md
@@ -13,7 +13,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 # below command will install PyTorch CPU as default
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/moss/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/moss/README.md
index a0eeeccb..0355daa9 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/moss/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/moss/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a MOSS
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install ipex-llm[all] # install ipex-llm with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/mpt/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/mpt/README.md
index e70aa2ac..5efb7172 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/mpt/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/mpt/README.md
@@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for an MPT
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install ipex-llm[all] # install ipex-llm with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phi-1_5/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phi-1_5/README.md
index 7d9ece5b..e92d306b 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phi-1_5/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phi-1_5/README.md
@@ -16,7 +16,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phi-2/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phi-2/README.md
index caf033f3..10cebf03 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phi-2/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phi-2/README.md
@@ -16,7 +16,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phixtral/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phixtral/README.md
index 918c081a..2696aeb3 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phixtral/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phixtral/README.md
@@ -16,7 +16,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phoenix/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phoenix/README.md
index 601eb997..9b162d2f 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phoenix/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/phoenix/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Phoe
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install ipex-llm[all] # install ipex-llm with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/qwen-vl/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/qwen-vl/README.md
index bd1b66d4..16f5243c 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/qwen-vl/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/qwen-vl/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/qwen/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/qwen/README.md
index ce689b6f..c94d76a3 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/qwen/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/qwen/README.md
@@ -15,7 +15,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Qwen
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install ipex-llm[all] # install ipex-llm with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/qwen1.5/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/qwen1.5/README.md
index 52037de5..e4043709 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/qwen1.5/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/qwen1.5/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Qwen
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install ipex-llm with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/redpajama/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/redpajama/README.md
index 0692286f..0e9e0c38 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/redpajama/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/redpajama/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a RedP
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install ipex-llm[all] # install ipex-llm with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/replit/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/replit/README.md
index 0ce3bbed..285b8040 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/replit/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/replit/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/skywork/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/skywork/README.md
index 53b790f2..75f81fd8 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/skywork/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/skywork/README.md
@@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Skyw
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/solar/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/solar/README.md
index cdfe9b8f..51c1a6b6 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/solar/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/solar/README.md
@@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a SOLA
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/stablelm/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/stablelm/README.md
index 5d99e902..d3e9854a 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/stablelm/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/stablelm/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/starcoder/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/starcoder/README.md
index d81e438b..20cc936f 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/starcoder/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/starcoder/README.md
@@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for an Sta
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install ipex-llm[all] # install ipex-llm with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/vicuna/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/vicuna/README.md
index 89604bc6..9ed7ac15 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/vicuna/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/vicuna/README.md
@@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Vicu
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install ipex-llm[all] # install ipex-llm with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/whisper/readme.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/whisper/readme.md
index d2e957e6..29f72a29 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/whisper/readme.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/whisper/readme.md
@@ -10,7 +10,7 @@ In the example [recognize.py](./recognize.py), we show a basic use case for a Wh
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install ipex-llm[all] # install ipex-llm with 'all' option
@@ -66,7 +66,7 @@ In the example [long-segment-recognize.py](./long-segment-recognize.py), we show
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install ipex-llm[all] # install ipex-llm with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/wizardcoder-python/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/wizardcoder-python/README.md
index 25d6f20e..1801214a 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/wizardcoder-python/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/wizardcoder-python/README.md
@@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Wiza
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/yi/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/yi/README.md
index 2205a4af..829af83f 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/yi/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/yi/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/yuan2/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/yuan2/README.md
index 05f7a32f..96c08614 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/yuan2/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/yuan2/README.md
@@ -13,7 +13,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/ziya/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/ziya/README.md
index 2dfb7adc..9d1fa08c 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Model/ziya/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Model/ziya/README.md
@@ -16,7 +16,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/More-Data-Types/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/More-Data-Types/README.md
index d5dc789c..93284b2e 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/More-Data-Types/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/More-Data-Types/README.md
@@ -5,7 +5,7 @@ In this example, we show a pipeline to apply IPEX-LLM low-bit optimizations (inc
 ## Prepare Environment
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all]
diff --git a/python/llm/example/CPU/HF-Transformers-AutoModels/Save-Load/README.md b/python/llm/example/CPU/HF-Transformers-AutoModels/Save-Load/README.md
index d5dc789c..93284b2e 100644
--- a/python/llm/example/CPU/HF-Transformers-AutoModels/Save-Load/README.md
+++ b/python/llm/example/CPU/HF-Transformers-AutoModels/Save-Load/README.md
@@ -5,7 +5,7 @@ In this example, we show a pipeline to apply IPEX-LLM low-bit optimizations (inc
 ## Prepare Environment
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all]
diff --git a/python/llm/example/CPU/ModelScope-Models/README.md b/python/llm/example/CPU/ModelScope-Models/README.md
index 8be1159d..d416a8ea 100644
--- a/python/llm/example/CPU/ModelScope-Models/README.md
+++ b/python/llm/example/CPU/ModelScope-Models/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Chat
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install ipex-llm with 'all' option
diff --git a/python/llm/example/CPU/Native-Models/README.md b/python/llm/example/CPU/Native-Models/README.md
index 8a181ce6..1a2d80a8 100644
--- a/python/llm/example/CPU/Native-Models/README.md
+++ b/python/llm/example/CPU/Native-Models/README.md
@@ -7,7 +7,7 @@ In this example, we show a pipeline to convert a large language model to IPEX-LL
 ## Prepare Environment
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all]
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/aquila2/README.md b/python/llm/example/CPU/PyTorch-Models/Model/aquila2/README.md
index 67189d60..2c9cd008 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/aquila2/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/aquila2/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/bark/README.md b/python/llm/example/CPU/PyTorch-Models/Model/bark/README.md
index 5800acae..ba4f4282 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/bark/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/bark/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/bert/README.md b/python/llm/example/CPU/PyTorch-Models/Model/bert/README.md
index 2bbbe626..5bfe4e00 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/bert/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/bert/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/bluelm/README.md b/python/llm/example/CPU/PyTorch-Models/Model/bluelm/README.md
index 437d9834..a68f2cb8 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/bluelm/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/bluelm/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/chatglm/README.md b/python/llm/example/CPU/PyTorch-Models/Model/chatglm/README.md
index 35a15620..be040a03 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/chatglm/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/chatglm/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/chatglm3/README.md b/python/llm/example/CPU/PyTorch-Models/Model/chatglm3/README.md
index 195fb0ee..3ee550a4 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/chatglm3/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/chatglm3/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/codellama/README.md b/python/llm/example/CPU/PyTorch-Models/Model/codellama/README.md
index a97c5bb8..9915ffd9 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/codellama/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/codellama/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/codeshell/README.md b/python/llm/example/CPU/PyTorch-Models/Model/codeshell/README.md
index 1870c4de..dff6f8e8 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/codeshell/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/codeshell/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/deciLM-7b/README.md b/python/llm/example/CPU/PyTorch-Models/Model/deciLM-7b/README.md
index 15dca0bd..bf92a5b6 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/deciLM-7b/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/deciLM-7b/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/deepseek-moe/README.md b/python/llm/example/CPU/PyTorch-Models/Model/deepseek-moe/README.md
index feca7acf..fa9b9945 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/deepseek-moe/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/deepseek-moe/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/deepseek/README.md b/python/llm/example/CPU/PyTorch-Models/Model/deepseek/README.md
index bbbb304e..9e86fa27 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/deepseek/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/deepseek/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/distil-whisper/README.md b/python/llm/example/CPU/PyTorch-Models/Model/distil-whisper/README.md
index 56efd231..ff777d77 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/distil-whisper/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/distil-whisper/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/flan-t5/README.md b/python/llm/example/CPU/PyTorch-Models/Model/flan-t5/README.md
index 2d102180..2daa684f 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/flan-t5/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/flan-t5/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/fuyu/README.md b/python/llm/example/CPU/PyTorch-Models/Model/fuyu/README.md
index e54a8546..8bf15bd1 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/fuyu/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/fuyu/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/internlm-xcomposer/README.md b/python/llm/example/CPU/PyTorch-Models/Model/internlm-xcomposer/README.md
index cedaab04..eda342d8 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/internlm-xcomposer/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/internlm-xcomposer/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/internlm2/README.md b/python/llm/example/CPU/PyTorch-Models/Model/internlm2/README.md
index 7e55c5f3..f8c1ff8c 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/internlm2/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/internlm2/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/llama2/README.md b/python/llm/example/CPU/PyTorch-Models/Model/llama2/README.md
index a630cc0c..2227e0dc 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/llama2/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/llama2/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/llava/README.md b/python/llm/example/CPU/PyTorch-Models/Model/llava/README.md
index d9b2b853..db7cec5b 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/llava/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/llava/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/mamba/README.md b/python/llm/example/CPU/PyTorch-Models/Model/mamba/README.md
index d649ffdb..5950791f 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/mamba/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/mamba/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/meta-llama/README.md b/python/llm/example/CPU/PyTorch-Models/Model/meta-llama/README.md
index e3c040fa..4c0ccb20 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/meta-llama/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/meta-llama/README.md
@@ -10,7 +10,7 @@ In the example [example_chat_completion.py](./example_chat_completion.py), we sh
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 # Install meta-llama repository
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/mistral/README.md b/python/llm/example/CPU/PyTorch-Models/Model/mistral/README.md
index 1f958267..8a4adbcd 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/mistral/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/mistral/README.md
@@ -13,7 +13,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/mixtral/README.md b/python/llm/example/CPU/PyTorch-Models/Model/mixtral/README.md
index 7baa9a4c..bc8ee08e 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/mixtral/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/mixtral/README.md
@@ -13,7 +13,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm

 # below command will install PyTorch CPU as default
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/openai-whisper/readme.md b/python/llm/example/CPU/PyTorch-Models/Model/openai-whisper/readme.md
index 85f6594a..a1def711 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/openai-whisper/readme.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/openai-whisper/readme.md
@@ -10,7 +10,7 @@ In the example [recognize.py](./recognize.py), we show a basic use case for a Wh
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm

 pip install ipex-llm[all] # install ipex-llm with 'all' option
diff --git a/python/llm/example/CPU/PyTorch-Models/Model/phi-1_5/README.md b/python/llm/example/CPU/PyTorch-Models/Model/phi-1_5/README.md
index 236cee37..3b4dfac1 100644
--- a/python/llm/example/CPU/PyTorch-Models/Model/phi-1_5/README.md
+++ b/python/llm/example/CPU/PyTorch-Models/Model/phi-1_5/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment.
For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/PyTorch-Models/Model/phi-2/README.md b/python/llm/example/CPU/PyTorch-Models/Model/phi-2/README.md index c9e8daaf..81355b62 100644 --- a/python/llm/example/CPU/PyTorch-Models/Model/phi-2/README.md +++ b/python/llm/example/CPU/PyTorch-Models/Model/phi-2/README.md @@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/PyTorch-Models/Model/phixtral/README.md b/python/llm/example/CPU/PyTorch-Models/Model/phixtral/README.md index c3a19031..9f824fad 100644 --- a/python/llm/example/CPU/PyTorch-Models/Model/phixtral/README.md +++ b/python/llm/example/CPU/PyTorch-Models/Model/phixtral/README.md @@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. 
For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/PyTorch-Models/Model/qwen-vl/README.md b/python/llm/example/CPU/PyTorch-Models/Model/qwen-vl/README.md index 57ccdf71..0e2c21cf 100644 --- a/python/llm/example/CPU/PyTorch-Models/Model/qwen-vl/README.md +++ b/python/llm/example/CPU/PyTorch-Models/Model/qwen-vl/README.md @@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/PyTorch-Models/Model/qwen1.5/README.md b/python/llm/example/CPU/PyTorch-Models/Model/qwen1.5/README.md index a404cf03..095ee001 100644 --- a/python/llm/example/CPU/PyTorch-Models/Model/qwen1.5/README.md +++ b/python/llm/example/CPU/PyTorch-Models/Model/qwen1.5/README.md @@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. 
For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/PyTorch-Models/Model/skywork/README.md b/python/llm/example/CPU/PyTorch-Models/Model/skywork/README.md index 1221f2a3..b1b21407 100644 --- a/python/llm/example/CPU/PyTorch-Models/Model/skywork/README.md +++ b/python/llm/example/CPU/PyTorch-Models/Model/skywork/README.md @@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/PyTorch-Models/Model/solar/README.md b/python/llm/example/CPU/PyTorch-Models/Model/solar/README.md index 0625fb2f..44c2ae4b 100644 --- a/python/llm/example/CPU/PyTorch-Models/Model/solar/README.md +++ b/python/llm/example/CPU/PyTorch-Models/Model/solar/README.md @@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. 
For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/PyTorch-Models/Model/stablelm/README.md b/python/llm/example/CPU/PyTorch-Models/Model/stablelm/README.md index 6332f063..8934e3f8 100644 --- a/python/llm/example/CPU/PyTorch-Models/Model/stablelm/README.md +++ b/python/llm/example/CPU/PyTorch-Models/Model/stablelm/README.md @@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/PyTorch-Models/Model/wizardcoder-python/README.md b/python/llm/example/CPU/PyTorch-Models/Model/wizardcoder-python/README.md index 7cfa8d11..e4f99c47 100644 --- a/python/llm/example/CPU/PyTorch-Models/Model/wizardcoder-python/README.md +++ b/python/llm/example/CPU/PyTorch-Models/Model/wizardcoder-python/README.md @@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. 
For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/PyTorch-Models/Model/yi/README.md b/python/llm/example/CPU/PyTorch-Models/Model/yi/README.md index cb4d06a9..89adf93a 100644 --- a/python/llm/example/CPU/PyTorch-Models/Model/yi/README.md +++ b/python/llm/example/CPU/PyTorch-Models/Model/yi/README.md @@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/PyTorch-Models/Model/yuan2/README.md b/python/llm/example/CPU/PyTorch-Models/Model/yuan2/README.md index c268f7a3..3627e815 100644 --- a/python/llm/example/CPU/PyTorch-Models/Model/yuan2/README.md +++ b/python/llm/example/CPU/PyTorch-Models/Model/yuan2/README.md @@ -13,7 +13,7 @@ We suggest using conda to manage the Python environment. 
For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/PyTorch-Models/Model/ziya/README.md b/python/llm/example/CPU/PyTorch-Models/Model/ziya/README.md index 2a77221a..79ac293d 100644 --- a/python/llm/example/CPU/PyTorch-Models/Model/ziya/README.md +++ b/python/llm/example/CPU/PyTorch-Models/Model/ziya/README.md @@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option diff --git a/python/llm/example/CPU/PyTorch-Models/More-Data-Types/README.md b/python/llm/example/CPU/PyTorch-Models/More-Data-Types/README.md index 461cb983..4bbfb55e 100644 --- a/python/llm/example/CPU/PyTorch-Models/More-Data-Types/README.md +++ b/python/llm/example/CPU/PyTorch-Models/More-Data-Types/README.md @@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case of low-bit ### 1. 
Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install ipex-llm with 'all' option diff --git a/python/llm/example/CPU/PyTorch-Models/Save-Load/README.md b/python/llm/example/CPU/PyTorch-Models/Save-Load/README.md index ae8c0302..f3bbb5cf 100644 --- a/python/llm/example/CPU/PyTorch-Models/Save-Load/README.md +++ b/python/llm/example/CPU/PyTorch-Models/Save-Load/README.md @@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case of saving/ ### 1. Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] # install ipex-llm with 'all' option diff --git a/python/llm/example/CPU/QLoRA-FineTuning/README.md b/python/llm/example/CPU/QLoRA-FineTuning/README.md index 33543c12..88106180 100644 --- a/python/llm/example/CPU/QLoRA-FineTuning/README.md +++ b/python/llm/example/CPU/QLoRA-FineTuning/README.md @@ -16,7 +16,7 @@ This example is ported from [bnb-4bit-training](https://colab.research.google.co ### 1. Install ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] pip install transformers==4.34.0 diff --git a/python/llm/example/CPU/QLoRA-FineTuning/alpaca-qlora/README.md b/python/llm/example/CPU/QLoRA-FineTuning/alpaca-qlora/README.md index edd4d08d..9641d356 100644 --- a/python/llm/example/CPU/QLoRA-FineTuning/alpaca-qlora/README.md +++ b/python/llm/example/CPU/QLoRA-FineTuning/alpaca-qlora/README.md @@ -5,7 +5,7 @@ This example ports [Alpaca-LoRA](https://github.com/tloen/alpaca-lora/tree/main) ### 1. 
Install ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] pip install datasets transformers==4.35.0 diff --git a/python/llm/example/CPU/Speculative-Decoding/baichuan2/README.md b/python/llm/example/CPU/Speculative-Decoding/baichuan2/README.md index 35c0fab6..91f2ca9d 100644 --- a/python/llm/example/CPU/Speculative-Decoding/baichuan2/README.md +++ b/python/llm/example/CPU/Speculative-Decoding/baichuan2/README.md @@ -9,7 +9,7 @@ In the example [speculative.py](./speculative.py), we show a basic use case for ### 1. Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] pip install intel_extension_for_pytorch==2.1.0 diff --git a/python/llm/example/CPU/Speculative-Decoding/chatglm3/README.md b/python/llm/example/CPU/Speculative-Decoding/chatglm3/README.md index 7d4a2e24..333a6263 100644 --- a/python/llm/example/CPU/Speculative-Decoding/chatglm3/README.md +++ b/python/llm/example/CPU/Speculative-Decoding/chatglm3/README.md @@ -7,7 +7,7 @@ In the example [speculative.py](./speculative.py), we show a basic use case for ### 1. Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] ``` diff --git a/python/llm/example/CPU/Speculative-Decoding/llama2/README.md b/python/llm/example/CPU/Speculative-Decoding/llama2/README.md index 4f76831d..34646bcc 100644 --- a/python/llm/example/CPU/Speculative-Decoding/llama2/README.md +++ b/python/llm/example/CPU/Speculative-Decoding/llama2/README.md @@ -9,7 +9,7 @@ In the example [speculative.py](./speculative.py), we show a basic use case for ### 1. 
Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] pip install intel_extension_for_pytorch==2.1.0 diff --git a/python/llm/example/CPU/Speculative-Decoding/mistral/README.md b/python/llm/example/CPU/Speculative-Decoding/mistral/README.md index 0f6c0762..6f824d2b 100644 --- a/python/llm/example/CPU/Speculative-Decoding/mistral/README.md +++ b/python/llm/example/CPU/Speculative-Decoding/mistral/README.md @@ -9,7 +9,7 @@ In the example [speculative.py](./speculative.py), we show a basic use case for ### 1. Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] pip install intel_extension_for_pytorch==2.1.0 diff --git a/python/llm/example/CPU/Speculative-Decoding/qwen/README.md b/python/llm/example/CPU/Speculative-Decoding/qwen/README.md index e00d73f7..ec5866f0 100644 --- a/python/llm/example/CPU/Speculative-Decoding/qwen/README.md +++ b/python/llm/example/CPU/Speculative-Decoding/qwen/README.md @@ -8,7 +8,7 @@ predict the next N tokens using `generate()` API, with IPEX-LLM speculative deco ### 1. 
Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] pip install tiktoken einops transformers_stream_generator # additional package required for Qwen to conduct generation diff --git a/python/llm/example/CPU/Speculative-Decoding/starcoder/README.md b/python/llm/example/CPU/Speculative-Decoding/starcoder/README.md index dcb42d99..eab5fd8a 100644 --- a/python/llm/example/CPU/Speculative-Decoding/starcoder/README.md +++ b/python/llm/example/CPU/Speculative-Decoding/starcoder/README.md @@ -9,7 +9,7 @@ In the example [speculative.py](./speculative.py), we show a basic use case for ### 1. Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] pip install intel_extension_for_pytorch==2.1.0 diff --git a/python/llm/example/CPU/Speculative-Decoding/vicuna/README.md b/python/llm/example/CPU/Speculative-Decoding/vicuna/README.md index faf31eb0..bd85910f 100644 --- a/python/llm/example/CPU/Speculative-Decoding/vicuna/README.md +++ b/python/llm/example/CPU/Speculative-Decoding/vicuna/README.md @@ -9,7 +9,7 @@ In the example [speculative.py](./speculative.py), we show a basic use case for ### 1. Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] pip install intel_extension_for_pytorch==2.1.0 diff --git a/python/llm/example/CPU/Speculative-Decoding/ziya/README.md b/python/llm/example/CPU/Speculative-Decoding/ziya/README.md index 769b5519..837aa357 100644 --- a/python/llm/example/CPU/Speculative-Decoding/ziya/README.md +++ b/python/llm/example/CPU/Speculative-Decoding/ziya/README.md @@ -9,7 +9,7 @@ In the example [speculative.py](./speculative.py), we show a basic use case for ### 1. 
Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install --pre --upgrade ipex-llm[all] pip install intel_extension_for_pytorch==2.1.0 diff --git a/python/llm/example/CPU/vLLM-Serving/README.md b/python/llm/example/CPU/vLLM-Serving/README.md index c4e4c2bc..b7933112 100644 --- a/python/llm/example/CPU/vLLM-Serving/README.md +++ b/python/llm/example/CPU/vLLM-Serving/README.md @@ -14,7 +14,7 @@ To run vLLM continuous batching on Intel CPUs, install the dependencies as follo ```bash # First create an conda environment -conda create -n ipex-vllm python==3.9 +conda create -n ipex-vllm python=3.11 conda activate ipex-vllm # Install dependencies pip3 install numpy diff --git a/python/llm/example/GPU/Applications/autogen/README.md b/python/llm/example/GPU/Applications/autogen/README.md index 2a9f8328..9ae4104c 100644 --- a/python/llm/example/GPU/Applications/autogen/README.md +++ b/python/llm/example/GPU/Applications/autogen/README.md @@ -11,7 +11,7 @@ mkdir autogen cd autogen # create respective conda environment -conda create -n autogen python=3.9 +conda create -n autogen python=3.11 conda activate autogen # install xpu-supported and fastchat-adapted ipex-llm diff --git a/python/llm/example/GPU/Applications/streaming-llm/README.md b/python/llm/example/GPU/Applications/streaming-llm/README.md index ae0e1aa7..4e1fd1ad 100644 --- a/python/llm/example/GPU/Applications/streaming-llm/README.md +++ b/python/llm/example/GPU/Applications/streaming-llm/README.md @@ -10,7 +10,7 @@ model = AutoModelForCausalLM.from_pretrained(model_name_or_path, load_in_4bit=Tr ## Prepare Environment We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm pip install -U transformers==4.34.0 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff 
--git a/python/llm/example/GPU/Deepspeed-AutoTP/README.md b/python/llm/example/GPU/Deepspeed-AutoTP/README.md index 948bf8c5..aa408d4e 100644 --- a/python/llm/example/GPU/Deepspeed-AutoTP/README.md +++ b/python/llm/example/GPU/Deepspeed-AutoTP/README.md @@ -10,7 +10,7 @@ To run this example with IPEX-LLM on Intel GPUs, we have some recommended requir ### 1. Install ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Advanced-Quantizations/AWQ/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Advanced-Quantizations/AWQ/README.md index 59355f71..cf281a8f 100644 --- a/python/llm/example/GPU/HF-Transformers-AutoModels/Advanced-Quantizations/AWQ/README.md +++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Advanced-Quantizations/AWQ/README.md @@ -33,7 +33,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a AWQ We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Advanced-Quantizations/GGUF-IQ2/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Advanced-Quantizations/GGUF-IQ2/README.md index c90522f0..27ace787 100644 --- a/python/llm/example/GPU/HF-Transformers-AutoModels/Advanced-Quantizations/GGUF-IQ2/README.md +++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Advanced-Quantizations/GGUF-IQ2/README.md @@ -23,7 +23,7 @@ In the example [generate.py](./generate.py), 
we show a basic use case for a GGUF We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Advanced-Quantizations/GGUF/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Advanced-Quantizations/GGUF/README.md index c0101fde..a979d5f6 100644 --- a/python/llm/example/GPU/HF-Transformers-AutoModels/Advanced-Quantizations/GGUF/README.md +++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Advanced-Quantizations/GGUF/README.md @@ -23,7 +23,7 @@ We suggest using conda to manage the Python environment. For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Advanced-Quantizations/GPTQ/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Advanced-Quantizations/GPTQ/README.md index d9507532..742ba6ec 100644 --- a/python/llm/example/GPU/HF-Transformers-AutoModels/Advanced-Quantizations/GPTQ/README.md +++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Advanced-Quantizations/GPTQ/README.md @@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Llam ### 1. 
Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/aquila/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/aquila/README.md index 10c44883..f2b57eb4 100644 --- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/aquila/README.md +++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/aquila/README.md @@ -16,7 +16,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Aqui #### 1.1 Installation on Linux We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ @@ -24,7 +24,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/aquila2/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/aquila2/README.md index 689d3821..b68ff6df 100644 --- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/aquila2/README.md +++ 
b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/aquila2/README.md @@ -16,7 +16,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Aqui #### 1.1 Installation on Linux We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ @@ -25,7 +25,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/baichuan/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/baichuan/README.md index dbebb1d4..105e1c0b 100644 --- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/baichuan/README.md +++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/baichuan/README.md @@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Baic #### 1.1 Installation on Linux We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ @@ -20,7 +20,7 @@ pip install transformers_stream_generator # additional package required for Bai #### 1.2 Installation on Windows We 
suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/baichuan2/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/baichuan2/README.md index 502ae4ac..d7de8ab0 100644 --- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/baichuan2/README.md +++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/baichuan2/README.md @@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Baic #### 1.1 Installation on Linux We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ @@ -20,7 +20,7 @@ pip install transformers_stream_generator # additional package required for Bai #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/bluelm/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/bluelm/README.md index a075bbf2..af784432 100644 --- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/bluelm/README.md +++ 
b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/bluelm/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Blue
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -19,7 +19,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/chatglm2/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/chatglm2/README.md
index 9a6af846..9f7fbcb7 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/chatglm2/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/chatglm2/README.md
@@ -12,7 +12,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Chat
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -20,7 +20,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -137,7 +137,7 @@ In the example [streamchat.py](./streamchat.py), we show a basic use case for a
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -145,7 +145,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/chatglm3/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/chatglm3/README.md
index 8087252e..607a7a33 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/chatglm3/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/chatglm3/README.md
@@ -11,7 +11,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Chat
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -20,7 +20,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -138,7 +138,7 @@ In the example [streamchat.py](./streamchat.py), we show a basic use case for a
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -147,7 +147,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/chinese-llama2/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/chinese-llama2/README.md
index 68bf861f..08c49e99 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/chinese-llama2/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/chinese-llama2/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Llam
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -19,7 +19,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/codellama/readme.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/codellama/readme.md
index c19a9c71..f5f31406 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/codellama/readme.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/codellama/readme.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for an Cod
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -20,7 +20,7 @@ pip install transformers==4.34.1 # CodeLlamaTokenizer is supported in higher ver
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/deciLM-7b/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/deciLM-7b/README.md
index e3da7af0..dd69d009 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/deciLM-7b/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/deciLM-7b/README.md
@@ -13,7 +13,7 @@ We suggest using conda to manage environment. For more information about conda i
 
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.0.110+xpu as default
 # you can install specific ipex/torch version for your need
@@ -23,7 +23,7 @@ pip install transformers==4.35.2 # required by DeciLM-7B
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/deepseek/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/deepseek/README.md
index 45ba0849..d747ba53 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/deepseek/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/deepseek/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Deep
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.0.110+xpu as default
 # you can install specific ipex/torch version for your need
@@ -20,7 +20,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/distil-whisper/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/distil-whisper/README.md
index a3bef032..664e67aa 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/distil-whisper/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/distil-whisper/README.md
@@ -13,7 +13,7 @@ We suggest using conda to manage the Python environment. For more information ab
 
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -23,7 +23,7 @@ pip install datasets soundfile librosa # required by audio processing
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/dolly-v1/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/dolly-v1/README.md
index ebcf31b2..027ff4e8 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/dolly-v1/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/dolly-v1/README.md
@@ -12,7 +12,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Doll
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -21,7 +21,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/dolly-v2/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/dolly-v2/README.md
index 7a73f8c2..5ab0cf0e 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/dolly-v2/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/dolly-v2/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Doll
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -19,7 +19,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/falcon/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/falcon/README.md
index 8d415381..c5e96f1c 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/falcon/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/falcon/README.md
@@ -11,7 +11,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Falc
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -21,7 +21,7 @@ pip install einops # additional package required for falcon-7b-instruct to condu
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/flan-t5/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/flan-t5/README.md
index 51d750b3..f73665f6 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/flan-t5/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/flan-t5/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -22,7 +22,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/gemma/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/gemma/README.md
index 99db8511..98b775f1 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/gemma/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/gemma/README.md
@@ -14,7 +14,7 @@ We suggest using conda to manage the Python environment. For more information ab
 
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -27,7 +27,7 @@ pip install transformers==4.38.1
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/gpt-j/readme.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/gpt-j/readme.md
index dcf79586..c8659217 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/gpt-j/readme.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/gpt-j/readme.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a GPT-
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -19,7 +19,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/internlm/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/internlm/README.md
index 0b35a40e..c784dedb 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/internlm/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/internlm/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Inte
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -19,7 +19,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/internlm2/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/internlm2/README.md
index d58d103e..a6e32dd8 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/internlm2/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/internlm2/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Inte
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -19,7 +19,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/llama2/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/llama2/README.md
index dbeb9520..97b6deee 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/llama2/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/llama2/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Llam
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -19,7 +19,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/mistral/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/mistral/README.md
index 4dd1bac0..78413419 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/mistral/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/mistral/README.md
@@ -14,7 +14,7 @@ We suggest using conda to manage the Python environment. For more information ab
 
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -27,7 +27,7 @@ pip install transformers==4.34.0
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/mixtral/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/mixtral/README.md
index d87c8bab..47c9e728 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/mixtral/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/mixtral/README.md
@@ -14,7 +14,7 @@ We suggest using conda to manage the Python environment. For more information ab
 
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -27,7 +27,7 @@ pip install transformers==4.36.0
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/mpt/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/mpt/README.md
index e9bea490..99092cf9 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/mpt/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/mpt/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for an MPT
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -20,7 +20,7 @@ pip install einops # additional package required for mpt-7b-chat and mpt-30b-ch
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/phi-1_5/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/phi-1_5/README.md
index 198e73ba..98868833 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/phi-1_5/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/phi-1_5/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a phi-
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -20,7 +20,7 @@ pip install einops # additional package required for phi-1_5 to conduct generati
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/phi-2/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/phi-2/README.md
index f7030b26..353d6e51 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/phi-2/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/phi-2/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a phi-
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -19,7 +19,7 @@ pip install einops # additional package required for phi-2 to conduct generation
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/phixtral/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/phixtral/README.md
index 7a05488d..e91daf29 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/phixtral/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/phixtral/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Inte
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -19,7 +19,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/qwen-vl/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/qwen-vl/README.md
index fe044d10..f8d67544 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/qwen-vl/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/qwen-vl/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -22,7 +22,7 @@ pip install accelerate tiktoken einops transformers_stream_generator==0.0.4 scip
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/qwen/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/qwen/README.md
index 7b20fcf1..b475d831 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/qwen/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/qwen/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Qwen
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -20,7 +20,7 @@ pip install tiktoken einops transformers_stream_generator # additional package
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/qwen1.5/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/qwen1.5/README.md
index 656e8933..830d4d26 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/qwen1.5/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/qwen1.5/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Qwen
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -20,7 +20,7 @@ pip install transformers==4.37.0 # install transformers which supports Qwen2
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/redpajama/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/redpajama/README.md
index ddb34896..201046af 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/redpajama/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/redpajama/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -21,7 +21,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/replit/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/replit/README.md
index a4626d99..9e6930a5 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/replit/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/replit/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -22,7 +22,7 @@ pip install "transformers<4.35"
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/rwkv4/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/rwkv4/README.md
index b2a1ccf6..5ec3e3f0 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/rwkv4/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/rwkv4/README.md
@@ -12,7 +12,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a RWKV
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -20,7 +20,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/rwkv5/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/rwkv5/README.md
index b0d783fd..c924fc25 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/rwkv5/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/rwkv5/README.md
@@ -12,7 +12,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a RWKV
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -20,7 +20,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/solar/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/solar/README.md
index 34358217..72a3562d 100644
--- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/solar/README.md
+++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/solar/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a SOLA
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ @@ -20,7 +20,7 @@ pip install transformers==4.35.2 # required by SOLAR #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/stablelm/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/stablelm/README.md index b58df91c..ce694c49 100644 --- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/stablelm/README.md +++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/stablelm/README.md @@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. 
For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default @@ -25,7 +25,7 @@ pip install transformers==4.38.0 #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/starcoder/readme.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/starcoder/readme.md index 41ddf26c..d0c6a257 100644 --- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/starcoder/readme.md +++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/starcoder/readme.md @@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for an Sta #### 1.1 Installation on Linux We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ @@ -19,7 +19,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] 
--extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/vicuna/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/vicuna/README.md index f53ecb71..9b719625 100644 --- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/vicuna/README.md +++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/vicuna/README.md @@ -12,7 +12,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Vicu #### 1.1 Installation on Linux We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ @@ -21,7 +21,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/voiceassistant/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/voiceassistant/README.md index 07d0d4af..f34731df 100644 --- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/voiceassistant/README.md +++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/voiceassistant/README.md @@ -12,7 +12,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Whis #### 1.1 Installation on Linux We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda 
create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ @@ -26,7 +26,7 @@ pip install PyAudio inquirer sounddevice #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/whisper/readme.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/whisper/readme.md index dd684114..377b8592 100644 --- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/whisper/readme.md +++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/whisper/readme.md @@ -11,7 +11,7 @@ In the example [recognize.py](./recognize.py), we show a basic use case for a Wh #### 1.1 Installation on Linux We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ @@ -21,7 +21,7 @@ pip install datasets soundfile librosa # required by audio processing #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff 
--git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/yi/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/yi/README.md index 6995e24b..cb020717 100644 --- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/yi/README.md +++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/yi/README.md @@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default @@ -23,7 +23,7 @@ pip install einops # additional package required for Yi-6B to conduct generation #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/yuan2/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/yuan2/README.md index b0a66413..d67ac916 100644 --- a/python/llm/example/GPU/HF-Transformers-AutoModels/Model/yuan2/README.md +++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Model/yuan2/README.md @@ -12,7 +12,7 @@ In the example [generate.py](./generate.py), we show a basic use case for an Yua #### 1.1 Installation on Linux We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 
'all' option @@ -22,7 +22,7 @@ pip install pandas # additional package required for Yuan2 to conduct generation #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/More-Data-Types/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/More-Data-Types/README.md index 2a8a7661..d97d0e40 100644 --- a/python/llm/example/GPU/HF-Transformers-AutoModels/More-Data-Types/README.md +++ b/python/llm/example/GPU/HF-Transformers-AutoModels/More-Data-Types/README.md @@ -5,7 +5,7 @@ In this example, we show a pipeline to apply IPEX-LLM low-bit optimizations (inc ## Prepare Environment We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default diff --git a/python/llm/example/GPU/HF-Transformers-AutoModels/Save-Load/README.md b/python/llm/example/GPU/HF-Transformers-AutoModels/Save-Load/README.md index 53c38b13..f9849ff8 100644 --- a/python/llm/example/GPU/HF-Transformers-AutoModels/Save-Load/README.md +++ b/python/llm/example/GPU/HF-Transformers-AutoModels/Save-Load/README.md @@ -11,7 +11,7 @@ In the example [generate.py](./generate.py), we show a basic use case of saving/ #### 1.1 Installation on Linux We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url 
https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ @@ -20,7 +20,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/LLM-Finetuning/DPO/README.md b/python/llm/example/GPU/LLM-Finetuning/DPO/README.md index eeed9519..076e5642 100644 --- a/python/llm/example/GPU/LLM-Finetuning/DPO/README.md +++ b/python/llm/example/GPU/LLM-Finetuning/DPO/README.md @@ -13,7 +13,7 @@ This example is ported from [Fine_tune_a_Mistral_7b_model_with_DPO](https://gith ### 1. Install ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/LLM-Finetuning/HF-PEFT/README.md b/python/llm/example/GPU/LLM-Finetuning/HF-PEFT/README.md index b847fdce..7da65981 100644 --- a/python/llm/example/GPU/LLM-Finetuning/HF-PEFT/README.md +++ b/python/llm/example/GPU/LLM-Finetuning/HF-PEFT/README.md @@ -10,7 +10,7 @@ To run this example with IPEX-LLM on Intel GPUs, we have some recommended requir ### 1. 
Install ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/LLM-Finetuning/LoRA/README.md b/python/llm/example/GPU/LLM-Finetuning/LoRA/README.md index 4af01ab0..8ef75a28 100644 --- a/python/llm/example/GPU/LLM-Finetuning/LoRA/README.md +++ b/python/llm/example/GPU/LLM-Finetuning/LoRA/README.md @@ -8,7 +8,7 @@ To run this example with IPEX-LLM on Intel GPUs, we have some recommended requir ### 1. Install ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/LLM-Finetuning/QA-LoRA/README.md b/python/llm/example/GPU/LLM-Finetuning/QA-LoRA/README.md index 5ab124f0..006f6630 100644 --- a/python/llm/example/GPU/LLM-Finetuning/QA-LoRA/README.md +++ b/python/llm/example/GPU/LLM-Finetuning/QA-LoRA/README.md @@ -8,7 +8,7 @@ To run this example with IPEX-LLM on Intel GPUs, we have some recommended requir ### 1. 
Install ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/LLM-Finetuning/QLoRA/alpaca-qlora/README.md b/python/llm/example/GPU/LLM-Finetuning/QLoRA/alpaca-qlora/README.md index 9893c763..4cdb3d26 100644 --- a/python/llm/example/GPU/LLM-Finetuning/QLoRA/alpaca-qlora/README.md +++ b/python/llm/example/GPU/LLM-Finetuning/QLoRA/alpaca-qlora/README.md @@ -10,7 +10,7 @@ To run this example with IPEX-LLM on Intel GPUs, we have some recommended requir ### 1. Install ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/LLM-Finetuning/QLoRA/simple-example/README.md b/python/llm/example/GPU/LLM-Finetuning/QLoRA/simple-example/README.md index 15b63674..fe682829 100644 --- a/python/llm/example/GPU/LLM-Finetuning/QLoRA/simple-example/README.md +++ b/python/llm/example/GPU/LLM-Finetuning/QLoRA/simple-example/README.md @@ -13,7 +13,7 @@ This example is referred to [bnb-4bit-training](https://colab.research.google.co ### 1. 
Install ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/LLM-Finetuning/QLoRA/trl-example/README.md b/python/llm/example/GPU/LLM-Finetuning/QLoRA/trl-example/README.md index 0ba053f8..46e8992b 100644 --- a/python/llm/example/GPU/LLM-Finetuning/QLoRA/trl-example/README.md +++ b/python/llm/example/GPU/LLM-Finetuning/QLoRA/trl-example/README.md @@ -13,7 +13,7 @@ This example utilizes a subset of [yahma/alpaca-cleaned](https://huggingface.co/ ### 1. Install ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/LLM-Finetuning/ReLora/README.md b/python/llm/example/GPU/LLM-Finetuning/ReLora/README.md index 3218948b..0e94a63a 100644 --- a/python/llm/example/GPU/LLM-Finetuning/ReLora/README.md +++ b/python/llm/example/GPU/LLM-Finetuning/ReLora/README.md @@ -8,7 +8,7 @@ To run this example with IPEX-LLM on Intel GPUs, we have some recommended requir ### 1. 
Install ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/Long-Context/LLaMA2-32K/README.md b/python/llm/example/GPU/Long-Context/LLaMA2-32K/README.md index 05c0661b..677b4742 100644 --- a/python/llm/example/GPU/Long-Context/LLaMA2-32K/README.md +++ b/python/llm/example/GPU/Long-Context/LLaMA2-32K/README.md @@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Llam #### 1.1 Installation on Linux We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ @@ -19,7 +19,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/ModelScope-Models/README.md b/python/llm/example/GPU/ModelScope-Models/README.md index 331638a3..fe3227c2 100644 --- a/python/llm/example/GPU/ModelScope-Models/README.md +++ b/python/llm/example/GPU/ModelScope-Models/README.md @@ -11,7 +11,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Chat #### 1.1 Installation on Linux We suggest using conda to manage environment: ```bash -conda 
create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ @@ -22,7 +22,7 @@ pip install modelscope==1.11.0 #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/ModelScope-Models/Save-Load/README.md b/python/llm/example/GPU/ModelScope-Models/Save-Load/README.md index 2dfcc238..33b1b900 100644 --- a/python/llm/example/GPU/ModelScope-Models/Save-Load/README.md +++ b/python/llm/example/GPU/ModelScope-Models/Save-Load/README.md @@ -11,7 +11,7 @@ In the example [generate.py](./generate.py), we show a basic use case of saving/ #### 1.1 Installation on Linux We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ @@ -21,7 +21,7 @@ pip install modelscope==1.11.0 #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/Pipeline-Parallel-Inference/README.md 
b/python/llm/example/GPU/Pipeline-Parallel-Inference/README.md index 7162b757..58379184 100644 --- a/python/llm/example/GPU/Pipeline-Parallel-Inference/README.md +++ b/python/llm/example/GPU/Pipeline-Parallel-Inference/README.md @@ -10,7 +10,7 @@ To run this example with IPEX-LLM on Intel GPUs, we have some recommended requir ### 1.1 Install IPEX-LLM ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default # you can install specific ipex/torch version for your need diff --git a/python/llm/example/GPU/PyTorch-Models/Model/aquila2/README.md b/python/llm/example/GPU/PyTorch-Models/Model/aquila2/README.md index a9597f97..32da14ea 100644 --- a/python/llm/example/GPU/PyTorch-Models/Model/aquila2/README.md +++ b/python/llm/example/GPU/PyTorch-Models/Model/aquila2/README.md @@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default @@ -22,7 +22,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/PyTorch-Models/Model/baichuan/README.md b/python/llm/example/GPU/PyTorch-Models/Model/baichuan/README.md index ce470ec9..be7501ec 100644 --- 
a/python/llm/example/GPU/PyTorch-Models/Model/baichuan/README.md +++ b/python/llm/example/GPU/PyTorch-Models/Model/baichuan/README.md @@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default @@ -23,7 +23,7 @@ pip install transformers_stream_generator # additional package required for Bai #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/PyTorch-Models/Model/baichuan2/README.md b/python/llm/example/GPU/PyTorch-Models/Model/baichuan2/README.md index fdf78524..11e5dad8 100644 --- a/python/llm/example/GPU/PyTorch-Models/Model/baichuan2/README.md +++ b/python/llm/example/GPU/PyTorch-Models/Model/baichuan2/README.md @@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. 
For more information ab After installing conda, create a Python environment for IPEX-LLM: ```bash -conda create -n llm python=3.9 # recommend to use Python 3.9 +conda create -n llm python=3.11 # recommend to use Python 3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default @@ -23,7 +23,7 @@ pip install transformers_stream_generator # additional package required for Bai #### 1.2 Installation on Windows We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 libuv +conda create -n llm python=3.11 libuv conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/PyTorch-Models/Model/bark/README.md b/python/llm/example/GPU/PyTorch-Models/Model/bark/README.md index 07d9411a..05f34949 100644 --- a/python/llm/example/GPU/PyTorch-Models/Model/bark/README.md +++ b/python/llm/example/GPU/PyTorch-Models/Model/bark/README.md @@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. 
For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -23,7 +23,7 @@ pip install scipy
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/bluelm/README.md b/python/llm/example/GPU/PyTorch-Models/Model/bluelm/README.md
index 8eac3142..fc6f47fb 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/bluelm/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/bluelm/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -22,7 +22,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/chatglm2/README.md b/python/llm/example/GPU/PyTorch-Models/Model/chatglm2/README.md
index 72c0e775..afda5bb6 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/chatglm2/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/chatglm2/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -22,7 +22,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -136,7 +136,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -146,7 +146,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/chatglm3/README.md b/python/llm/example/GPU/PyTorch-Models/Model/chatglm3/README.md
index df8ed461..278888b9 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/chatglm3/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/chatglm3/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -22,7 +22,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -135,7 +135,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -145,7 +145,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/codellama/README.md b/python/llm/example/GPU/PyTorch-Models/Model/codellama/README.md
index 0c9ac640..01115cef 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/codellama/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/codellama/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -23,7 +23,7 @@ pip install transformers==4.34.1 # CodeLlamaTokenizer is supported in higher ver
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/deciLM-7b/README.md b/python/llm/example/GPU/PyTorch-Models/Model/deciLM-7b/README.md
index 01206c19..644c0205 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/deciLM-7b/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/deciLM-7b/README.md
@@ -13,7 +13,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.0.110+xpu as default
@@ -25,7 +25,7 @@ pip install transformers==4.35.2 # required by DeciLM-7B
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/deepseek/README.md b/python/llm/example/GPU/PyTorch-Models/Model/deepseek/README.md
index 55d5eaab..d3c76f9f 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/deepseek/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/deepseek/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.0.110+xpu as default
@@ -23,7 +23,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/distil-whisper/README.md b/python/llm/example/GPU/PyTorch-Models/Model/distil-whisper/README.md
index 9de7587b..d72abcf3 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/distil-whisper/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/distil-whisper/README.md
@@ -13,7 +13,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -23,7 +23,7 @@ pip install datasets soundfile librosa # required by audio processing
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/dolly-v1/README.md b/python/llm/example/GPU/PyTorch-Models/Model/dolly-v1/README.md
index 6a67390c..4f80a814 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/dolly-v1/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/dolly-v1/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -22,7 +22,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/dolly-v2/README.md b/python/llm/example/GPU/PyTorch-Models/Model/dolly-v2/README.md
index 24871ddb..28dab67b 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/dolly-v2/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/dolly-v2/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -22,7 +22,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/flan-t5/README.md b/python/llm/example/GPU/PyTorch-Models/Model/flan-t5/README.md
index 84714a32..d42a7cb2 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/flan-t5/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/flan-t5/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -22,7 +22,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/internlm2/README.md b/python/llm/example/GPU/PyTorch-Models/Model/internlm2/README.md
index d58d103e..a6e32dd8 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/internlm2/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/internlm2/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Inte
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -19,7 +19,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/llama2/README.md b/python/llm/example/GPU/PyTorch-Models/Model/llama2/README.md
index ab29daa6..b801c7fb 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/llama2/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/llama2/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -22,7 +22,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/llava/README.md b/python/llm/example/GPU/PyTorch-Models/Model/llava/README.md
index aff37cd1..4eefd714 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/llava/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/llava/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -27,7 +27,7 @@ cd LLaVA # change the working directory to the LLaVA folder
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/mamba/README.md b/python/llm/example/GPU/PyTorch-Models/Model/mamba/README.md
index 085e440d..7c30497a 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/mamba/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/mamba/README.md
@@ -11,7 +11,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.0.110+xpu as default
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/mistral/README.md b/python/llm/example/GPU/PyTorch-Models/Model/mistral/README.md
index 8fdaa738..565470e5 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/mistral/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/mistral/README.md
@@ -14,7 +14,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -27,7 +27,7 @@ pip install transformers==4.34.0
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/mixtral/README.md b/python/llm/example/GPU/PyTorch-Models/Model/mixtral/README.md
index d617ed4e..8f4a4dab 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/mixtral/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/mixtral/README.md
@@ -14,7 +14,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -27,7 +27,7 @@ pip install transformers==4.36.0
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/phi-1_5/README.md b/python/llm/example/GPU/PyTorch-Models/Model/phi-1_5/README.md
index 3a45012b..54a72a07 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/phi-1_5/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/phi-1_5/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -22,7 +22,7 @@ pip install einops # additional package required for phi-1_5 to conduct generati
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/phi-2/README.md b/python/llm/example/GPU/PyTorch-Models/Model/phi-2/README.md
index 0f6c8bbf..4a201625 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/phi-2/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/phi-2/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -21,7 +21,7 @@ pip install einops # additional package required for phi-2 to conduct generation
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/phixtral/README.md b/python/llm/example/GPU/PyTorch-Models/Model/phixtral/README.md
index 61743b12..9f1a33be 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/phixtral/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/phixtral/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -22,7 +22,7 @@ pip install einops # additional package required for phixtral to conduct generat
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/qwen-vl/README.md b/python/llm/example/GPU/PyTorch-Models/Model/qwen-vl/README.md
index 80a65a59..473cac40 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/qwen-vl/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/qwen-vl/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -22,7 +22,7 @@ pip install accelerate tiktoken einops transformers_stream_generator==0.0.4 scip
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/qwen1.5/README.md b/python/llm/example/GPU/PyTorch-Models/Model/qwen1.5/README.md
index daed4390..86b0f8c7 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/qwen1.5/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/qwen1.5/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Qwen
 #### 1.1 Installation on Linux
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -20,7 +20,7 @@ pip install transformers==4.37.0 # install transformers which supports Qwen2
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/replit/README.md b/python/llm/example/GPU/PyTorch-Models/Model/replit/README.md
index f9c19c19..8ad73633 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/replit/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/replit/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -23,7 +23,7 @@ pip install "transformers<4.35"
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/solar/README.md b/python/llm/example/GPU/PyTorch-Models/Model/solar/README.md
index 6eb6f052..e0802db7 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/solar/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/solar/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -23,7 +23,7 @@ pip install transformers==4.35.2 # required by SOLAR
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/speech-t5/README.md b/python/llm/example/GPU/PyTorch-Models/Model/speech-t5/README.md
index 239877c6..a0a1020c 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/speech-t5/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/speech-t5/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -23,7 +23,7 @@ pip install "datasets<2.18" soundfile # additional package required for SpeechT5
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/stablelm/README.md b/python/llm/example/GPU/PyTorch-Models/Model/stablelm/README.md
index 656195b1..f322d64f 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/stablelm/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/stablelm/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -25,7 +25,7 @@ pip install transformers==4.38.0
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/starcoder/README.md b/python/llm/example/GPU/PyTorch-Models/Model/starcoder/README.md
index ae0eee66..9580c1a8 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/starcoder/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/starcoder/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -22,7 +22,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/yi/README.md b/python/llm/example/GPU/PyTorch-Models/Model/yi/README.md
index 4562972e..bac21baf 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/yi/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/yi/README.md
@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9 # recommend to use Python 3.9
+conda create -n llm python=3.11 # recommend to use Python 3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
@@ -23,7 +23,7 @@ pip install einops # additional package required for Yi-6B to conduct generation
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Model/yuan2/README.md b/python/llm/example/GPU/PyTorch-Models/Model/yuan2/README.md
index c5364a42..2def531d 100644
--- a/python/llm/example/GPU/PyTorch-Models/Model/yuan2/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Model/yuan2/README.md
@@ -14,7 +14,7 @@ We suggest using conda to manage the Python environment. For more information ab
 After installing conda, create a Python environment for IPEX-LLM:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
@@ -24,7 +24,7 @@ pip install pandas # additional package required for Yuan2 to conduct generation
 #### 1.2 Installation on Windows
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9 libuv
+conda create -n llm python=3.11 libuv
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/More-Data-Types/README.md b/python/llm/example/GPU/PyTorch-Models/More-Data-Types/README.md
index e3b223df..4a739e55 100644
--- a/python/llm/example/GPU/PyTorch-Models/More-Data-Types/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/More-Data-Types/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case of low-bit
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/PyTorch-Models/Save-Load/README.md b/python/llm/example/GPU/PyTorch-Models/Save-Load/README.md
index 93962516..0efc1af2 100644
--- a/python/llm/example/GPU/PyTorch-Models/Save-Load/README.md
+++ b/python/llm/example/GPU/PyTorch-Models/Save-Load/README.md
@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case of saving/
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/Speculative-Decoding/baichuan2/README.md b/python/llm/example/GPU/Speculative-Decoding/baichuan2/README.md
index 8f82d35f..2f9fd573 100644
--- a/python/llm/example/GPU/Speculative-Decoding/baichuan2/README.md
+++ b/python/llm/example/GPU/Speculative-Decoding/baichuan2/README.md
@@ -9,7 +9,7 @@ In the example [speculative.py](./speculative.py), we show a basic use case for
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/Speculative-Decoding/chatglm3/README.md b/python/llm/example/GPU/Speculative-Decoding/chatglm3/README.md
index eec1f6ed..8766bf3d 100644
--- a/python/llm/example/GPU/Speculative-Decoding/chatglm3/README.md
+++ b/python/llm/example/GPU/Speculative-Decoding/chatglm3/README.md
@@ -9,7 +9,7 @@ In the example [speculative.py](./speculative.py), we show a basic use case for
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/Speculative-Decoding/gpt-j/README.md b/python/llm/example/GPU/Speculative-Decoding/gpt-j/README.md
index 9ec03e5e..9f82533a 100644
--- a/python/llm/example/GPU/Speculative-Decoding/gpt-j/README.md
+++ b/python/llm/example/GPU/Speculative-Decoding/gpt-j/README.md
@@ -9,7 +9,7 @@ In the example [speculative.py](./speculative.py), we show a basic use case for
 ### 1. Install
 We suggest using conda to manage environment:
 ```bash
-conda create -n llm python=3.9
+conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
diff --git a/python/llm/example/GPU/Speculative-Decoding/llama2/README.md b/python/llm/example/GPU/Speculative-Decoding/llama2/README.md
index a8648c1d..d25f77c6 100644
--- a/python/llm/example/GPU/Speculative-Decoding/llama2/README.md
+++ b/python/llm/example/GPU/Speculative-Decoding/llama2/README.md
@@ -9,7 +9,7 @@ In the example [speculative.py](./speculative.py), we show a basic use case for
 ### 1.
Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/Speculative-Decoding/mistral/README.md b/python/llm/example/GPU/Speculative-Decoding/mistral/README.md index eebad70a..12fbeb41 100644 --- a/python/llm/example/GPU/Speculative-Decoding/mistral/README.md +++ b/python/llm/example/GPU/Speculative-Decoding/mistral/README.md @@ -9,7 +9,7 @@ In the example [speculative.py](./speculative.py), we show a basic use case for ### 1. Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/Speculative-Decoding/qwen/README.md b/python/llm/example/GPU/Speculative-Decoding/qwen/README.md index 40607d1f..515aaf7b 100644 --- a/python/llm/example/GPU/Speculative-Decoding/qwen/README.md +++ b/python/llm/example/GPU/Speculative-Decoding/qwen/README.md @@ -9,7 +9,7 @@ In the example [speculative.py](./speculative.py), we show a basic use case for ### 1. 
Install We suggest using conda to manage environment: ```bash -conda create -n llm python=3.9 +conda create -n llm python=3.11 conda activate llm # below command will install intel_extension_for_pytorch==2.1.10+xpu as default pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ diff --git a/python/llm/example/GPU/vLLM-Serving/README.md b/python/llm/example/GPU/vLLM-Serving/README.md index 02f72379..92079c89 100644 --- a/python/llm/example/GPU/vLLM-Serving/README.md +++ b/python/llm/example/GPU/vLLM-Serving/README.md @@ -31,7 +31,7 @@ To run vLLM continuous batching on Intel GPUs, install the dependencies as follo ```bash # First create an conda environment -conda create -n ipex-vllm python==3.9 +conda create -n ipex-vllm python=3.11 conda activate ipex-vllm # Install dependencies pip3 install psutil diff --git a/python/llm/scripts/env-check.sh b/python/llm/scripts/env-check.sh index 7169858e..fef33837 100644 --- a/python/llm/scripts/env-check.sh +++ b/python/llm/scripts/env-check.sh @@ -20,7 +20,7 @@ check_python() retval="0" fi else - echo "No Python found! Please use `conda create -n llm python=3.9` to create environment. More details could be found in the README.md" + echo "No Python found! Please use `conda create -n llm python=3.11` to create environment. More details could be found in the README.md" retval="1" fi return "$retval"