IPEX-LLM Quickstart
================================

.. note::

   More Quickstart guides are being added.

This section includes concise guides that show you how to:

* |bigdl_llm_migration_guide|_
* `Install IPEX-LLM on Linux with Intel GPU <./install_linux_gpu.html>`_
* `Install IPEX-LLM on Windows with Intel GPU <./install_windows_gpu.html>`_
* `Install IPEX-LLM in Docker on Windows with Intel GPU <./docker_windows_gpu.html>`_
* `Run Performance Benchmarking with IPEX-LLM <./benchmark_quickstart.html>`_
* `Run Code Copilot (Continue) in VSCode with Intel GPU <./continue_quickstart.html>`_
* `Run Text Generation WebUI on Intel GPU <./webui_quickstart.html>`_
* `Run llama.cpp with IPEX-LLM on Intel GPU <./llama_cpp_quickstart.html>`_

.. |bigdl_llm_migration_guide| replace:: ``bigdl-llm`` Migration Guide
.. _bigdl_llm_migration_guide: bigdl_llm_migration.html