
# bigdl-llm Migration Guide

This guide helps you migrate your `bigdl-llm` application to use `ipex-llm`.

## Upgrade `bigdl-llm` package to `ipex-llm`

> **Note**: This step assumes you have already installed `bigdl-llm`.

You need to uninstall `bigdl-llm` and install `ipex-llm`. With your `bigdl-llm` conda environment activated, execute the following commands according to your device type and location:

### For CPU

```bash
pip uninstall -y bigdl-llm
pip install --pre --upgrade ipex-llm[all]  # for CPU
```

### For GPU

Choose either the US or CN website for `extra-index-url`:

- For **US**:

  ```bash
  pip uninstall -y bigdl-llm
  pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
  ```

- For **CN**:

  ```bash
  pip uninstall -y bigdl-llm
  pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
  ```
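After reinstalling, you can sanity-check that the new package is discoverable before running your code. A minimal sketch of such a check (the `is_importable` helper below is illustrative, not part of ipex-llm; it is demonstrated with a stdlib module so the snippet runs anywhere):

```python
import importlib.util

def is_importable(name: str) -> bool:
    """Return True if a top-level package can be found without importing it."""
    return importlib.util.find_spec(name) is not None

# After migration you would expect is_importable("ipex_llm") to be True
# and is_importable("bigdl") to be False once bigdl-llm is removed.
print(is_importable("json"))  # prints: True (stdlib module, always present)
```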

## Migrate `bigdl-llm` code to `ipex-llm`

There are two options for migrating `bigdl-llm` code to `ipex-llm`.

### 1. Upgrade `bigdl-llm` code to `ipex-llm`

To upgrade `bigdl-llm` code to `ipex-llm`, simply replace all occurrences of `bigdl.llm` with `ipex_llm`:

```python
# from bigdl.llm.transformers import AutoModelForCausalLM  # Original line
from ipex_llm.transformers import AutoModelForCausalLM  # Updated line

model = AutoModelForCausalLM.from_pretrained(model_path,
                                             load_in_4bit=True,
                                             trust_remote_code=True)
```
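For a project with many files, the rename can be scripted rather than done by hand. A sketch using GNU sed on a scratch copy (the directory and file names here are made up for the demo; back up your project and review the changes before committing):

```shell
# Create a scratch file containing an old-style import (demo only).
mkdir -p migration_demo
printf 'from bigdl.llm.transformers import AutoModelForCausalLM\n' > migration_demo/app.py

# Rewrite bigdl.llm -> ipex_llm in place (GNU sed's -i assumed).
sed -i 's/bigdl\.llm/ipex_llm/g' migration_demo/app.py

cat migration_demo/app.py  # prints: from ipex_llm.transformers import AutoModelForCausalLM
```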

### 2. Run `bigdl-llm` code in compatible mode (experimental)

To run in compatible mode, simply add `import ipex_llm` at the beginning of the existing `bigdl-llm` code:

```python
import ipex_llm  # Add this line before any bigdl.llm imports
from bigdl.llm.transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(model_path,
                                             load_in_4bit=True,
                                             trust_remote_code=True)
```
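Compatibility shims of this kind generally work by aliasing the old module name to the new implementation at import time, so legacy `import` statements keep resolving. The sketch below only illustrates that general `sys.modules` aliasing technique with made-up module names; it is not ipex-llm's actual implementation:

```python
import sys
import types

# Build a stand-in "new" package with one attribute (illustrative only).
new_pkg = types.ModuleType("new_pkg")
new_pkg.greet = lambda: "hello from new_pkg"
sys.modules["new_pkg"] = new_pkg

# Alias the legacy name so old import statements keep working.
sys.modules["old_pkg"] = new_pkg

import old_pkg  # resolves to new_pkg via the alias
print(old_pkg.greet())  # prints: hello from new_pkg
```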