Running HuggingFace multimodal model using IPEX-LLM on Intel GPU

This folder contains examples of running multimodal models on IPEX-LLM. Each model has its own dedicated folder, where you can find detailed instructions on how to install and run it:

- distil-whisper
- glm-4v
- internvl2
- MiniCPM-Llama3-V-2_5
- MiniCPM-V
- MiniCPM-V-2
- MiniCPM-V-2_6
- phi-3-vision
- qwen-vl
- qwen2-audio
- StableDiffusion
- voiceassistant
- whisper
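Most of these examples share the same basic loading pattern: the model is loaded through IPEX-LLM's `transformers`-style API with low-bit optimization and then moved to the Intel GPU. The sketch below only illustrates that pattern; the exact model class, low-bit setting, and multimodal pre-processing differ per model, and the model path here is a hypothetical placeholder, so refer to each folder's README for the actual runnable script.

```python
# Minimal sketch of the common loading pattern; not a drop-in replacement
# for any specific example in this folder.
import torch
from transformers import AutoTokenizer
from ipex_llm.transformers import AutoModelForCausalLM  # IPEX-LLM's transformers-style API

model_path = "path/to/multimodal-model"  # hypothetical placeholder

# Load with low-bit (4-bit) optimization applied by IPEX-LLM
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    load_in_4bit=True,
    trust_remote_code=True,
)
model = model.half().to("xpu")  # move the optimized model to the Intel GPU

tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

# Text-only generation shown for brevity; image/audio inputs are prepared
# with each model's own processor as described in its README.
with torch.inference_mode():
    input_ids = tokenizer("Describe the image.", return_tensors="pt").input_ids.to("xpu")
    output = model.generate(input_ids, max_new_tokens=32)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```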