ipex-llm/python/llm/example/GPU/HuggingFace/Multimodal

Example folders in this directory:

- distil-whisper
- glm-4v
- internvl2
- MiniCPM-Llama3-V-2_5
- MiniCPM-V
- MiniCPM-V-2
- MiniCPM-V-2_6
- phi-3-vision
- qwen-vl
- qwen2-audio
- StableDiffusion
- voiceassistant
- whisper
- README.md

# Running HuggingFace Multimodal Models Using IPEX-LLM on Intel GPU

This folder contains examples of running multimodal models on IPEX-LLM. Each model has its own dedicated folder, where you can find detailed instructions on how to install and run it; a sketch of the common loading pattern is shown below.
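
Most of these examples follow a similar pattern: load a HuggingFace checkpoint through `ipex_llm.transformers` with low-bit optimization, move the model to the Intel GPU (`xpu`) device, and run generation. The snippet below is a minimal, generic sketch of that pattern, not code taken from any specific folder; `MODEL_PATH` and the prompt are placeholders, and each multimodal model has its own image/audio preprocessing that is documented in its folder's README.

```python
# Minimal sketch of the common IPEX-LLM GPU loading pattern (illustrative only).
import torch
from transformers import AutoTokenizer
from ipex_llm.transformers import AutoModelForCausalLM

MODEL_PATH = "path/to/model-checkpoint"  # placeholder; see each example's README for the exact checkpoint

# `load_in_4bit=True` applies IPEX-LLM low-bit (INT4) optimization when loading the model.
model = AutoModelForCausalLM.from_pretrained(MODEL_PATH,
                                             load_in_4bit=True,
                                             trust_remote_code=True)
model = model.half().to("xpu")  # run on the Intel GPU

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH, trust_remote_code=True)

with torch.inference_mode():
    # Text-only prompt for illustration; multimodal inputs (images, audio)
    # require model-specific preprocessing covered in each folder.
    input_ids = tokenizer.encode("What is shown in this image?", return_tensors="pt").to("xpu")
    output = model.generate(input_ids, max_new_tokens=32)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```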