ipex-llm/python/llm/test
Yuwen Hu 4faf5af8f1 [LLM] Add perf test for core on Windows (#9397)
* Temporarily stop other perf tests

* Add framework for core performance test with one test model

* Small fix and add platform control

* Comment out lp for now

* Add missing yaml file

* Small fix

* Fix sed contents

* Small fix

* Small path fixes

* Small fix

* Add update to ftp

* Small upload fix

* Add chatglm3-6b

* LLM: add model names

* Keep repo id the same as ftp and temporarily make baichuan2 first priority

* change order

* Remove temporary `if false` and separate PR and nightly results

* Small fix

---------

Co-authored-by: jinbridge <2635480475@qq.com>
2023-11-13 13:58:40 +08:00
Name                             Last commit                                                                        Date
benchmark                        [LLM] Add perf test for core on Windows (#9397)                                    2023-11-13 13:58:40 +08:00
convert                          LLM: Adapt transformers models for optimize model SL (#9022)                       2023-10-09 11:13:44 +08:00
inference                        [WIP] Add UT for Mistral Optimized Model (#9248)                                   2023-10-25 15:14:17 +08:00
inference_gpu                    [LLM] Add model correctness test on ARC for llama and falcon (#9347)               2023-11-10 13:48:57 +08:00
install                          [LLM] Refactor LLM Linux tests (#8349)                                             2023-06-16 15:22:48 +08:00
langchain                        [LLM] Unify Langchain Native and Transformers LLM API (#8752)                      2023-08-25 11:14:21 +08:00
win                              [LLM] Remove old windows nightly test code (#8668)                                 2023-08-03 17:12:23 +09:00
__init__.py                      [LLM] Enable UT workflow logics for LLM (#8243)                                    2023-06-02 17:06:35 +08:00
run-llm-convert-tests.sh         [LLM] Change default runner for LLM Linux tests to the ones with AVX512 (#8448)    2023-07-04 14:53:03 +08:00
run-llm-example-tests-gpu.sh     Add test script and workflow for qlora fine-tuning (#9295)                         2023-11-01 09:39:53 +08:00
run-llm-inference-tests-gpu.sh   [LLM] Add model correctness test on ARC for llama and falcon (#9347)               2023-11-10 13:48:57 +08:00
run-llm-inference-tests.sh       [WIP] Add UT for Mistral Optimized Model (#9248)                                   2023-10-25 15:14:17 +08:00
run-llm-install-tests.sh         [LLM] Refactor LLM Linux tests (#8349)                                             2023-06-16 15:22:48 +08:00
run-llm-langchain-tests.sh       [LLM] langchain bloom, UT's, default parameters (#8357)                            2023-06-25 17:38:00 +08:00
run-llm-windows-tests.sh         LLM: fix langchain windows failure (#8417)                                         2023-06-30 09:59:10 +08:00