Chronos: how to speedup inference on one node (onnx, openvino) (#5556)

* how to speedup inference on one node

* index added

* fixed

* fixed

* split to 2 guides

* add build_onnx and build_openvino

* add note border

* fix

* fix colab conflict

Co-authored-by: binbin <binbin1.deng@intel.com>
binbin Deng 2022-09-07 17:53:38 +08:00 committed by GitHub
parent 46b0fda3ca
commit 103546d8fe
3 changed files with 500 additions and 0 deletions

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


@@ -16,6 +16,14 @@ Forecasting
In this guidance, we demonstrate **how to tune a forecaster on a single node**. During tuning, the forecaster searches a user-defined space for the best hyperparameter combination, which is a common step when users pursue higher accuracy.
* `Speed up inference of forecaster through ONNXRuntime <how_to_speedup_inference_of_forecaster_through_ONNXRuntime.html>`__
In this guidance, **we demonstrate how to speed up the inference of a forecaster through ONNXRuntime**. Chronos supports ONNXRuntime to accelerate inference, which is helpful when users need faster predictions.
* `Speed up inference of forecaster through OpenVINO <how_to_speedup_inference_of_forecaster_through_OpenVINO.html>`__
In this guidance, **we demonstrate how to speed up the inference of a forecaster through OpenVINO**. Chronos supports OpenVINO to accelerate inference, which is helpful when users need faster predictions; a usage sketch for both backends follows the diff below.
.. toctree::
   :maxdepth: 1
@@ -24,3 +32,6 @@ Forecasting
how-to-create-forecaster
how_to_train_forecaster_on_one_node
how_to_tune_forecaster_model
how_to_speedup_inference_of_forecaster_through_ONNXRuntime
how_to_speedup_inference_of_forecaster_through_OpenVINO
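
The two new guides cover accelerating forecaster inference with the `build_onnx` and `build_openvino` helpers added in this commit. The notebook diffs are suppressed above, so below is only a minimal sketch of how the accelerated predict path is typically used, assuming a trained Chronos `TCNForecaster`; the toy data shapes and the `thread_num` value are illustrative assumptions, not taken from the suppressed notebooks.

```python
# A minimal sketch, assuming BigDL-Chronos with onnxruntime and openvino installed;
# data shapes and thread_num below are illustrative assumptions.
import numpy as np
from bigdl.chronos.forecaster.tcn_forecaster import TCNForecaster

# Toy data: 100 samples, lookback of 48 steps, horizon of 5 steps, 1 feature.
x = np.random.randn(100, 48, 1).astype(np.float32)
y = np.random.randn(100, 5, 1).astype(np.float32)

forecaster = TCNForecaster(past_seq_len=48,
                           future_seq_len=5,
                           input_feature_num=1,
                           output_feature_num=1)
forecaster.fit((x, y), epochs=1)

# ONNXRuntime path: build the session with the helper added in this commit,
# then run accelerated inference.
forecaster.build_onnx(thread_num=1)
pred_onnx = forecaster.predict_with_onnx(x)

# OpenVINO path: same pattern with the OpenVINO backend.
forecaster.build_openvino(thread_num=1)
pred_ov = forecaster.predict_with_openvino(x)
```

The accelerated `predict_with_onnx` / `predict_with_openvino` calls should return NumPy predictions like the plain `forecaster.predict`, so they can be swapped in after training without changing the surrounding pipeline.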