[Nano] Openvino model inference notebook example with Nano (#5745)

* add nano notebook example for openvino ir

* add basic example for openvino model inference

* add notebook example for sync inference and async inference

* add notebook to documentation

* update explanation for async api

* try to fix code snippet

* fix code snippet

* simplify async api explanation

* adapt to new theme
Hu, Zhaojie 2022-11-16 10:10:07 +08:00 committed by GitHub
parent aff0f55b7f
commit 607db04ad7
3 changed files with 19 additions and 0 deletions

@@ -0,0 +1,3 @@
{
"path": "../../../../../../../../python/nano/tutorial/notebook/inference/openvino/openvino_inference.ipynb"
}

@@ -0,0 +1,3 @@
{
"path": "../../../../../../../../python/nano/tutorial/notebook/inference/openvino/openvino_inference_async.ipynb"
}

@@ -31,6 +31,19 @@ General

Inference Optimization
-------------------------

OpenVINO
~~~~~~~~~~~~~~~~~~~~~~~~~

* `How to run inference on an OpenVINO model <Inference/OpenVINO/openvino_inference.html>`_
* `How to run asynchronous inference on an OpenVINO model <Inference/OpenVINO/openvino_inference_async.html>`_
.. toctree::
    :maxdepth: 1
    :hidden:

    Inference/OpenVINO/openvino_inference
    Inference/OpenVINO/openvino_inference_async
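The difference the two notebooks demonstrate, synchronous versus asynchronous inference, can be sketched without OpenVINO itself. In the sketch below, ``infer`` is a hypothetical stand-in for a compiled OpenVINO model call, and ``ThreadPoolExecutor`` plays the role of keeping several inference requests in flight at once, analogous to OpenVINO's ``AsyncInferQueue``; the 10 ms latency is an assumed figure for illustration only.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def infer(x):
    # Hypothetical stand-in for a compiled OpenVINO model call;
    # assume each request takes about 10 ms.
    time.sleep(0.01)
    return x * 2

inputs = list(range(8))

# Synchronous inference: each request waits for the previous one.
start = time.perf_counter()
sync_results = [infer(x) for x in inputs]
sync_time = time.perf_counter() - start

# Asynchronous inference: up to 4 requests overlap, so total
# wall time shrinks even though per-request latency is unchanged.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as pool:
    async_results = list(pool.map(infer, inputs))
async_time = time.perf_counter() - start

# Same outputs either way; only throughput differs.
assert async_results == sync_results
```

The payoff is throughput, not latency: each individual request still takes the same time, but overlapping them keeps the device busy, which is the point the async notebook makes.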
PyTorch
~~~~~~~~~~~~~~~~~~~~~~~~~