[Nano] Openvino model inference notebook example with Nano (#5745)
* add nano notebook example for openvino ir
* add basic example for openvino model inference
* add notebook example for sync inference and async inference
* add notebook to documentation
* update explanation for async api
* try to fix code snip
* fix code snip
* simplify async api explanation
* simplify async api explanation
* adapt new theme
parent aff0f55b7f
commit 607db04ad7

3 changed files with 19 additions and 0 deletions
@@ -0,0 +1,3 @@
+{
+    "path": "../../../../../../../../python/nano/tutorial/notebook/inference/openvino/openvino_inference.ipynb"
+}
@@ -0,0 +1,3 @@
+{
+    "path": "../../../../../../../../python/nano/tutorial/notebook/inference/openvino/openvino_inference_async.ipynb"
+}
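The two JSON stubs above follow the {"path": ...} convention of nbsphinx-link, which lets Sphinx render notebooks that live outside the docs tree. A minimal sketch of how such stubs are usually wired into a Sphinx build, assuming nbsphinx and nbsphinx-link are the extensions in play (the extension list below is illustrative, not taken from this repository's conf.py):

    # conf.py (illustrative sketch, not this repository's actual configuration)
    # nbsphinx renders notebooks; nbsphinx_link resolves *.nblink JSON stubs
    # whose "path" field points at an .ipynb elsewhere in the source tree.
    extensions = [
        "nbsphinx",
        "nbsphinx_link",
    ]

With this in place, a stub such as an openvino_inference .nblink file is rendered as if the notebook sat next to it, which is why the toctree in the hunk below can reference the pages by their stems.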
@@ -31,6 +31,19 @@ General
 Inference Optimization
 -------------------------
 
+OpenVINO
+~~~~~~~~~~~~~~~~~~~~~~~~~
+
+* `How to run inference on OpenVINO model <Inference/OpenVINO/openvino_inference.html>`_
+* `How to run asynchronous inference on OpenVINO model <Inference/OpenVINO/openvino_inference_async.html>`_
+
+.. toctree::
+    :maxdepth: 1
+    :hidden:
+
+    Inference/OpenVINO/openvino_inference
+    Inference/OpenVINO/openvino_inference_async
+
 PyTorch
 ~~~~~~~~~~~~~~~~~~~~~~~~~
 
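The two new how-to pages cover synchronous and asynchronous inference on an OpenVINO model. As a rough sketch of the difference, using OpenVINO's own Python runtime API rather than the Nano wrappers shown in the notebooks (the model path, device, request count, and dummy inputs below are placeholders):

    # Sketch of sync vs. async OpenVINO inference with the OpenVINO runtime API.
    # "model.xml", "CPU", the request count and the random inputs are illustrative.
    import numpy as np
    from openvino.runtime import Core, AsyncInferQueue

    core = Core()
    compiled = core.compile_model(core.read_model("model.xml"), "CPU")
    dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)

    # Synchronous: each call blocks until its result is ready.
    sync_result = compiled([dummy])

    # Asynchronous: a pool of infer requests runs in parallel and a callback
    # collects each result as its request completes.
    results = {}

    def on_done(request, idx):
        # Copy, since the request's output tensor is reused by later jobs.
        results[idx] = request.get_output_tensor(0).data.copy()

    queue = AsyncInferQueue(compiled, 4)  # 4 parallel infer requests
    queue.set_callback(on_done)
    for i in range(8):
        queue.start_async({0: dummy}, userdata=i)
    queue.wait_all()

Running several requests concurrently generally improves throughput on multi-core CPUs, which is the motivation for documenting the asynchronous path in its own how-to.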