[Nano] How-to Guide: Inference via OpenVINO on Intel GPUs (#7212)

* add OpenVINO GPU inference guide

* enable online doc for how-to

* fix

* fix layout error

* update w.r.t. comments

* fix

* fix

* fix error
Sirui Tao 2023-01-12 09:31:12 +08:00 committed by GitHub
parent cc6f9b4dd2
commit 3543a58723
3 changed files with 5 additions and 0 deletions

@@ -118,6 +118,7 @@ subtrees:
 - file: doc/Nano/Howto/Training/General/choose_num_processes_training
 - file: doc/Nano/Howto/Inference/OpenVINO/openvino_inference
 - file: doc/Nano/Howto/Inference/OpenVINO/openvino_inference_async
+- file: doc/Nano/Howto/Inference/OpenVINO/accelerate_inference_openvino_gpu
 - file: doc/Nano/Howto/Inference/PyTorch/accelerate_pytorch_inference_onnx
 - file: doc/Nano/Howto/Inference/PyTorch/accelerate_pytorch_inference_openvino
 - file: doc/Nano/Howto/Inference/PyTorch/accelerate_pytorch_inference_jit_ipex

@@ -0,0 +1,3 @@
+{
+    "path": "../../../../../../../../python/nano/tutorial/notebook/inference/openvino/accelerate_inference_openvino_gpu.ipynb"
+}

@@ -58,6 +58,7 @@ OpenVINO
 * `How to run inference on OpenVINO model <Inference/OpenVINO/openvino_inference.html>`_
 * `How to run asynchronous inference on OpenVINO model <Inference/OpenVINO/openvino_inference_async.html>`_
+* `How to accelerate a PyTorch / TensorFlow inference pipeline on Intel GPUs through OpenVINO <Inference/OpenVINO/accelerate_inference_openvino_gpu.html>`_
 
 PyTorch
 ~~~~~~~~~~~~~~~~~~~~~~~~~
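For reference, the how-to linked above covers compiling a PyTorch / TensorFlow inference pipeline for an Intel GPU through OpenVINO with BigDL-Nano. A minimal PyTorch-side sketch of that flow is below; it assumes `bigdl.nano.pytorch.InferenceOptimizer.trace` accepts `accelerator="openvino"` together with a GPU device selector, and `device="GPU"` is an assumed keyword here, so check the linked guide for the exact argument and driver setup.

```python
import torch
from torchvision.models import resnet18
from bigdl.nano.pytorch import InferenceOptimizer

# A float32 PyTorch model to accelerate.
model = resnet18(pretrained=True)
model.eval()

# Trace the model into an OpenVINO model. accelerator="openvino" is Nano's
# OpenVINO acceleration path; device="GPU" (assumed keyword) asks OpenVINO to
# compile the model for an Intel GPU instead of the default CPU device.
ov_model = InferenceOptimizer.trace(model,
                                    accelerator="openvino",
                                    input_sample=torch.rand(1, 3, 224, 224),
                                    device="GPU")

# Run inference with the accelerated model inside Nano's recommended context.
with InferenceOptimizer.get_context(ov_model):
    x = torch.rand(1, 3, 224, 224)
    y = ov_model(x)
```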