From 18ae409b7cb16db700cd99122d8a76f8ef2e8996 Mon Sep 17 00:00:00 2001
From: Henry Ma <58333343+HensonMa@users.noreply.github.com>
Date: Wed, 4 Jan 2023 23:43:48 +0800
Subject: [PATCH] [Nano] add how-to-guide for tensorflow inference by onnxruntime and openvino (#7149)

* Feat(docs): add how-to-guide for tensorflow inference by onnxruntime and openvino

* fix bugs for index.rst

* revise according to PR comments

* revise minor parts according to PR comments

* revise bugs according to PR comments
---
 docs/readthedocs/source/_toc.yml                           | 2 ++
 .../TensorFlow/accelerate_tensorflow_inference_onnx.nblink | 3 +++
 .../accelerate_tensorflow_inference_openvino.nblink        | 3 +++
 docs/readthedocs/source/doc/Nano/Howto/index.rst           | 5 +++++
 4 files changed, 13 insertions(+)
 create mode 100644 docs/readthedocs/source/doc/Nano/Howto/Inference/TensorFlow/accelerate_tensorflow_inference_onnx.nblink
 create mode 100644 docs/readthedocs/source/doc/Nano/Howto/Inference/TensorFlow/accelerate_tensorflow_inference_openvino.nblink

diff --git a/docs/readthedocs/source/_toc.yml b/docs/readthedocs/source/_toc.yml
index d7565bed..2f79ebf4 100644
--- a/docs/readthedocs/source/_toc.yml
+++ b/docs/readthedocs/source/_toc.yml
@@ -110,6 +110,8 @@ subtrees:
           - file: doc/Nano/Howto/Training/PyTorch/use_nano_decorator_pytorch_training
           - file: doc/Nano/Howto/Training/TensorFlow/accelerate_tensorflow_training_multi_instance
           - file: doc/Nano/Howto/Training/TensorFlow/tensorflow_training_embedding_sparseadam
+          - file: doc/Nano/Howto/Inference/TensorFlow/accelerate_tensorflow_inference_onnx
+          - file: doc/Nano/Howto/Inference/TensorFlow/accelerate_tensorflow_inference_openvino
           - file: doc/Nano/Howto/Training/General/choose_num_processes_training
           - file: doc/Nano/Howto/Inference/OpenVINO/openvino_inference
           - file: doc/Nano/Howto/Inference/OpenVINO/openvino_inference_async
diff --git a/docs/readthedocs/source/doc/Nano/Howto/Inference/TensorFlow/accelerate_tensorflow_inference_onnx.nblink b/docs/readthedocs/source/doc/Nano/Howto/Inference/TensorFlow/accelerate_tensorflow_inference_onnx.nblink
new file mode 100644
index 00000000..443efe0d
--- /dev/null
+++ b/docs/readthedocs/source/doc/Nano/Howto/Inference/TensorFlow/accelerate_tensorflow_inference_onnx.nblink
@@ -0,0 +1,3 @@
+{
+    "path": "../../../../../../../../python/nano/tutorial/notebook/inference/tensorflow/accelerate_tensorflow_inference_onnx.ipynb"
+}
\ No newline at end of file
diff --git a/docs/readthedocs/source/doc/Nano/Howto/Inference/TensorFlow/accelerate_tensorflow_inference_openvino.nblink b/docs/readthedocs/source/doc/Nano/Howto/Inference/TensorFlow/accelerate_tensorflow_inference_openvino.nblink
new file mode 100644
index 00000000..275ffe1b
--- /dev/null
+++ b/docs/readthedocs/source/doc/Nano/Howto/Inference/TensorFlow/accelerate_tensorflow_inference_openvino.nblink
@@ -0,0 +1,3 @@
+{
+    "path": "../../../../../../../../python/nano/tutorial/notebook/inference/tensorflow/accelerate_tensorflow_inference_openvino.ipynb"
+}
\ No newline at end of file
diff --git a/docs/readthedocs/source/doc/Nano/Howto/index.rst b/docs/readthedocs/source/doc/Nano/Howto/index.rst
index 5aee1313..2b8f9db3 100644
--- a/docs/readthedocs/source/doc/Nano/Howto/index.rst
+++ b/docs/readthedocs/source/doc/Nano/Howto/index.rst
@@ -66,6 +66,11 @@ PyTorch
 .. |pytorch_inference_context_manager_link| replace:: How to use context manager through ``get_context``
 .. _pytorch_inference_context_manager_link: Inference/PyTorch/pytorch_context_manager.html
 
+TensorFlow
+~~~~~~~~~~~~~~~~~~~~~~~~~
+* `How to accelerate a TensorFlow inference pipeline through ONNXRuntime <Inference/TensorFlow/accelerate_tensorflow_inference_onnx.html>`_
+* `How to accelerate a TensorFlow inference pipeline through OpenVINO <Inference/TensorFlow/accelerate_tensorflow_inference_openvino.html>`_
+
 Install
 -------------------------
 * `How to install BigDL-Nano in Google Colab `_
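
For reviewers who have not opened the two linked notebooks, both follow the same pattern: trace a Keras model with BigDL-Nano's InferenceOptimizer and run inference through the returned accelerated model. Below is a minimal sketch of that pattern; the ResNet50 model, input spec, and random batch are illustrative placeholders rather than the notebooks' actual contents, and the exact trace arguments should be checked against the Nano API reference.

import tensorflow as tf
from bigdl.nano.tf.keras import InferenceOptimizer

# Placeholder model and batch -- any tf.keras model with a known input shape works.
model = tf.keras.applications.ResNet50(weights=None)
x = tf.random.normal((1, 224, 224, 3))

# Trace the Keras model into an ONNXRuntime-accelerated model
# (input_spec describes the expected model input).
ort_model = InferenceOptimizer.trace(
    model,
    accelerator="onnxruntime",
    input_spec=tf.TensorSpec(shape=(None, 224, 224, 3)),
)

# Trace the same model into an OpenVINO-accelerated model.
ov_model = InferenceOptimizer.trace(model, accelerator="openvino")

# The traced models are used like ordinary Keras models for inference.
y_ort = ort_model(x)
y_ov = ov_model(x)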