Nano How-to Guides
=========================

.. note::
    This page is still a work in progress. We are adding more guides.

In Nano How-to Guides, you can expect to find multiple task-oriented, bite-sized, and executable examples. These examples show you various tasks that BigDL-Nano can help you accomplish smoothly.

PyTorch Inference
-------------------------

* `How to accelerate a PyTorch inference pipeline through ONNXRuntime <accelerate_pytorch_inference_onnx.html>`_ (a minimal sketch of this task appears at the end of this page)
* `How to accelerate a PyTorch inference pipeline through OpenVINO <accelerate_pytorch_inference_openvino.html>`_
* `How to quantize your PyTorch model for inference using Intel Neural Compressor <quantize_pytorch_inference_inc.html>`_
* `How to quantize your PyTorch model for inference using OpenVINO Post-training Optimization Tool <quantize_pytorch_inference_pot.html>`_

.. toctree::
    :maxdepth: 1
    :hidden:

    accelerate_pytorch_inference_onnx
    accelerate_pytorch_inference_openvino
    quantize_pytorch_inference_inc
    quantize_pytorch_inference_pot

Install
-------------------------

* `How to install BigDL-Nano in Google Colab <install_in_colab.html>`_

.. toctree::
    :maxdepth: 1
    :hidden:

    install_in_colab
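
As a quick preview of the kind of task the PyTorch Inference guides above walk through, the sketch below accelerates a torchvision model with ONNXRuntime. It assumes a recent BigDL-Nano release that exposes ``InferenceOptimizer`` under ``bigdl.nano.pytorch`` (older releases used ``Trainer.trace`` instead); treat it as an illustration only and follow the linked guides for the authoritative steps.

.. code-block:: python

    # Minimal sketch (assumed API): accelerate a PyTorch model with ONNXRuntime
    # through BigDL-Nano. Requires bigdl-nano[pytorch], onnx and onnxruntime.
    import torch
    from torchvision.models import resnet18
    from bigdl.nano.pytorch import InferenceOptimizer

    model = resnet18(pretrained=True)
    model.eval()

    # Trace the model into an ONNXRuntime-backed variant; input_sample gives
    # the expected input shape for the ONNX export.
    ort_model = InferenceOptimizer.trace(model,
                                         accelerator="onnxruntime",
                                         input_sample=torch.rand(1, 3, 224, 224))

    # The accelerated model is used exactly like the original one.
    with torch.no_grad():
        prediction = ort_model(torch.rand(2, 3, 224, 224))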