[Nano] : Add howto guides for InferenceOptimizer.optimize (#6136)
* howto guide for InferenceOptimizer
* fix format in notebook
* rename notebook & add github workflow
* fix doc issue
* fix notebook
* fix typo
* remove ipykernel
* update notebook
* adapt new theme
* fix typo & remove unnecessary numpy
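For context on what the new howto guide documents: `InferenceOptimizer.optimize` tries the acceleration methods Nano supports and benchmarks each one's latency so the fastest variant can be retrieved. A minimal sketch of the typical call pattern, assuming `bigdl-nano` is installed and that `model` and `train_loader` are a trained `torch.nn.Module` and its `DataLoader` (both placeholders, not defined here):

```python
# Sketch only: assumes bigdl-nano is installed and that `model` and
# `train_loader` are a trained PyTorch model and DataLoader supplied by you.
from bigdl.nano.pytorch import InferenceOptimizer

opt = InferenceOptimizer()
# Try the available acceleration methods and measure each one's latency
# on sample inputs drawn from the training data.
opt.optimize(model=model,
             training_data=train_loader,
             thread_num=1,
             latency_sample_num=100)
opt.summary()  # print a per-method summary of the benchmark results
# Fetch the accelerated variant with the lowest measured latency.
acc_model, option = opt.get_best_model()
```

The linked notebook (`inference_optimizer_optimize.ipynb`) walks through this flow in full; the parameter values above are illustrative.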
This commit is contained in:

parent 5ed2a33072
commit a0fa1694b5

2 changed files with 5 additions and 0 deletions
@@ -0,0 +1,3 @@
+{
+    "path": "../../../../../../../../python/nano/tutorial/notebook/inference/pytorch/inference_optimizer_optimize.ipynb"
+}
@@ -62,6 +62,7 @@ PyTorch
 * `How to accelerate a PyTorch inference pipeline through OpenVINO <Inference/PyTorch/accelerate_pytorch_inference_openvino.html>`_
 * `How to quantize your PyTorch model for inference using Intel Neural Compressor <Inference/PyTorch/quantize_pytorch_inference_inc.html>`_
 * `How to quantize your PyTorch model for inference using OpenVINO Post-training Optimization Tools <Inference/PyTorch/quantize_pytorch_inference_pot.html>`_
+* `How to find accelerated method with minimal latency using InferenceOptimizer <Inference/PyTorch/inference_optimizer_optimize.html>`_

 .. toctree::
     :maxdepth: 1
@@ -71,6 +72,7 @@ PyTorch
     Inference/PyTorch/accelerate_pytorch_inference_openvino
     Inference/PyTorch/quantize_pytorch_inference_inc
     Inference/PyTorch/quantize_pytorch_inference_pot
+    Inference/PyTorch/inference_optimizer_optimize

 Install
 -------------------------