79. Generate a TensorRT Engine with UNet

Here is one way to generate a TensorRT engine file from a UNet model.

Prerequisites
– DeepStream 5.0.1
– TensorRT 7.1.3
– Jetson Xavier NX
– CUDA 10.2

  1. Load Model
    import torch
    # Pull the pretrained Carvana UNet from torch.hub
    unet = torch.hub.load('milesial/Pytorch-UNet', 'unet_carvana', pretrained=True)
    unet.eval()  # switch to inference mode before export
    
  2. Export Model
    path_to_onnx = "/PATH/TO/pytorch_unet_512x512.onnx"
    dummy = torch.randn(1, 3, 512, 512)  # dummy input fixes the 1x3x512x512 input shape
    torch.onnx.export(model=unet, args=dummy, f=path_to_onnx, opset_version=11)
    
  3. Simplify ONNX Model
    # requires onnx-simplifier (pip install onnx-simplifier); older versions invoke it as `python3 -m onnxsim`
    onnxsim /PATH/TO/pytorch_unet_512x512.onnx /PATH/TO/pytorch_unet_512x512_sim.onnx --input-shape 1,3,512,512
    
  4. Copy Model To Jetson
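
    Step 4 can be done with scp; the username `nvidia` and hostname `jetson.local` below are placeholders, so substitute your board's actual login and address:

    ```shell
    # Copy the simplified ONNX model to the Jetson's home directory
    scp /PATH/TO/pytorch_unet_512x512_sim.onnx nvidia@jetson.local:~/
    ```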

  5. Generate Engine

    # --workspace is in MB; --explicitBatch is required because the ONNX model has a fixed batch dimension
    /usr/src/tensorrt/bin/trtexec --onnx=pytorch_unet_512x512_sim.onnx --explicitBatch --saveEngine=pytorch_unet_512x512_sim.engine --workspace=5000
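
    Once the engine is written, the same trtexec binary can load it back and report inference latency, which is a quick way to confirm the build succeeded (an optional check, not part of the original steps):

    ```shell
    # Deserialize the saved engine and run timed inference passes
    /usr/src/tensorrt/bin/trtexec --loadEngine=pytorch_unet_512x512_sim.engine
    ```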