Simply go to the link and select the ONNX model you want to visualize.
When converting your model, for example,
TensorFlow 1 ⇒ TensorRT engine
you may want to check your input/output dimensions in case something unexpected happens when using the TensorRT engine for inference.
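Besides visualizing the model, you can also inspect the dimensions programmatically. Below is a minimal sketch (assuming the exported ONNX file is named `model.onnx`, which is a placeholder for your actual path) that uses the `onnx` Python package to print each input and output tensor's name and shape, so they can be compared against what the TensorRT engine reports.

```python
# Sketch: print ONNX model input/output shapes for a sanity check
# before/after building the TensorRT engine.
import onnx

# Hypothetical path; replace with your exported ONNX file.
model = onnx.load("model.onnx")

def print_tensor_shapes(tensors, label):
    for t in tensors:
        # Each dimension is either a fixed value or a symbolic (dynamic) name.
        dims = [
            d.dim_value if d.HasField("dim_value") else d.dim_param
            for d in t.type.tensor_type.shape.dim
        ]
        print(f"{label}: {t.name} -> {dims}")

print_tensor_shapes(model.graph.input, "input")
print_tensor_shapes(model.graph.output, "output")
```

If a dimension shows up as a symbolic name (e.g. a dynamic batch dimension) rather than a number, that is often the source of mismatches at TensorRT inference time.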