NVIDIA TensorRT Inference Server

Model serving with TRT Inference Server

Kubeflow does not currently have a specific guide for the NVIDIA TensorRT Inference Server. See the NVIDIA documentation for instructions on running the TensorRT Inference Server on Kubernetes.
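As a rough illustration of what such a setup can look like, the sketch below shows a minimal Kubernetes Deployment and Service for the inference server. The image tag, resource names, and model-repository path are assumptions for illustration only; consult the NVIDIA documentation for the image, flags, and storage configuration supported by your release.

```yaml
# Minimal sketch of a TensorRT Inference Server deployment on Kubernetes.
# Names, the image tag, and the model-store path are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: trt-inference-server          # hypothetical name
spec:
  replicas: 1
  selector:
    matchLabels:
      app: trt-inference-server
  template:
    metadata:
      labels:
        app: trt-inference-server
    spec:
      containers:
      - name: trt-inference-server
        image: nvcr.io/nvidia/tensorrtserver:19.05-py3   # example tag, not authoritative
        args: ["trtserver", "--model-store=/models"]      # flag/path assumed; check NVIDIA docs
        ports:
        - containerPort: 8000   # HTTP
        - containerPort: 8001   # gRPC
        - containerPort: 8002   # metrics
        resources:
          limits:
            nvidia.com/gpu: 1
        volumeMounts:
        - name: model-store
          mountPath: /models
      volumes:
      - name: model-store
        emptyDir: {}            # placeholder; use a PV or object-store sync in practice
---
apiVersion: v1
kind: Service
metadata:
  name: trt-inference-server
spec:
  selector:
    app: trt-inference-server
  ports:
  - name: http
    port: 8000
  - name: grpc
    port: 8001
  - name: metrics
    port: 8002
```

In practice the model repository would be backed by a persistent volume or synced from object storage rather than an `emptyDir`, and the Service would typically sit behind an ingress or load balancer.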