NVIDIA NIM is a great way to run AI inference workloads in containers. I deploy primarily to Kubernetes, so I wanted to dig into deploying NIM with the Kubernetes NIM Operator, using GPUs on Google Cloud. I actually started by going to ChatGPT and asking it for a step-by-step guide to doing this on GKE. The results seemed impressive, until I started following the steps. ChatGPT is good at a lot of things, but in this case it gave me complete and utter nonsense. So I thought I...