I’m splitting time between two roles at work now, and one of them has a heavy focus on LLMs. Like many of you, I’ve given ChatGPT a try with questions from time to time, and I’ve also used GitHub Copilot within Visual Studio Code. They’re both great, but I was really hoping to run something locally on my machine at home. Then I stumbled upon a great post on All Things Open titled “Build a local AI co-pilot using IBM Granite Code, Ollama, and Continue” that started me down a path.