From: Simon Willison’s Weblog
Hallucinations in code are the least dangerous form of LLM mistakes
https://simonwillison.net/2025/Mar/2/hallucinations-in-code/
A surprisingly common complaint I see from developers who have tried using LLMs for code is that they encountered a hallucination—usually the LLM inventing a method or even a full …