And of course, in the end, LLMs are themselves an example of this: something new, shiny, and capable, but something that doesn't fit into existing systems or meet real needs. They're a beautifully forged sword presented to a society that didn't need swords, and now being jammed into random pieces of machinery because, after all, "it's an important part, and if we don't use it in our machines, we'll be left behind." Of course, all that happens is that the machinery breaks.