We shouldn’t need any illusions to understand how generative tools might be useful. The obsession with anthropomorphization hinders our ability to see what these systems can and cannot do, leaving us with a muddled picture of their capabilities. An LLM’s ability to predict patterns is impressive and genuinely useful in many contexts, but that doesn’t make it conscious.