Transformers take static vector embeddings, which assign a single fixed vector to every token, and expand them with context, processing every other word in the sentence nearly simultaneously. But who cares, let's listen to a pop song!
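For readers who like to poke at the mechanics, here is a minimal toy sketch of that idea in Python. It is not any real model's implementation: the tiny vocabulary, the 4-dimensional vectors, and the single simplified self-attention pass are all invented for illustration. It only shows how one fixed "static" vector per token can be turned into a context-aware vector by mixing in every other token in the sentence.

```python
# Toy illustration (not a production model): static embeddings give each token
# one fixed vector; a single simplified self-attention pass blends every
# token's vector with every other token's, producing contextual vectors.
import numpy as np

rng = np.random.default_rng(0)

# Static embeddings: one fixed 4-dimensional vector per word, regardless of context.
vocab = ["the", "bank", "river", "loan"]
static_embeddings = {word: rng.normal(size=4) for word in vocab}

def contextualize(tokens):
    """Run one simplified self-attention pass over the static vectors."""
    X = np.stack([static_embeddings[t] for t in tokens])   # (n_tokens, 4)
    scores = X @ X.T / np.sqrt(X.shape[1])                 # every token scored against every other
    weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ X                                     # each row is now a weighted mix of all tokens

# "bank" starts from the same static vector in both sentences...
sentence_a = ["the", "river", "bank"]
sentence_b = ["the", "bank", "loan"]
ctx_a = contextualize(sentence_a)
ctx_b = contextualize(sentence_b)

# ...but its contextual vector differs once its neighbors are folded in.
print("static 'bank':      ", np.round(static_embeddings["bank"], 2))
print("'bank' near 'river':", np.round(ctx_a[sentence_a.index("bank")], 2))
print("'bank' near 'loan': ", np.round(ctx_b[sentence_b.index("bank")], 2))
```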
How to understand tokens and vector embeddings, for word people.
Even in the face of "black box" algorithms, the history of artificial intelligence, and of natural language processing more specifically, has left plenty of clues. While we can't understand the full equation, we can see how the building blocks create common patterns in the way current algorithms process language.
Infrastructure, no matter how solid, how well planned, or how continuously improved, gets messy. Observing humanity in July feels like the ultimate UX research experiment.
July is prime chart-making season. July 2025 is one more chart-making season that will not be powered by artificial intelligence, and here's why.
Are attractive websites more likely to trick experienced content strategists?
The Content Technologist gets a brand-new dress and (finally!) adds commenting on posts.
Don't ever tell anybody anything. If you do, you'll end up missing everybody.