Of all the AI buzzwords out there, the word “model” would seem free of hyperbole compared with “superintelligent” or “the singularity.” Yet this innocuous-seeming word can mean two contradictory things, and AI companies are deliberately muddling the line between them. A recent Harvard/MIT study of simulating planetary orbits illustrates the contrast between what scientists consider a “world model” and what AI enthusiasts think transformers are generating. | Still Water Lab
When he died last Wednesday, artist Mel Bochner left a body of work that has only gained relevance over a half century defined by information, and even more so in the age of AI.
Most of us know by now that generative AI can promote stereotypes based on biased data. Yet even when the training data is saturated with perfectly accurate representations, and few to no inaccurate ones, the results can still be biased. Why is this so? To demonstrate, I prompted an image generator for two of the most … (“AI, Old Masters, and the Geometry of Misinformation”)
What happens to ground truth when finding a factoid or photo no longer means consulting an archive but generating one from scratch? That’s the question that drives “Honey, AI Shrunk the Archive,” an essay I wrote for the forthcoming anthology New Directions in Digital Textual Studies.
Diversity without cacophony: Group discussions on controversial subjects can expose students to more viewpoints, but they can also result in the usual suspects (sometimes the most thoughtful students, but often just the loudmouths) dominating the conversation. So I was intrigued when Greg Nelson and Rotem Landesman, my collaborators on a course that examines in depth the impact of … (“Pooling ideas for an AI ethics policy”)
When a tech reporter for the NY Times outsourced her decisions to ChatGPT for a week, she complained that “AI made me basic.” But it turns out that the math behind generative AI can lead to results that are either blandly average or wildly inaccurate.