This Gradient Updates issue explores DeepSeek-R1’s architecture, training cost, and pricing, showing how it rivals OpenAI’s o1 at 30x lower cost.
This Gradient Updates issue explains Moravec’s paradox and offers a speculative picture, based on the paradox, of how hard various economic tasks will be to automate.
Available evidence suggests that rapid growth in reasoning training can continue for a year or so.
This Gradient Updates issue explores how AGI could disrupt labor markets and challenge historical economic trends, potentially driving wages below subsistence levels.
This Gradient Updates issue explores how much energy ChatGPT uses per query, revealing it uses about 10x less than common estimates suggest.
AI’s biggest impact will come from broad labor automation rather than R&D, driving economic growth through scale, not scientific breakthroughs.
This Gradient Updates issue goes over the major changes that went into DeepSeek’s most recent model.