Reasoning models were as big an improvement as the Transformer, at least on some benchmarks| epochai.substack.com
Progress in pretrained language model performance surpasses what we'd expect from merely increasing computing resources, occurring at a pace equivalent to doubling computational power every 5 to 14 months.| Epoch AI
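To make that headline number concrete, here is a minimal back-of-the-envelope sketch of how a 5-to-14-month doubling time compounds. The 36-month horizon and the helper name `effective_compute_multiplier` are illustrative assumptions, not from the paper:

```python
# Illustrative arithmetic (not from the article): what a 5-to-14-month
# effective-compute doubling time implies over an assumed 36-month horizon.

def effective_compute_multiplier(months: float, doubling_time_months: float) -> float:
    """Factor by which algorithmic progress multiplies effective compute."""
    return 2 ** (months / doubling_time_months)

for doubling_time in (5, 14):  # the reported range of doubling times
    factor = effective_compute_multiplier(36, doubling_time)
    print(f"doubling every {doubling_time:>2} months -> ~{factor:,.0f}x over 3 years")
# doubling every  5 months -> ~147x over 3 years
# doubling every 14 months -> ~6x over 3 years
```

The wide 5-to-14-month range matters: compounded over three years, the two ends of the range differ by more than an order of magnitude in implied effective compute.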
In recent months, the CEOs of leading AI companies have grown increasingly confident about rapid progress. OpenAI's Sam Altman shifted from saying in November "the rate of progress continues" to declaring in January "we are now confident we know how to build AGI"; Anthropic's Dario Amodei stated in January "I'm more confident than I've ever been that we're close to powerful capabilities... in the next 2-3 years"; Google DeepMind's Demis Hassabis changed from "as soon as 10 years" in autumn t...| 80,000 Hours
How inference scaling should change US AI strategy| www.chinatalk.media
We investigate four constraints on scaling AI training: power, chip manufacturing, data, and latency. We predict that training runs of 2e29 FLOP will be feasible by 2030.| Epoch AI
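For a sense of scale, a rough comparison, assuming Epoch AI's public estimate of roughly 2e25 FLOP for GPT-4's training run (the 2e29 figure is the linked piece's projection):

```python
# A rough scale comparison under one labeled assumption: GPT-4's training
# compute is taken as ~2e25 FLOP (Epoch AI's public estimate); 2e29 FLOP
# is the article's projection for feasible training runs by 2030.

gpt4_flop = 2e25            # assumption: Epoch AI's GPT-4 estimate
feasible_2030_flop = 2e29   # the article's projected feasible run
print(f"scale-up over GPT-4: {feasible_2030_flop / gpt4_flop:,.0f}x")
# scale-up over GPT-4: 10,000x
```

Under that assumption, the projection amounts to training runs roughly four orders of magnitude beyond GPT-4.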