“Any fool can know. The point is to understand.” ― Albert Einstein| www.strangeloopcanon.com
Our impressions of Devin after giving it 20+ tasks.| Answer.AI
Data movement bottlenecks limit LLM scaling beyond 2e28 FLOP, with a “latency wall” at 2e31 FLOP. We may hit these limits in ~3 years, though aggressive batch size scaling could overcome them.| Epoch AI
We investigate four constraints to scaling AI training: power, chip manufacturing, data, and latency. We predict 2e29 FLOP runs will be feasible by 2030.| Epoch AI