We investigate four constraints on scaling AI training: power, chip manufacturing, data, and latency. We predict that training runs of 2e29 FLOP will be feasible by 2030.| Epoch AI
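As a rough check on the 2e29 FLOP figure, here is a minimal extrapolation sketch. The 2024 base compute and the 4x/year growth rate are illustrative assumptions, not Epoch AI's exact inputs; the sketch only shows that constant exponential growth from a plausible 2024 baseline lands near 2e29 FLOP by 2030.

```python
# Minimal sketch of a frontier-compute extrapolation.
# BASE_FLOP and GROWTH_PER_YEAR are assumed values for illustration,
# not Epoch AI's published inputs.

BASE_YEAR = 2024
BASE_FLOP = 5e25          # assumed size of a frontier training run in 2024
GROWTH_PER_YEAR = 4.0     # assumed ~4x/year growth in training compute

def projected_flop(year: int) -> float:
    """Extrapolate frontier training compute under constant exponential growth."""
    return BASE_FLOP * GROWTH_PER_YEAR ** (year - BASE_YEAR)

for year in range(BASE_YEAR, 2031):
    print(f"{year}: {projected_flop(year):.1e} FLOP")
# Under these assumptions, 2030 comes out at roughly 2e29 FLOP,
# consistent with the headline estimate above.
```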
OHGOOD: A coordination body for compute governance| adamjones.me
We estimate the stock of human-generated public text at around 300 trillion tokens. If trends continue, language models will fully utilize this stock between 2026 and 2032, or even earlier if models are intensely overtrained.| Epoch AI
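The exhaustion window above can be reproduced with a one-line calculation. Only the ~300 trillion token stock comes from the summary above; the 2024 dataset size and annual growth rate below are illustrative assumptions.

```python
import math

# Minimal sketch of a dataset-exhaustion projection.
# BASE_DATASET and GROWTH_PER_YEAR are assumed values for illustration.

STOCK_TOKENS = 300e12      # estimated stock of human-generated public text
BASE_YEAR = 2024
BASE_DATASET = 15e12       # assumed frontier training set size in 2024
GROWTH_PER_YEAR = 2.5      # assumed yearly growth in tokens used per run

# Years until a frontier training set equals the whole stock:
years_to_exhaustion = math.log(STOCK_TOKENS / BASE_DATASET, GROWTH_PER_YEAR)
print(f"Stock fully utilized around {BASE_YEAR + years_to_exhaustion:.0f}")
# ~2027 under these assumptions, inside the 2026-2032 range quoted above.
```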