Since 2010, the length of training runs has increased by 1.2x per year among notable models, excluding those fine-tuned from base models. A continuation of this trend would ease hardware constraints by increasing training compute without requiring more chips or power. However, longer training runs face a tradeoff: for very long runs, the gains from waiting for improved algorithms and hardware might outweigh the benefits of extended training.| Epoch AI
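A quick way to see what the 1.2x/year figure implies is to convert it into a doubling time and compound it forward. The minimal Python sketch below assumes smooth exponential growth; the 1.2x rate is the only number taken from the summary above.

import math

growth = 1.2  # run-length growth per year, from the summary above
doubling_years = math.log(2) / math.log(growth)  # ~3.8 years to double run length
print(f"Run length doubles every {doubling_years:.1f} years")

# At a fixed chip count and power budget, training compute scales linearly
# with run time, so a 1.2x longer run delivers 1.2x the compute.
for years in (1, 5, 10):
    print(f"After {years} years: {growth ** years:.2f}x longer runs")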
AI supercomputers double in performance every 9 months, cost billions of dollars, and require as much power as mid-sized cities. Companies now own 80% of all AI supercomputers, while governments’ share has declined.| Epoch AI
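For readers who prefer the 9-month doubling time as an annual rate, a short conversion (a sketch assuming smooth exponential growth; the 9-month figure is the only number taken from the summary above) gives roughly 2.5x per year.

doubling_months = 9  # from the summary above
annual_factor = 2 ** (12 / doubling_months)  # ~2.52x performance growth per year
print(f"Performance grows ~{annual_factor:.2f}x per year")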
Progress in pretrained language model performance outpaces expectations: it advances at a rate equivalent to doubling the available compute every 5 to 14 months.| Epoch AI
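The 5-to-14-month doubling range translates into a range of equivalent annual compute multipliers. A minimal sketch, assuming exponential progress, with both endpoints taken from the summary above:

# Convert each doubling time into an equivalent annual compute multiplier.
for months in (5, 14):
    factor = 2 ** (12 / months)  # ~5.28x at 5 months, ~1.81x at 14 months
    print(f"Doubling every {months} months ~ {factor:.2f}x effective compute per year")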
Why do we need to regulate the use of Artificial Intelligence? The EU AI Act is the world's first comprehensive AI law. It aims to address risks to health, safety and fundamental rights.| European Commission
We investigate four constraints on scaling AI training: power, chip manufacturing, data, and latency. We predict training runs of 2e29 FLOP will be feasible by 2030.| Epoch AI
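As a rough consistency check on the 2e29 figure: if frontier training compute keeps growing at about 4x per year (an assumption based on Epoch AI's published compute trend, not a figure from the summary above) from an assumed ~5e25 FLOP frontier run in 2024, the extrapolation lands near 2e29 by 2030.

# Sketch: extrapolating frontier training compute. The 2e29 target comes
# from the summary above; the baseline and growth rate are assumptions.
baseline_flop = 5e25   # assumed frontier training run, 2024
growth = 4.0           # assumed frontier compute growth per year
flop_2030 = baseline_flop * growth ** (2030 - 2024)
print(f"Extrapolated 2030 frontier: {flop_2030:.1e} FLOP")  # ~2.0e29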
If trends continue, language models will fully utilize the stock of human-generated public text between 2026 and 2032.| Epoch AI
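A back-of-the-envelope version of the exhaustion estimate, with every input below labeled as an assumption (the summary above gives only the 2026-2032 window):

import math

# All three numbers are illustrative assumptions, not figures from the summary.
stock_tokens = 3e14    # assumed ~300T tokens of public human-generated text
dataset_2024 = 1.5e13  # assumed ~15T-token frontier training dataset in 2024
growth = 2.5           # assumed dataset growth per year

years = math.log(stock_tokens / dataset_2024) / math.log(growth)
print(f"Stock fully used around {2024 + years:.0f}")  # ~2027, inside the 2026-2032 window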