Between 2012 and 2018, the amount of computing power used by record-breaking artificial intelligence models doubled every 3.4 months. Even with money pouring into the AI field, this trendline is unsustainable. Because of cost, hardware availability and engineering difficulties, the next decade of AI can't rely exclusively on applying more and more computing power to drive further progress.| Center for Security and Emerging Technology
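A quick arithmetic sketch of what that pace implies (the 3.4-month doubling time is the only input taken from the quote; the horizons are illustrative):

```python
# Back-of-the-envelope: growth implied by a 3.4-month compute doubling time.
# Only the 3.4-month figure comes from the quote; the horizons are illustrative.

DOUBLING_MONTHS = 3.4

def growth_factor(months: float) -> float:
    """Total multiplier on compute after `months` of steady doubling."""
    return 2 ** (months / DOUBLING_MONTHS)

# Roughly the 2012-2018 window OpenAI analyzed (~300,000x increase).
print(f"62 months:  {growth_factor(62):,.0f}x")

# One more decade at the same pace -- the unsustainable part.
print(f"120 months: {growth_factor(120):,.0f}x")
```

Holding the trend for another ten years would multiply compute demand by tens of billions, which is why cost and hardware availability become binding constraints.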
How much did it cost to train PaLM, Google's 540B-parameter language model? Somewhere around $9M to $23M.| Blog - Lennart Heim
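A rough sketch of how an estimate like this can be derived (a back-of-the-envelope under stated assumptions, not necessarily Heim's exact method): take the training compute reported in the PaLM paper (~2.5e24 FLOPs), divide by delivered hardware throughput to get chip-hours, then multiply by an assumed cloud price. The throughput, utilization, and price figures below are assumptions chosen to show the mechanics:

```python
# Back-of-the-envelope training-cost estimate for a PaLM-scale run.
# Generic cost model (a sketch, not a quoted methodology):
#   cost = total_FLOPs / (peak_FLOP/s * utilization) / 3600 * price_per_chip_hour

TOTAL_FLOPS = 2.5e24           # ~training compute reported in the PaLM paper
PEAK_FLOPS_PER_CHIP = 275e12   # TPU v4 peak bf16 throughput per chip (assumption)
UTILIZATION = 0.46             # PaLM reported ~46% model FLOPs utilization

chip_seconds = TOTAL_FLOPS / (PEAK_FLOPS_PER_CHIP * UTILIZATION)
chip_hours = chip_seconds / 3600  # ~5.5M chip-hours under these assumptions

# Hypothetical $/chip-hour range, chosen to bracket the quoted estimate.
for price in (1.6, 4.2):
    print(f"${price}/chip-hour -> ~${chip_hours * price / 1e6:.0f}M")
```

Note that the spread in the estimate comes almost entirely from the price assumption: committed-use or internal rates can be several times cheaper than on-demand list prices.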