We share the discovery of 2.2 million new crystals – equivalent to nearly 800 years’ worth of knowledge. We introduce Graph Networks for Materials Exploration (GNoME), our new deep learning tool...| Google DeepMind
The AI Scientist Generates its First Peer-Reviewed Scientific Publication| sakana.ai
In this Gradient Updates weekly issue, Ege discusses the case for multi-decade AI timelines.| Epoch AI
Our ML trends dashboard showcases key statistics on the trajectory of artificial intelligence, including compute, costs, data, hardware and more.| Epoch AI
Our new article explores whether deployment of advanced AI systems could lead to growth rates ten times higher than those of today’s frontier economies.| Epoch AI
It’s easy to dismiss alarming AI-related predictions when you don’t know where the numbers came from.| 80,000 Hours
To understand how close we are to transformative AI, here’s the metric I find most interesting right now: how long are the tasks AI can do?| Benjamin Todd
Continual learning is a huge bottleneck| www.dwarkesh.com
The economy will literally double every year afterwards| Dwarkesh Podcast
This episode explores the advances toward AGI in two recently released Chinese reasoning models: DeepSeek's R1 and Moonshot AI's Kimi.| The Cognitive Revolution
Available evidence suggests that rapid growth in reasoning training can continue for a year or so.| Epoch AI
AI companies are increasingly using AI systems to accelerate AI research and development. Today’s AI systems help researchers write code, analyze research papers, and generate training data. Future systems could be significantly more capable – potentially automating the entire AI development cycle from formulating research questions and designing experiments to implementing, testing, and refining new AI systems. We argue that such systems could trigger a runaway feedback loop in which the...| Forethought
Progress in pretrained language model performance outpaces expectations, occurring at a pace equivalent to doubling computational power every 5 to 14 months.| Epoch AI
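A quick sense of what that doubling range implies (the 5 and 14 month figures come from the item above; the 3-year horizon and the "effective compute" framing are my own illustrative choices):

```python
# Illustrative arithmetic only: how a 5-14 month effective-compute
# doubling time compounds over a few years.

def effective_compute_multiplier(months: float, doubling_time_months: float) -> float:
    """Multiplier on effective compute after `months` of algorithmic progress."""
    return 2 ** (months / doubling_time_months)

for dt in (5, 14):  # fast and slow ends of the estimated range
    print(f"doubling every {dt:>2} mo -> {effective_compute_multiplier(36, dt):,.0f}x over 3 years")
```

On these inputs, three years of algorithmic progress is worth anywhere from ~6x to ~150x more raw compute.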
I'm writing a new guide to careers to help artificial general intelligence (AGI) go well. Here's a summary of the bottom lines that'll be in the guide as it stands. Stay tuned to hear our full reasoning and updates as our views evolve. In short: The chance of an AGI-driven technological explosion before 2030 — creating one of the most pivotal periods in history — is high enough to act on.| 80,000 Hours
AI’s biggest impact will come from broad labor automation—not R&D—driving economic growth through scale, not scientific breakthroughs.| Epoch AI
The 20th century saw unprecedented change: nuclear weapons, satellites, the rise and fall of communism, third-wave feminism, the internet, postmodernism, game theory, genetic engineering, the Big Bang theory, quantum mechanics, birth control, and more. Now imagine all of it compressed into just 10 years.| 80,000 Hours
AI’s “train-once-deploy-many” advantage yields increasing returns: doubling compute more than doubles output by increasing models’ inference efficiency and enabling more deployed inference instances.| Epoch AI
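A toy sketch of why this produces increasing returns (the model and the 0.3 efficiency exponent are my illustrative assumptions, not Epoch's):

```python
# Toy model: output = deployed inference compute / cost per task.
# Doubling the budget doubles deployed compute directly, and also buys
# a better-trained model that is cheaper to run per task (assumed),
# so total output more than doubles.

def output(total_compute: float, efficiency_exponent: float = 0.3) -> float:
    inference_compute = total_compute                      # more budget -> more deployed copies
    cost_per_task = total_compute ** -efficiency_exponent  # assumed training-driven efficiency gain
    return inference_compute / cost_per_task               # net effect: output ∝ compute ** 1.3

print(output(2.0) / output(1.0))  # ~2.46: doubling compute roughly 2.5x's output
```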
The problem of criticising AI using outdated models| benjamintodd.substack.com
While scaling compute is key to improving LLMs, post-training enhancements can offer gains equivalent to 5-20x more compute at less than 1% of the cost.| Epoch AI
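Back-of-envelope on those figures (taking the 1% cost as an upper bound from the blurb, and my simplifying assumption that a 5-20x effective-compute gain bought with raw scaling costs 5-20x as much):

```python
# Cost-effectiveness of post-training vs. buying equivalent raw compute,
# with pretraining cost normalized to 1.
post_training_cost = 0.01  # "<1% of the cost" upper bound
for gain in (5, 20):
    compute_route_cost = gain  # assumed: a gain-x boost via scaling costs gain-x as much
    print(f"{gain}x effective gain: {compute_route_cost / post_training_cost:.0f}x cost advantage for post-training")
```

That is roughly a 500-2000x cost advantage at the quoted bounds.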
FrontierMath is a benchmark of hundreds of unpublished and extremely challenging math problems to help us understand the limits of artificial intelligence.| Epoch AI
Q&A with the Google AI head and Nobel laureate on the state of artificial intelligence today, and where it's heading.| www.bigtechnology.com
The second birthday of ChatGPT was only a little over a month ago, and now we have transitioned into the next paradigm of models that can do complex reasoning. New years get people in a reflective...| Sam Altman
Data movement bottlenecks limit LLM scaling beyond 2e28 FLOP, with a “latency wall” at 2e31 FLOP. We may hit these in ~3 years. Aggressive batch size scaling could potentially overcome these limits.| Epoch AI
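A rough extrapolation behind the "~3 years" claim (the ~5e26 FLOP current frontier and ~4x/year growth rate are my assumed inputs, not Epoch's exact ones):

```python
import math

# When does trend growth in frontier training compute hit the two walls?
current_flop = 5e26    # assumed size of today's largest training runs
growth_per_year = 4.0  # assumed yearly growth in frontier training compute

for wall, name in ((2e28, "data movement limit"), (2e31, "latency wall")):
    years = math.log(wall / current_flop) / math.log(growth_per_year)
    print(f"{name} ({wall:.0e} FLOP) reached in ~{years:.1f} years")
```

On these inputs the first wall arrives in about 2.7 years and the latency wall in about 7.6.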
We investigate four constraints to scaling AI training: power, chip manufacturing, data, and latency. We predict 2e29 FLOP runs will be feasible by 2030.| Epoch AI
If trends continue, language models will fully utilize the stock of human-generated public text between 2026 and 2032.| Epoch AI
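The shape of that projection (the ~300T-token stock, ~15T-token current training sets, and ~2.5x/year growth are rough assumed inputs of mine, not Epoch's exact figures):

```python
import math

# Years until trend growth in tokens-per-run exhausts the public text stock.
stock = 300e12   # assumed usable public human-generated text, in tokens
current = 15e12  # assumed tokens in today's largest training sets
growth = 2.5     # assumed yearly growth in tokens used per run

years = math.log(stock / current) / math.log(growth)
print(f"public text stock fully utilized in ~{years:.1f} years")  # ~3.3 on these inputs
```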
The Nobel Prize in Chemistry 2024 was divided, one half awarded to David Baker "for computational protein design", the other half jointly to Demis Hassabis and John Jumper "for protein structure prediction"| NobelPrize.org
Our state-of-the-art model delivers 10-day weather predictions at unprecedented accuracy in under one minute| Google DeepMind
[This post was up a few weeks ago before getting taken down for complicated reasons. They have been sorted out and I’m trying again.] Is scientific progress slowing down? I recently got a cha…| Slate Star Codex