The past few days have seen a back-and-forth between Scott Alexander and Gary Marcus on the topic of AI scaling (post1, post2, post3, post4). Specifically, the debate is whether scaled-up language models in the style of GPT-3 will eventually become general intelligences, or whether we will hit some fundamental limits…| jacobbuckman.com
…and it is called active learning, and it’s not very impressive. The connection is pretty simple to see. Let’s start by outlining what a “recursively self-improving AI” would look like. To start, there’s some code. It gets compiled into an executable function. This is then evaluated, giving some score or…| jacobbuckman.com (a code sketch of the loop this excerpt describes follows these excerpts)
The current paradigm will not lead to an overnight superintelligence explosion.| jacobbuckman.substack.com
We call on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4.| Future of Life Institute
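The second excerpt above only outlines the code → compile → evaluate loop in prose, so here is a minimal sketch of that structure, under the assumption that the "self-improvement" step can be stood in for by a toy program-rewriting routine. Every helper name and the scoring task below are illustrative placeholders, not details from the linked post.

```python
# Hypothetical sketch of the loop described in the second excerpt above:
# some source code is compiled into an executable function, the function is
# evaluated to get a score, and the code is then modified in search of a
# higher score. Names and the toy task are placeholders, not from the post.
import random

SEED_SOURCE = "def f(x):\n    return x"  # starting program (placeholder)

def compile_program(source: str):
    """Turn source text into a callable function named f."""
    namespace: dict = {}
    exec(source, namespace)
    return namespace["f"]

def evaluate(fn) -> float:
    """Score the compiled function (toy task: match f(x) = 2 * x)."""
    return -sum((fn(x) - 2 * x) ** 2 for x in range(10))

def propose_edit(source: str) -> str:
    """Stand-in for the 'self-modification' step: rewrite the program."""
    k = random.randint(1, 5)
    return f"def f(x):\n    return {k} * x"

best_source = SEED_SOURCE
best_score = evaluate(compile_program(best_source))
for _ in range(20):
    candidate = propose_edit(best_source)          # modify the code
    score = evaluate(compile_program(candidate))   # compile and score it
    if score > best_score:                         # keep improving variants
        best_source, best_score = candidate, score

print(best_source)
print(best_score)
```

This is only meant to make the structure concrete; the linked post's own argument continues past the truncation point in the excerpt.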