Predicting AGI by the Turing Test – Yuxi on the Wired
https://yuxi-liu-wired.github.io/essays/posts/perplexity-turing-test/
Tagged with: math, ai, scaling
Minimizing log-perplexity loss is equivalent to maximizing expected survival length in a Turing test. Assuming the compute–loss scaling law, a scaled-up GPT that produces human-like science papers would cost roughly 200 years of global GDP.
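The quantitative link can be sketched as follows. Under a Wald sequential-test heuristic, a judge accumulates roughly one per-token KL divergence of evidence per token, so expected survival length scales as the judge's evidence threshold divided by the model's excess loss (log-perplexity above the human entropy rate). Inverting a power-law loss curve then gives the compute needed for a target survival length. The minimal sketch below uses illustrative placeholder constants (`L0`, `A`, `alpha`, and the evidence threshold are hypothetical, not fitted values from the essay):

```python
import math

def survival_length(excess_loss_nats: float,
                    evidence_threshold_nats: float = math.log(100)) -> float:
    """Expected tokens before a judge distinguishes model text from human
    text, via the Wald SPRT heuristic: evidence accrues at ~KL nats/token,
    where KL is approximated by the model's excess loss over human entropy."""
    return evidence_threshold_nats / excess_loss_nats

def compute_for_excess_loss(target_excess: float,
                            A: float = 2.0, alpha: float = 0.05) -> float:
    """Invert a toy power-law scaling curve L(C) = L0 + A * C**(-alpha),
    returning the compute C at which excess loss L(C) - L0 hits the target.
    Constants are placeholders for illustration only."""
    return (A / target_excess) ** (1.0 / alpha)

# Halving the excess loss doubles expected survival length,
# but (with alpha = 0.05) multiplies required compute by 2**20.
tokens_survived = survival_length(0.01)          # ~460 tokens at 0.01 nat/token
compute_ratio = (compute_for_excess_loss(0.005)
                 / compute_for_excess_loss(0.01))
```

The punchline of the essay falls out of the shape of these two functions: survival length improves only linearly in excess loss, while compute cost blows up as a steep power of it, which is how a paper-length Turing test ends up costing centuries of global GDP.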