Curtis Huebner is the head of Alignment at EleutherAI. In this episode we discuss the massive orders of H100s from different actors, why he thinks AGI is 4-5 years away, why he thinks the probability of AI extinction is around 90%, his comments on Eliezer Yudkowsky’s Death with Dignity, and the Alignment projects currently going on at EleutherAI, especially a project with Markov chains and the Alignment Minetest project that he is currently leading.