(Coauthored with others on the alignment team and cross-posted from the Alignment Forum: part 1, part 2) A sharp left turn (SLT) is a possible rapid increase in AI system capabilities (such as planning and world modeling) that could result in alignment methods no longer working. This post aims to make the sharp left turn scenario […]