Americans’ perceptions of diversity, equity and inclusion (DEI) are showing signs of decline as they come to understand how the term has been weaponized to push radical political agendas. While Americans widely believe everyone, regardless of race, gender or background, should have a chance to succeed, they are increasingly cooling toward programs that inject division into workplaces or fan animus between racial groups.