When responding to war on terror prompts, the ChatGPT bot made all the wrong decisions about torture and racial profiling. | The Intercept