When critics misunderstand Trust & Safety, it’s not just frustrating — it blocks the path to a healthier internet. Here’s what we need from allies and other stakeholders, and what real collaboration with T&S looks like in practice.| Everything in Moderation*
Many T&S professionals stay silent about their work — and that's a problem. Ben and Alice explore why more folks should speak up, what’s holding them back, and how to share safely and strategically.
New research on wellness programs for moderators shows we’re still far from ensuring that the people doing this emotionally demanding work are truly supported.
Fewer users doesn’t mean fewer risks — bad actors thrive when harm is concentrated among smaller, more active audiences. Platforms must move beyond user reports to stay ahead.
Throughout my career, I’ve struggled with the problem of how to enforce policies in a fair, accurate, and scalable way. A new research paper reminds us just how difficult that is.
A new report argues that, without industry-wide standards or codes of practice, T&S professionals are vulnerable to corporate pressures and destined always to be reactive to companies' conflicting priorities. The answer? Greater independence.
Much of the work I've done over the years has circled around the problem of how to balance human moderation and automation. Now I'll be doing it within an AI moderation company.
By being more explicit about which values are important (and why), platforms can make it easier for users to decide whether it's the place for them.