Conversations about kids and digital safety are often clouded by moral panic and oversimplification. By exploring the trade-offs of digital devices and platforms, we can empower parents and schools to make decisions that prioritise both safety and connection for young people.| Everything in Moderation*
All you need to know about content moderation and the policies, products, platforms and people shaping its future
The 2018 standards set the benchmark for moderation transparency and were adopted by the world’s biggest platforms. But with recommendation algorithms and AI now shaping online speech before it’s even published, the Principles may need updating.
The week in content moderation - edition #302
Trust & Safety work rarely has a single focus. From AI to regulation to team well-being, the data suggests T&S workers are spinning dozens of plates at once. But is this a problem to fix, or simply the nature of the job?
The week in content moderation - edition #301
As AI transforms how Trust & Safety work gets done, it’s also quietly dismantling the career paths that got many of us here. This week, a T&S Insider reader asks what to do when all the entry-level roles disappear.
The week in content moderation - edition #300
When critics misunderstand Trust & Safety, it’s not just frustrating — it blocks the path to a healthier internet. Here’s what we need from allies and other stakeholders, and what real collaboration with T&S looks like in practice.
The week in content moderation - edition #299
The week in content moderation - edition #298
The Trust & Safety community’s biggest moment of the year is here. From merch to must-see panels, here's what you need to know.
Many T&S professionals stay silent about their work — and that's a problem. Ben and Alice explore why more folks should speak up, what’s holding them back, and how to share safely and strategically.
The week in content moderation - edition #297
Fractional T&S leadership sounds good on paper. But if safety is the product, is part-time influence ever enough? Here's my response to a question from a T&S Insider reader.
The week in content moderation - edition #296
Meta’s refusal to follow the Board’s recommendations on LGBTQ+ hate speech could be the beginning of the end for this much-debated platform accountability experiment
The week in content moderation - edition #295
I recently learned that even thoughtful, well-intentioned schools often lack strong safeguards on internet-connected devices. Here’s a list of questions to ask — and a template you can use to start the conversation
Large Language Models are being tested for everything from transparency to content review. But could they help modernise one of the oldest T&S processes — how users report harm and appeal moderation decisions?
Despite their ubiquitous use, user reports don't always drive effective moderation or meaningful change in platform policy. Is there a better approach?
As the Trust & Safety industry matures, we're seeing new types of role emerge that didn't exist five years ago. For each of them, a working knowledge of AI is the bare minimum.
New research on wellness programs for moderators shows we’re still far from ensuring that the people doing this emotionally demanding work are truly supported.
The week in content moderation - edition #292
Newcomers to the Trust & Safety world often ask me what it's like to work in the industry and the things I wish I'd known before I started. So here are my ten hard-won lessons for the next generation of online safety professionals.
Sometimes we have to invest in Trust & Safety because it’s the right thing to do, not because there will be a return on the investment. Here are some suggestions for alternatives to traditional ROI calculations.
The week in content moderation - edition #291
Fewer users doesn’t mean fewer risks — bad actors thrive when harm is concentrated among smaller, more active audiences. Platforms must move beyond user reports to stay ahead.
Throughout my career, I’ve struggled with the problem of how to enforce policies in a fair, accurate, and scalable way. A new research paper reminds us just how difficult that is
The week in content moderation - edition #290
Most T&S professionals—whether they admit it or not—have a line they won’t cross for their company. But when you're in the middle of a major, public failure, it can be hard to know what to do. Here’s my take on what to consider before quitting.
The week in content moderation - edition #289
Everything in Moderation* is your global guide to the forces shaping the future of online speech and the internet. Every week, it brings you need-to-know news and analysis about platform policy, content moderation and internet regulation. There are two EiM editions a week: * Trust & Safety Insider - Alice Hunsberger shares
A Trust & Safety crisis doesn’t just risk reputational damage — it grabs leadership’s attention too. That’s your moment to make the case for investment in your team. Here’s how I would turn a T&S incident into a strategic win.
A new report argues that, without industry-wide standards or codes of practice, T&S professionals are vulnerable to corporate pressures and destined to always be reactive to companies' conflicting priorities. The answer? Greater independence.
Much of the work I've done over the years has circled around the problem of how to balance human moderation and automation. Now I'll be doing it within an AI moderation company
By being more explicit about what values are important (and why), platforms can make it easier for users to decide if it's the place for them
The week in content moderation - edition #236