Access trust and safety resources and the latest news on emerging threats to online child safety, plus Safer by Thorn product updates.| Purpose-Built Trust and Safety Solutions | Safer by Thorn
Explore new data on online child sexual exploitation and the evolving risks youth face.
Deepfake technology is evolving at an alarming rate, lowering the barrier for bad actors to create hyper-realistic explicit images in seconds—with no technical expertise required. For trust and safety teams, this presents an urgent challenge: AI-generated deepfake nudes are accelerating the spread of nonconsensual image abuse, reshaping how young
Online child sexual abuse and exploitation have continued to rise for more than a decade. Understand the scale of the issue and the emerging trends to watch for.
CSAM has distinct characteristics that call for purpose-built solutions, such as CSAM classifiers. Thorn's data science team dives into the details.
Know the signs that could indicate your platform is at risk of hosting CSAM images and videos. Understand what types of sites are most at risk.
A glossary of special terms and acronyms used within the child safety ecosystem. What's CSAM? CSEA? CSAI? Do they refer to different harms?
There are several child safety organizations that make their CSAM hash lists available to select partners via sharing programs. Learn more.