Discover why a responsible AI framework like Safety by Design must be the standard for innovation in generative AI technology.
Built by experts in child safety technology, Safer helps protect platforms and their users by providing industry-leading tools for proactive CSAM detection.
Access trust and safety resources and the latest news and emerging threats to online child safety. Plus, Safer by Thorn product updates.
Information for trust and safety professionals about different types of online child sexual perpetrators.
Watch Dr. Rebecca Portnoff’s keynote at AWS re:Invent 2023 to learn how Thorn is using machine learning to detect CSAM.
Detecting CSAM within video content presents unique challenges. To solve this, Thorn's engineers developed a proprietary hashing technology called SSVH.
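SSVH itself is proprietary, so the sketch below shows only the general shape of perceptual video hashing: sample frames at a fixed rate, hash each frame perceptually, and collapse near-identical runs into per-scene hashes. The function name, sampling interval, and scene-cut distance are illustrative assumptions, not details of SSVH.

```python
# A minimal sketch of perceptual video hashing, NOT Thorn's proprietary SSVH.
import cv2                      # pip install opencv-python
import imagehash                # pip install ImageHash
from PIL import Image

def scene_hashes(video_path, sample_every_s=1.0, scene_cut=12):
    """Return one perceptual hash per detected 'scene' in the video."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    step = max(1, int(fps * sample_every_s))   # frames between samples
    hashes, frame_idx, prev = [], 0, None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % step == 0:
            h = imagehash.phash(Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)))
            # A large Hamming distance from the previous sample marks a scene cut.
            if prev is None or h - prev > scene_cut:
                hashes.append(h)
            prev = h
        frame_idx += 1
    cap.release()
    return hashes
```

Hashing scenes rather than every frame keeps the representation compact while still surviving re-encoding and minor edits.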
The REPORT Act is now federal law. We provide details about its components and explain how it will impact online platforms.
New research reveals the state of sextortion in 2025, with specific insights relevant to platform trust and safety teams.
In 2020, VSCO added Safer to its Trust & Safety team’s workflow, making it possible to proactively scan all uploaded content for CSAM.
Expert-backed CSAM detection and CSE solutions, including classifiers for image, video, and text-based harms against children. See the Safer difference.
In 2024, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM and CSE on their platforms.
Safer is a flexible suite of tools designed to support your company’s processes and scale your child sexual abuse material (CSAM) elimination efforts.
Explore new data on online child sexual exploitation and the evolving risks youth face.
Find out why content moderation matters for every platform, and get a step-by-step approach to finding your content moderation solution.
Deepfake technology is evolving at an alarming rate, lowering the barrier for bad actors to create hyper-realistic explicit images in seconds, with no technical expertise required. For trust and safety teams, this presents an urgent challenge: AI-generated deepfake nudes are accelerating the spread of nonconsensual image abuse.
Addressing the risks of nonconsensual image abuse: AI-generated deepfake nudes are accelerating the spread of nonconsensual image abuse, making it easier for bad actors to manipulate and weaponize imagery of children and adults alike. Our latest research at Thorn found that 31% of teens are already familiar with deepfake nudes.
In 2023, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM from their platforms.
GIPHY proactively detects CSAM with Safer to deliver on its promise of being a source for content that makes conversations more positive.
Everest Group has recognized Safer by Thorn as one of their content moderation technology trailblazers. Read why our purpose-built solutions made the short list.
Online child sexual abuse and exploitation has continued to rise for more than a decade. Understand the scale of the issue and the emerging trends to watch for.
This 7-part series is an essential guide to understanding CSAM and the available detection strategies leveraged by trust and safety professionals.
Explore Thorn's predictions for online child safety in 2025, from regulatory changes to AI challenges, and how these trends will shape the future of trust and safety.
As Bluesky welcomes 7M+ new users, Thorn's Safer technology helps protect its growing community from CSAM. Learn how Safety by Design enables responsible platform growth.
Content moderation is crucial for business growth. Trust & Safety initiatives protect users, boost engagement, and improve company value.
Announcing the launch of our API-based solution for proactive detection of child sexual abuse material (CSAM): Safer Match.
John Starr puts several trust and safety leaders in the “hot spot” for a lightning round of questions to get their POV on what it’s like to work in trust and safety.
Discover the human side of trust and safety in this candid conversation between Patricia Cartes, Head of Trust & Safety at Cantina AI, and host John Starr, Thorn’s VP of Strategic Impact.
The CSAM Keyword Hub contains words and phrases related to CSAM and child sexual exploitation, boosting content moderation efforts.
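The Hub's actual contents and delivery format aren't public, but keyword matching itself is straightforward. Here is a minimal sketch assuming a plain list of phrases and a regex-based matcher; the placeholder terms and function names are invented for illustration.

```python
# A minimal sketch of keyword-list matching of the kind a resource like the
# CSAM Keyword Hub could feed; terms and format here are placeholders.
import re

def compile_keyword_matcher(keywords):
    """Build one case-insensitive regex that finds any listed phrase."""
    # Longest-first alternation keeps multi-word phrases from being
    # shadowed by shorter ones.
    pattern = "|".join(re.escape(k) for k in sorted(keywords, key=len, reverse=True))
    return re.compile(rf"\b(?:{pattern})\b", re.IGNORECASE)

matcher = compile_keyword_matcher(["example term", "another phrase"])  # placeholder terms
hits = matcher.findall("User message text to screen, containing another phrase.")
if hits:
    print("flag for review:", hits)   # route to a human moderator, never auto-action
```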
Discover the human side of trust and safety in this candid conversation between Jerrel Peterson, Director of Content Policy at Spotify, and host John Starr, Thorn’s VP of Strategic Impact.
Discover the human side of trust and safety in this candid conversation between Yoel Roth, VP at Match Group, and host John Starr, Thorn’s VP of Strategic Impact.
Learn key insights from Thorn’s beta period and explore how our text classifier can help platforms proactively detect child sexual exploitation.
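Thorn's text classifier is accessed through Safer rather than as a public model, so this sketch stands in with a generic Hugging Face pipeline. The model name, label, and threshold are hypothetical placeholders, not Thorn's classifier.

```python
# A minimal sketch of triaging messages with a text classifier; the model
# name and "RISK" label are invented placeholders for illustration.
from transformers import pipeline   # pip install transformers

clf = pipeline("text-classification", model="your-org/risk-text-classifier")  # hypothetical model

REVIEW_THRESHOLD = 0.85   # assumed operating point; tune against labeled data

def triage(messages):
    """Score each message and return those that warrant human review."""
    results = clf(messages, truncation=True)
    return [(m, r["score"]) for m, r in zip(messages, results)
            if r["label"] == "RISK" and r["score"] >= REVIEW_THRESHOLD]
```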
Our latest solution, Safer Predict, is a cutting-edge AI-driven tool that detects new and unreported CSAM as well as harmful text conversations.
Teens speak candidly about their experiences using in-platform safety tools, and how platforms can improve these crucial features to better serve kids.
Four considerations for Trust and Safety teams at digital platforms as they review their child safety policies.
Generative AI is being misused to perpetrate child sexual abuse. Digital platforms need to understand the risks this threat poses to their platforms, users, and children.
Thorn's policy team explains the Kids Online Safety Act (KOSA) and how the provisions in this bill may impact digital platforms.
On January 31, the CEOs of Meta, TikTok, Snap, and Discord testified during the hearing, "Big Tech and the Online Child Sexual Exploitation Crisis."
CSAM has distinct characteristics that call for purpose-built solutions, such as CSAM classifiers. Thorn's data science team dives into the details.
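As a rough illustration of how a purpose-built classifier slots into a moderation pipeline, here is a minimal scoring sketch. The backbone is a generic, untrained stand-in, not Thorn's classifier, and the review threshold is an assumed operating point.

```python
# A minimal sketch of classifier-based image triage; the model is an
# untrained placeholder and the threshold is an assumption.
import torch
from torchvision import models, transforms
from PIL import Image

model = models.resnet50(weights=None)                  # stand-in backbone
model.fc = torch.nn.Linear(model.fc.in_features, 2)    # [benign, flagged]
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256), transforms.CenterCrop(224),
    transforms.ToTensor(),
])

def needs_review(image_path, threshold=0.8):
    """Return True if the image should be queued for human review."""
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        prob = torch.softmax(model(x), dim=1)[0, 1].item()
    return prob >= threshold
```

Classifier scores route content to trained reviewers; unlike hash matching, this can surface previously unknown material.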
Detect known CSAM using hashing and matching, sometimes referred to as CSAM scanning. Learn how it works.
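A minimal sketch of how hashing and matching works in practice, combining an exact cryptographic hash with a perceptual hash compared by Hamming distance. The hash sets are empty placeholders; real lists come only through authorized sharing programs, as noted below.

```python
# A minimal sketch of hashing and matching against a known-CSAM hash list.
import hashlib
import imagehash                # pip install ImageHash
from PIL import Image

KNOWN_MD5 = set()        # exact cryptographic hashes from a vetted source
KNOWN_PHASH = set()      # perceptual hashes, stored as hex strings

def matches_known(image_path, max_distance=6):
    """Check a file against exact and perceptual known-hash lists."""
    with open(image_path, "rb") as f:
        data = f.read()
    if hashlib.md5(data).hexdigest() in KNOWN_MD5:       # exact-file match
        return True
    h = imagehash.phash(Image.open(image_path))
    # Perceptual matching survives re-encoding and small edits, so compare
    # by Hamming distance rather than equality.
    return any(h - imagehash.hex_to_hash(k) <= max_distance for k in KNOWN_PHASH)
```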
Know the signs that could indicate your platform is at risk of hosting CSAM images and videos. Understand what types of sites are most at risk.
In 2022, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM from their platforms.
Flickr’s Trust & Safety team uses Safer’s CSAM Image Classifier to detect and remove previously unknown child abuse content from their platform.
In 2021, Safer empowered content moderators and trust & safety professionals to detect, report and remove CSAM from their content-hosting platforms.
A glossary of special terms and acronyms used within the child safety ecosystem. What's CSAM? CSEA? CSAI? Do they refer to different harms?
There are several child safety organizations that make their CSAM hash lists available to select partners via sharing programs. Learn more.
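Formats vary by program, so this ingestion sketch assumes a simple two-column CSV of algorithm and hash value; the layout is an assumption, not any program's actual schema.

```python
# A minimal sketch of ingesting a shared hash list; the CSV layout is an
# invented example, since each sharing program defines its own format.
import csv

def load_hash_list(path):
    """Read 'algorithm,hash' rows into per-algorithm sets for fast lookup."""
    lists = {}
    with open(path, newline="") as f:
        for algo, value in csv.reader(f):
            lists.setdefault(algo.lower(), set()).add(value.strip().lower())
    return lists

# e.g. lists = load_hash_list("shared_hashes.csv"); the "md5" and "phash"
# keys could then back the matching sketch shown earlier.
```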
Detect CSAM and send reports to the Royal Canadian Mounted Police from Safer, an all-in-one solution for CSAM moderation.