Our children are growing up in a digital world, and predators are taking advantage of this new reality, exploiting kids in these digital spaces and creating new vulnerabilities for them.
Every day, hundreds of thousands of images and videos of children suffering some of the worst sexual abuse imaginable are shared online, while the child victims wait to be found. Last year alone, more than 100 files of child sexual abuse material were reported every minute.
New threats like financial sextortion and deepfake nudes are continually emerging, while persistent dangers like online grooming are accelerating.
A recent Thorn survey found that 40% of kids aged 9-17 have been approached online by someone attempting to befriend and manipulate them. To put it into perspective, that's roughly 710,000 kids in the San Francisco Bay Area being targeted for grooming.
The scale, severity, and persistent growth of this problem make it clear: child sexual exploitation and abuse is a global public health crisis that demands an urgent, coordinated response.
Our children can't wait.
This heartbreaking reality demands a more comprehensive approach to protecting children, which is why we're introducing our next chapter with a new Thorn mission statement: to transform how children are protected from sexual abuse and exploitation in the digital age.
We're building a digital safety net for kids: a multi-layered protection strategy for the digital world. While playgrounds were designed with safety surfaces and cars were equipped with safety restraints, the digital spaces where children now spend much of their time remain dangerously unprotected.
Our digital safety net addresses this urgent gap through cutting-edge technology, original research, and collaborative partnerships. It creates multiple layers of protection that work together to transform how children are protected by:
- Creating safer online environments and making the internet an unwelcoming place for abuse.
- Reducing the time it takes for investigators to find child victims and remove them from harm.
- Empowering the broader child protection community with the latest research and technical resources to better combat abuse.
To accomplish this transformation, we've identified four pillars on which our work will focus going forward. Together, these pillars create an integrated framework for moving the entire child safety ecosystem forward.
Four pillars to support the new Thorn mission.
Research and Insights
Understanding the rapidly changing nature of child sexual abuse and exploitation is fundamental to our mission. At Thorn, we don't just respond to threats; we get ahead of them through rigorous research that centers the experiences of young people themselves, and by developing safety frameworks to address these threats.
With our original research, we're not just gathering insights; we're driving awareness and action across platforms, NGOs, policymakers, and law enforcement agencies worldwide. By mapping digital threats and developing safety frameworks for critical technologies, we're creating a foundation for all our work and accelerating the responsiveness of the entire child protection ecosystem.
Technical Innovation
Technological advances that create new risks can also become our most powerful protection tools. At Thorn, our world-class machine learning engineers harness the power of AI to build detection tools and safety technologies that continually improve and keep pace with emerging harms.
This technical innovation is a critical part of our digital safety net: it powers our other pillars, providing the essential tools for child victim identification and platform safety solutions. We remain at the forefront of child protection technology by maintaining and continually improving our image, video, and text classification capabilities while also addressing emerging threats with new technological solutions.
Child Victim Identification
Behind every image or video of abuse is a real child waiting to be found and protected from harm. When a child is in an active abuse situation, every second matters, yet investigators face a daunting needle-in-a-haystack challenge: sifting through overwhelming volumes of digital evidence to find the children who need help most urgently. Finding relevant, actionable information quickly is essential to safeguarding a victim, whether an investigator is sorting through millions of images and messages on the dark web to identify child victims being actively abused, or triaging the thousands of cases in their jurisdiction reported through NCMEC's CyberTipline.
Our newest product, Thorn Detect, is a powerful tool for forensic investigations. It helps investigators prioritize the most critical files amid overwhelming case volumes by quickly detecting potential child sexual abuse images and videos. The feedback we've received confirms what we've always believed: when we equip the frontlines with better technology, more children are found faster and removed from harmful situations. I'm particularly proud that our victim identification tools are now being used in 36 countries worldwide, across 700 law enforcement agencies, creating a truly global impact for children in danger.
Platform Safety
Every time an image or video of child sexual abuse is shared online, it perpetuates trauma for the victim. That's why our platform safety pillar is dedicated to combating child sexual abuse and exploitation across the open web by equipping tech platforms with purpose-built solutions and expert consulting.
In 2024, our impact in this area reached an unprecedented scale:
- Over 4.1 million suspected CSAM files were detected using our tools
- More than 112 billion total files were processed through our detection technology
- 60+ companies are now using Thorn's CSAM detection products to create safer digital environments
Looking forward
This evolution of our mission isn't just about responding to today's threats; it's about building a sustainable framework that can adapt as technology advances. The four pillars work together, each strengthening the others: research informs our technology development, which in turn powers our victim identification and platform safety solutions.
Our dual-income model means three of the four pillars are primarily philanthropically funded. Revenue from our tech industry products supports platform safety, while research and insights, technical innovation, and child victim identification rely on donor support to reach our goals.
We have lofty ambitions for 2025. These include continuing to grow our Safety by Design initiative for generative AI, releasing new and updated research on youth's experiences with sextortion and other online threats, and continuing to build advanced detection tools.
As we pursue our new mission, I invite you to join us and help transform how we protect kids in the digital age. Together, we can build a world where every child is free to simply be a kid.
Thank you for your unwavering dedication to this mission and the children we serve.