Each year, the child safety ecosystem looks to the National Center for Missing & Exploited Children's CyberTipline Report for a snapshot of the online exploitation landscape. The data (how many reports were made by platforms and the public, how many files of suspected child sexual abuse were shared, and how many depicted toddlers versus teens) offers one of the few indicators we have of the scale and nature of technology-facilitated abuse.
And every year, we ask: What do the numbers mean? Are we making progress, or falling behind?
This year, the answer is yes… to both. We're making progress. We're losing ground. And this remains only the tip of the iceberg.
One thing we know for a fact is that the scale of abuse is still staggering. In 2024, the CyberTipline received 20.5 million reports, including nearly 63 million files: images, videos, and other materials related to child sexual exploitation.
Each report, each file, each incident reflects a child who has been harmed. So while the numbers may be lower than what we saw in last year's data, they remain unacceptably high, and they must be addressed through continued vigilance, innovation, and cross-sector collaboration.
The impact of technology and awareness
We're seeing growing evidence that both technological innovation and public awareness are influencing the reporting pipeline in ways that improve detection and prevention, while new technologies also introduce new challenges for child safety:
- Bundling: One of the notable declines in this year's reporting may be explained by NCMEC's introduction of report "bundling," which consolidates duplicate tips tied to a single viral incident.
- Platform changes: Updates like default end-to-end encryption (E2EE) and revised content policies are likely altering what content is detected and how it's reported. These changes matter: they reflect evolving approaches to privacy, safety, and trust & safety design.
- Policy momentum: The REPORT Act, enacted in 2024, now mandates that platforms report instances of online enticement and child sex trafficking. That policy shift likely contributed to the spike in online enticement reports, showing that child safety legislation paired with platform compliance can improve visibility into specific kinds of harm.
- Public detection and response: As we've seen with sextortion, public recognition of emerging threats can play a pivotal role in surfacing harm that platforms miss. This year's surge in public reports tied to violent online groups highlights both a growing willingness to report and ongoing gaps in detection and disruption by platforms.
Key findings from the 2024 NCMEC CyberTipline Report
A closer look at this year's data reveals several important trends and notable shifts in the child safety landscape:
- 20.5 million reports of suspected child sexual exploitation were submitted to NCMEC in 2024, a 43% decrease from the 36.2 million reports in 2023. However, when adjusted for incidents (to account for bundled reports), the figure is 29.2 million distinct incidents, still reflecting a staggering scale of harm.
- 62.9 million files were included in 2024 reports: 33.1 million videos, 28 million images, and nearly 2 million other file types. These files are evidence of abuse, and every one is tied to suspected abuse or exploitation of a child.
- Online enticement (crimes involving an adult communicating with a child for sexual purposes) reports rose 192%, reaching more than 546,000 tips. This dramatic increase is likely due in part to the new REPORT Act, which requires companies to report online enticement and child sex trafficking for the first time.
- Reports involving generative AI surged by 1,325%, climbing from 4,700 in 2023 to 67,000 in 2024. While this remains a small share of total reports, it's a clear signal that AI-generated child sexual abuse material (AIG-CSAM) is growing, and it demands proactive safety interventions like Safety by Design, ethical AI development, and robust transparency reporting.
- NCMEC also saw more than 1,300 reports tied to violent online groups, representing a 200% increase from 2023. These groups promote sadistic forms of abuse, including self-harm, sibling exploitation, and animal cruelty. Strikingly, 69% of these reports came from members of the public, such as parents or caregivers, underscoring a high-stakes gap in detection by platforms.
Protecting the children behind the numbers
No number of suspected exploitation reports is acceptable in the world we want for our children. 63 million suspected abuse files are far too many files.
Behind each file and report is a child: someone experiencing abuse, coercion, or exploitation. That's the reality we cannot lose sight of.
And while changes in reporting systems, technologies, and policies can all shift the numbers year over year, what remains constant is the urgent need for a smarter, more unified response. Lower numbers don't necessarily mean less abuse. In some cases, they mean less visibility into it.
That's why Thorn continues to champion a broader, more resilient approach to child safety that includes things like:
- Adapting technologies and platform design to mitigate risks from increased use of E2EE and updated content policies, which may impact what's detectable, alongside a new generation of technology companies stepping up to proactively address these risks through responsible reporting and intervention.
- Transparency reporting from online platforms, helping the entire child protection ecosystem and the general public understand what platforms are detecting, how they approach child safety, and what they may be missing.
- Safety by Design principles that tech companies can follow and adopt early in technology development, so platforms are built with child safety in mind from the outset.
- Robust detection tools, including AI-powered classifiers that help identify, review, and report abusive content before it spreads.
- Support services for victims and survivors, who often experience revictimization each time their abuse material resurfaces online.
- Well-resourced law enforcement, equipped with the tools and staffing needed to identify more child victims faster.
- Original, youth-centered research to surface emerging threats and ensure we understand how abuse evolves in digital spaces.
- Cross-sector collaboration, because no single actor (platform, policymaker, or nonprofit) can solve this challenge alone.
Closing thoughts
No matter how the numbers change year over year, Thorn's mission remains steadfast: To transform the way children are protected from sexual abuse and exploitation in the digital age.
Real progress requires that we complement what we learn from these numbers by listening to what children are experiencing today, investing in the systems that can protect them tomorrow, and addressing the threats hiding beneath the surface of the data.
Read the full 2024 CyberTipline Report here, and visit thorn.org to learn more about how we're building a safer world for children.