The Work Never Stops: A First Look at NCMEC’s 2025 Data
At NCMEC, the work to find missing children and combat online child sexual exploitation never stops. Each year, our data reflects the scope of that mission and reminds us that behind every number is a child who needs help. While NCMEC’s full 2025 Impact Report will be released soon, this snapshot offers an early look at key trends shaping our work.
Every year, NCMEC witnesses new and disturbing threats to children online. Since the CyberTipline was created, NCMEC has responded to more than 226 million reports relating to child sexual exploitation, including more than 300,000 reports related to child sex trafficking. In 2025 alone, the CyberTipline received 21.3 million reports that included more than 61.8 million images, videos and other files related to suspected child sexual exploitation.
Among the 2025 trends was a sharp rise in reports related to generative AI (GAI) technology. In 2025, NCMEC received more than 1.5 million CyberTipline reports indicating a nexus between GAI and child sexual exploitation. These reports included images and videos that NCMEC staff assessed and labeled as AI-generated, as well as other files, chats and activity where GAI was connected to the exploitation in some way.
Of those 1.5 million reports, 1.1 million were submitted by Amazon AI Services and contained no actionable information. Excluding those submissions:
- More than 12,000 reports involved child sexual abuse material (CSAM) identified within AI training data;
- More than 7,000 reports involved users generating or possessing AI-generated CSAM;
- More than 30,000 reports involved users attempting to generate CSAM by uploading images and using text prompts;
- More than 145,000 reports involved users employing AI tools to alter or manipulate CSAM files without prompts;
- An additional 3,000 reports involved other forms of AI-facilitated exploitation, such as chat-based grooming or abuse;
- More than 133,000 reports indicated a GAI nexus but lacked sufficient information to determine how the technology was used.
Together, these reports underscore the rapidly evolving role of GAI in child sexual exploitation and the growing urgency for coordinated action across technology platforms, law enforcement, policymakers and child safety organizations.
At the same time, NCMEC saw significant changes in child sex trafficking reporting. As a direct result of the REPORT Act, NCMEC is beginning to see a more realistic reflection of the number of children being trafficked online for sex. In 2023, one year before the REPORT Act was enacted, online platforms submitted 8,480 CyberTipline reports relating to child sex trafficking. In 2025, the first full year after implementation, online platforms submitted 105,877 reports – a more than 1,100% increase that underscores the impact of expanded reporting requirements.
In addition to the threats posed by emerging technologies, reports of online enticement, including sextortion, continue to rise. In 2025, NCMEC received 1.4 million reports of online enticement, a 156% increase from 2024.
Over the last 42 years, NCMEC’s mission to bring home missing children has never wavered. In 2025, NCMEC assisted law enforcement, families and child welfare agencies with 32,167 reports of missing children. Of the children reported missing, 1 in 7 was likely a victim of child sex trafficking. The overall recovery rate for children reported missing to NCMEC was 90%.
Beyond casework, NCMEC continued to provide critical support to children, families and professionals across the country. In 2025, NCMEC’s call center responded to more than 138,000 calls for help from the public, families and law enforcement. NCMEC also delivered nearly 42,000 trainings to child-serving professionals, equipping communities with the tools to better protect children.
This snapshot represents just a portion of the data that will be included in NCMEC’s full 2025 Impact Report. In the months ahead, we will share a deeper look at the numbers, the trends shaping child protection, and the stories behind the work.