OpenAI sent 80 times as many child exploitation incident reports to the National Center for Missing & Exploited Children during the first half of 2025 as it did during a similar time period in 2024, according to a recent update from the company. The NCMEC’s CyberTipline is a Congressionally authorized clearinghouse for reporting child sexual abuse material (CSAM) and other forms of child exploitation.
Companies are required by law to report apparent child exploitation to the CyberTipline. When a company sends a report, NCMEC reviews it and then forwards it to the appropriate law enforcement agency for investigation.
Statistics related to NCMEC reports can be nuanced. Increased reports can reflect changes in a platform’s automated moderation, or in the criteria it uses to decide whether a report is necessary, rather than an actual increase in nefarious activity.
Additionally, the same piece of content can be the subject of multiple reports, and a single report can cover multiple pieces of content. Some platforms, including OpenAI, disclose both the number of reports and the total number of pieces of content they cover, for a more complete picture.
Once again Ars Technica pushing shit misleading headlines, as usual.
(To be clear to anyone reading: you may want to explain that your quotes are from the article)
While it’s true the total number of reports is higher than the count of individual pieces of content, it’s hardly a “shit misleading headline” given that the count of content reported is still 22x higher than in the same period last year (74,559 vs. 3,252 individual pieces of content).
Ars has a pretty good point, and oddly the only reason it’s 80x higher is that in 2024H1 they had roughly a third as many reports as pieces of reported content, while in 2025H1 the ratio of reported content to reports is pretty much 1:1. That disparity is interesting, and I would like to know more. If only someone had written quite a good article about it that goes into much greater depth about the causes.
(they also explicitly list the numbers I cite here, in context with the report numbers - so doing a better job than I am. I encourage people to read the article on this one, it’s actually quite good)
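The back-of-envelope arithmetic behind the 80x-versus-22x gap can be sketched as below. Note the 2024H1 report count isn’t stated in this thread; it’s inferred here from the 80x multiple and the comment’s assumption that 2025H1 reports-to-content is roughly 1:1, so treat those figures as estimates, not numbers from the article.

```python
# Figures cited in the thread.
content_2024 = 3_252    # individual pieces of content, 2024H1
content_2025 = 74_559   # individual pieces of content, 2025H1
report_multiple = 80    # reports grew ~80x year over year

content_multiple = content_2025 / content_2024
print(f"content grew {content_multiple:.1f}x")  # ~22.9x

# Assumption: 2025H1 reports roughly equal content (1:1, per the comment).
reports_2025 = content_2025
reports_2024 = reports_2025 / report_multiple   # inferred: ~932 reports

# Under these assumptions, each 2024H1 report covered ~3.5 pieces of
# content on average, which is why reports grew faster (80x) than content (22x).
print(f"2024H1 content per report: {content_2024 / reports_2024:.1f}")
```

In other words, the two multiples diverge only because of how many pieces of content each report bundled in each period, not because of any contradiction in the counts.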
I’m unclear how this is a misleading headline. There were more reports; that’s all it says. The headline doesn’t give any indication or even hint of why, just that the volume is greater.
My guess before reading anything was that it was getting better at identifying and reporting cases, and/or that, as use of their tools has increased, there is more to report - not that suddenly there was more CSAM occurring. Did you assume something different?
Either way, I’m glad to hear the reports are increasing. Makes it more likely the culprits will be identified.