The National Center for Missing & Exploited Children said it received more than 1 million reports of AI-related child sexual abuse material (CSAM) in 2025. The "vast majority" of that content was ...
AltStore developers announced CSAM Store Checker, an app that tracks app marketplaces where child sexual abuse material is ...
For many years, credit card companies and other payment providers were aggressive about policing child sexual abuse material.
State Attorneys General (AGs) nationwide are increasing enforcement activity focused on online child sexual abuse material (CSAM) and the use ...
Apple has had quite the rollercoaster ride over plans to scan devices for the presence of child sexual abuse materials (CSAM). After announcing and then withdrawing its own plans for CSAM scanning, it ...
Grok, X’s AI chatbot, recently produced child sexual abuse material (CSAM) on demand for users, posting an apology note on New Year’s Day that read “I deeply regret an incident on Dec 28, 2025, where I ...
Irish culture minister Patrick O'Donovan says that X is not responsible for the child sexual abuse material it generates on demand, stores on its servers, and sends to users. The viewer is ...
It seems that instead of updating Grok to prevent it from outputting sexualized images of minors, X is planning to purge users who generate content the platform deems illegal, including Grok-generated ...