Big Tech companies still taking "voluntary action" to detect child sexual abuse material – despite no EU legal basis for proactive scanning ...
As digital rules on detecting child abuse material expire and lawmakers clash over new powers, tech companies keep monitoring ...
Experts warn lapse could sharply reduce reports of abuse, echoing a 58% drop during a similar legal gap in 2021 ...
The European Parliament has voted against allowing tech platforms to scan for online child abuse, sending shock waves through the industry.
An international law enforcement action called Operation Alice has shut down over 373,000 dark web sites that offered fake CSAM packages. The investigation, led by Germany and supported by Europol, ...
As Testut and Shane explain: As you may have heard, over the last few weeks X and Grok have made it possible for child sexual abuse material (CSAM) to be generated and widely distributed on their apps ...
In a new privacy versus illegal content battle, West Virginia is suing to force Apple to crack down on child sexual abuse material (CSAM) stored on iCloud, which offers end-to-end encryption. Apple ...
A tip from an anonymous Discord user led cops to find what may be the first confirmed Grok-generated child sexual abuse materials (CSAM) that Elon Musk’s xAI can’t easily dismiss as nonexistent. As ...
For weeks, xAI has faced backlash over undressing and sexualizing images of women and children generated by Grok. One researcher conducted a 24-hour analysis of the Grok account on X and estimated ...
ABC4 Utah on MSN
Man charged for allegedly running CSAM file-sharing program out of Syracuse residence
A man living in Syracuse is accused of running a file-sharing program that allowed online users to download child sexual abuse material (CSAM) files from him thousands of times.