A new lawsuit filed in Tennessee is raising urgent questions about what happens when powerful generative AI tools are accused ...
A tip from an anonymous Discord user led cops to find what may be the first confirmed Grok-generated child sexual abuse materials (CSAM) that Elon Musk’s xAI can’t easily dismiss as nonexistent. As ...
Children’s rights groups and tech companies are fuming after EU legislators could not agree to extend rules that allow companies to continue scanning the internet for child sexual abuse material ...
For years, hashing technology has made it possible for platforms to automatically detect known child sexual abuse materials (CSAM) to stop kids from being retraumatized online. However, rapidly ...
Thousands of CSAM (child sexual abuse materials) victims are now taking the fight to Apple after the company ultimately decided to skip adding tools that would help detect it on their ...