Fears Arise Over Apple's New Program to Hunt for Child Sexual Abuse Material
Motherboard maker Gigabyte hit by RansomEXX ransomware gang, U of Kentucky discovered a breach during pentest, Pulse Secure issues patch for critical RCE flaw, Farm equipment plagued by flaws, more
Check out my latest CSO column on how Apple’s new policy of scanning users’ photos to hunt for child sexual exploitation material revives fears of backdoor access.
A raging controversy erupted on Friday and continued throughout the weekend over Apple’s decision to scan users’ images for Child Sexual Abuse Material (CSAM). Under the new program, Apple will actively hunt for CSAM in users’ photos uploaded to iCloud using a novel encryption technique that the company says protects user privacy.
However, top technologists, cybersecurity experts, and privacy advocates contend that the new program provides an inroad for governments to gain access to all kinds of user content, effectively creating the environment that advocates for encryption back doors have long sought.
Top executives at WhatsApp and Epic Games accused Apple of building a system to scan personal data and report it to the government. Over 4,000 technologists, security and privacy experts, and…