Apple will reportedly deploy a new technology that scans the photos in your iPhone's photo library to identify whether you are storing child sexual abuse material. Apple will use hash algorithms to check the photos stored on users' iPhones, and photo-identification software on the backend will recognise whether an image looks like child pornography or some other kind of abusive content.
Apple will reportedly roll out a “client-side tool for CSAM scanning”. This means your iPhone will automatically download these hash algorithms, which will check every photo saved on the device for illegal content. If the algorithm spots objectionable content, and the number of such photos crosses a threshold, the iPhone will automatically report it to Apple's servers.
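The matching flow described above can be sketched in a few lines. This is a much-simplified illustration, not Apple's implementation: Apple is reported to use a perceptual “NeuralHash”, whereas the sketch below uses SHA-256 purely to show the on-device compare-and-threshold logic. The fingerprint database and threshold value here are hypothetical placeholders.

```python
import hashlib

def fingerprint(photo_bytes: bytes) -> str:
    # Simplified stand-in: a cryptographic hash. The reported real system
    # uses a perceptual hash that survives resizing and re-encoding;
    # SHA-256 here only illustrates the matching flow.
    return hashlib.sha256(photo_bytes).hexdigest()

# Hypothetical fingerprint database -- in the reported design, these opaque
# hashes are supplied by child-safety organisations, not seen by the user.
KNOWN_BAD_HASHES = {fingerprint(b"example-flagged-image")}

# Hypothetical threshold: report only after multiple matches accumulate.
REPORT_THRESHOLD = 3

def scan_library(photos: list[bytes]) -> bool:
    """Return True once the library crosses the reporting threshold."""
    matches = sum(1 for p in photos if fingerprint(p) in KNOWN_BAD_HASHES)
    return matches >= REPORT_THRESHOLD
```

The threshold is the key design point: a single match never triggers a report, which is meant to reduce the impact of one-off false positives.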
I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning… https://t.co/Ldz40X4COw
— Matthew Green (@matthew_d_green) August 5, 2021
The problem is that hashing algorithms are not always accurate and may produce false positives. Also, while Apple claims it will detect only illegal child abuse content, it is possible that other types of content get flagged as well. “If Apple allows governments to control the fingerprint content database, then perhaps they could use the system to suppress political activism,” as per a report by 9to5Mac.
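Why can such a system produce false positives at all? Perceptual hashes deliberately map similar-looking images to the same fingerprint, so unrelated images can collide. The toy “average hash” below (a simplification of real perceptual hashing, on made-up 4-pixel images) shows two different inputs producing identical hashes:

```python
def average_hash(pixels: list[int]) -> int:
    # Toy perceptual hash: one bit per pixel, set when that pixel is
    # brighter than the image's mean. Real perceptual hashes operate on
    # resized, grayscaled images but share the same failure mode.
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

img_a = [10, 200, 30, 220]   # two distinct "images" ...
img_b = [90, 180, 80, 250]   # ... with the same bright/dark pattern
assert average_hash(img_a) == average_hash(img_b)  # a false positive
```

Because only the coarse bright/dark layout is fingerprinted, different pixel values that share that layout collide, which is exactly the accuracy concern raised above.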
Even though Apple stores content on iCloud in encrypted form, the problem is that Apple also holds the keys to decrypt it. This means that if Apple is compelled by a law enforcement agency, it could allow a government to look through all of a particular user's photos.
“The way Apple is doing this launch, they’re going to start with non-E2E photos that people have already shared with the cloud. So it doesn’t “hurt” anyone’s privacy. But you have to ask why anyone would develop a system like this if scanning E2E photos wasn’t the goal,” said cryptography and security expert Matthew Green.
Governments across the world have been asking for technologies to break into E2E communications, as law enforcement agencies have a tough time decrypting them. The only hope here is that Apple will not let its systems be misused. “But even if you believe Apple won’t allow these tools to be misused there’s still a lot to be concerned about. These systems rely on a database of “problematic media hashes” that you, as a consumer, can’t review,” added Green.