Apple Update: More Helpful or Harmful?

On August 5th, Apple announced a new feature for the upcoming iOS 15, iPadOS 15, watchOS 8 and macOS Monterey software updates, designed to detect images or videos of child exploitation stored on a user's device.

To prevent false positives and to keep the abusive images themselves hidden, Apple took a complex approach. According to Apple, the new software reduces each photo to a unique set of numbers (a sort of image fingerprint called a hash) and then compares those fingerprints against hashes of known images of child abuse provided by groups like the National Center for Missing and Exploited Children.
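In rough outline, the on-device check amounts to fingerprinting each photo and testing that fingerprint against a database of known hashes. The Swift sketch below is illustrative only: Apple's actual system uses a perceptual hash called NeuralHash and a private set intersection protocol, so the plain SHA-256 digest, the in-memory set, and the placeholder hash values here are stand-ins, not Apple's implementation.

```swift
import Foundation
import CryptoKit

// Illustrative only: Apple's real system derives a perceptual "NeuralHash"
// on-device; an ordinary SHA-256 digest stands in for that fingerprint here.
func fingerprint(of photoData: Data) -> String {
    SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Hypothetical set of fingerprints for known abuse imagery
// (placeholder values, not real data from NCMEC or anyone else).
let knownHashes: Set<String> = [
    "placeholder-hash-1",
    "placeholder-hash-2",
]

// A photo "matches" if its fingerprint appears in the known set.
func matchesKnownImage(_ photoData: Data) -> Bool {
    knownHashes.contains(fingerprint(of: photoData))
}
```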

If 30 or more of a user’s photos appear to match the abuse images, an Apple employee reviews the matches. If any of the photos show child abuse, Apple sends them to the authorities and locks the user’s account. Apple said it would turn on the feature in the United States over the next several months.
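Continuing the sketch above, the reported 30-match threshold can be expressed as a simple count of matching photos. This is a simplification: Apple's published design uses threshold secret sharing so that nothing is revealed to the company until the threshold is crossed, and the constant and helper below are assumptions made for illustration.

```swift
// Flag an account for human review only once at least 30 photos match
// the known-hash set, mirroring the threshold described in the article.
let reviewThreshold = 30

func needsHumanReview(_ photos: [Data]) -> Bool {
    photos.filter(matchesKnownImage).count >= reviewThreshold
}
```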

The New York Times has reported that, for many technologists, Apple has opened a Pandora's box. The tool would be the first technology built into a phone's operating system that can look at a person's private data and report it to law enforcement authorities. Privacy groups and security experts worry that governments looking for criminals, opponents or other targets could find plenty of ways to abuse such a system.

Apple maintains that its system is built with privacy in mind, with safeguards designed to keep the company from learning the contents of users' photo libraries and to minimize the risk of misuse.

Looking for more privacy news? Read more at our blog!