Apple is said to be scanning your iPhone for child abuse images

According to credible reports, Apple plans to begin installing software on iPhones in the US that will automatically search local photos for images of child abuse. Apple will reportedly use this technology only on photos uploaded to iCloud – at least initially.

Update: Apple has since confirmed much of this reporting in a detailed document.

What is Apple doing?

The reports on Apple’s plans come from the Financial Times and Johns Hopkins University professor Matthew Green, both generally reliable sources. Of course, until Apple confirms it, there is always the possibility that the plan will not go ahead. Apple reportedly presented the plan to some US academics earlier this week.

According to the reports, Apple will use a system called “neuralMatch” to scan American iPhones for images of child abuse.

In essence, an automated system would alert a team of human reviewers if it believes illegal images have been detected. A member of the team would then review the images and contact law enforcement.

Technically, this is nothing new – cloud-based photo storage services and social networks already perform this type of scanning. The difference is that Apple would do it at the device level. According to Matthew Green, the system would initially only scan photos being uploaded to iCloud, but it would perform that scan on the user’s phone. “Initially” is the key word there: the same mechanism could eventually be used to scan all photos stored locally.


This is supposed to make the system less invasive: the scanning happens on the phone, and results are only sent back if a match is found, which means that not every photo you upload is exposed to the eyes of strangers.

According to those briefed on the plan, every photo uploaded to iCloud receives a “security voucher” that states whether or not it is suspicious. Once a certain number of images are flagged as suspicious, Apple will decrypt the flagged photos and, if they are found to depict child abuse, report them to the authorities.
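
Apple has not published the cryptographic details behind these vouchers, but the reported flow (flag each upload individually, and only escalate once a certain number of matches has accumulated) can be sketched in a few lines. The Python below is only a rough illustration of that idea; the class names, fields, and threshold value are hypothetical, not Apple’s code.

```python
# Rough sketch of the reported threshold design, not Apple's implementation.
# The class names, fields, and threshold value are all hypothetical.

FLAG_THRESHOLD = 10  # hypothetical number of suspicious matches before escalation


class UploadedPhoto:
    def __init__(self, photo_id: str, suspicious: bool):
        self.photo_id = photo_id
        self.suspicious = suspicious  # result of the on-device match, per the reports


class Account:
    def __init__(self):
        self.vouchers = []  # one "security voucher" per uploaded photo

    def record_upload(self, photo: UploadedPhoto) -> None:
        self.vouchers.append(photo)

    def needs_human_review(self) -> bool:
        # Nothing is escalated until enough uploads have been flagged.
        flagged = sum(1 for v in self.vouchers if v.suspicious)
        return flagged >= FLAG_THRESHOLD
```

The point of the threshold, in this reported design, is that no individual upload is acted on until an account as a whole crosses the limit.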

How will the system differentiate between child abuse images and other images? According to the report, it was tested on 200,000 sexual abuse images collected by the U.S. nonprofit National Center for Missing and Exploited Children.

Images are hashed into a sequence of numbers and then compared against the hashes of the images in the database.
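
Apple has not said exactly how these hashes are computed; reports suggest a perceptual, neural-network-based hash rather than a plain cryptographic one. The sketch below only illustrates the general hash-and-compare step, using an ordinary SHA-256 digest and an empty placeholder database.

```python
import hashlib

# Placeholder for the database of hashes of known abuse imagery. In reality
# this would be supplied by NCMEC and built from perceptual hashes, not SHA-256.
KNOWN_HASHES = set()


def hash_image(path: str) -> str:
    """Reduce an image file to a fixed-length digest (its "sequence of numbers")."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def matches_known_image(path: str) -> bool:
    """Compare the image's hash against the database of known hashes."""
    return hash_image(path) in KNOWN_HASHES
```

The practical difference is that a perceptual hash assigns similar values to visually similar images, so resized or slightly altered copies of a known image can still match; a cryptographic hash like the one above only matches byte-for-byte identical files.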

Apple declined to comment on the reports when asked by the Financial Times. However, we assume the company is working on an official statement before news of the move gets out of hand.

This sets a dangerous precedent

We probably don’t need to tell you how scary this can be. People who abuse children should be caught and punished, but it is easy to see how such a thing could be used for far more invasive purposes.


Will similar technology be rolled out on other platforms such as Macs, Windows PCs, and Android phones? Could countries like China use it to detect subversive images on their citizens’ phones? If it becomes widely accepted, could the copyright industry use it to search for pirated content in a few years?

And even if it works as advertised: Will innocent people get caught in the crossfire?

Hopefully this isn’t as worrying as it looks.
