Apple has confirmed that it will roll out a new technology, NeuralHash, that will scan photos bound for iCloud Photos for images of child abuse. According to the tech giant, the technology will allow the company to spot and report known child sexual abuse material (CSAM) to law enforcement in a way it claims will protect user privacy while scanning for abusive material.
In an article by TechCrunch, Apple stated that CSAM detection is one of several new tools and features it is rolling out with the aim of protecting children who use its services from harm. These features include filters to block sexually explicit photos sent and received through the iMessage app, especially on accounts registered to children. Another protective tool is a feature that will intervene whenever a user attempts to search for CSAM-related content or terms through Search and Siri.
The news of Apple’s new technology first came to light through a series of tweets by Matthew Green, a professor of cryptography at Johns Hopkins University in Baltimore, Maryland.
The announcement was met with mixed feelings, especially from privacy advocates and security experts. At the same time, many users, accustomed to Apple’s standards of privacy and security compared with other companies, are enthusiastic about the rollout.
What is NeuralHash?
Simply put, Apple’s new tool for CSAM detection, NeuralHash, works on a user’s device and can identify whether a user is uploading known child abuse imagery to iCloud, without Apple decrypting the images until a threshold is met and a sequence of checks to verify the content is cleared.
It works by converting photos on a user’s iPhone or Mac into a unique string of numbers and letters – known as a hash; hence the name NeuralHash. Before an image can be uploaded to iCloud Photos, those hashes are matched on the device against a database of known hashes of child abuse imagery. The database of hashes is provided by organizations such as the National Center for Missing & Exploited Children (NCMEC), among others. Apple’s tool uses ‘private set intersection’, a cryptographic technique, to detect a hash match without revealing what the image is or alerting the user.
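To make the matching step concrete, here is a minimal, heavily simplified sketch of on-device hash matching. It is not Apple’s algorithm: NeuralHash is a perceptual hash (visually similar images map to the same value) and the real protocol hides the match result using private set intersection, whereas this sketch uses an ordinary SHA-256 digest, a plain set lookup, and a made-up placeholder hash, purely to illustrate the idea of checking an image against a known-hash database before upload.

```python
import hashlib

# Hypothetical stand-in for a database of known CSAM hashes supplied by
# organizations such as NCMEC. In Apple's design these are perceptual
# NeuralHash values; the entry below is a made-up placeholder.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def hash_image(image_bytes: bytes) -> str:
    """Return a hex digest for an image. A real perceptual hash would map
    visually similar images to the same value; SHA-256 does not."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    """On-device check performed before upload: does this image's hash
    appear in the known-hash database?"""
    return hash_image(image_bytes) in KNOWN_HASHES

if __name__ == "__main__":
    sample = b"placeholder image bytes"
    print(matches_known_database(sample))  # False for arbitrary data
```

In Apple’s described system, the outcome of this comparison is not revealed on the device or to Apple for any individual photo; that is the part private set intersection handles, and it is deliberately omitted from this sketch.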
After this process, the results are uploaded to Apple, but Apple cannot read them directly. Apple employs another cryptographic technique called ‘threshold secret sharing’, by which it can only decrypt the contents if a user crosses a specified threshold of known child abuse imagery in their iCloud Photos.
“Apple would not say what that threshold was, but said — for example — that if a secret is split into a thousand pieces and the threshold is ten images of child abuse content, the secret can be reconstructed from any of those ten images.”
via the TechCrunch article written by Zack Whittaker
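To illustrate the threshold mechanism the quote describes, below is a minimal sketch of Shamir-style threshold secret sharing, the general construction behind ‘threshold secret sharing’: a secret is split into many shares, and any number of shares at or above the threshold is enough to reconstruct it. The prime modulus, share counts, and threshold here are illustrative assumptions; Apple has not published its construction in this form.

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime used as the field modulus (illustrative)

def make_shares(secret: int, threshold: int, num_shares: int):
    """Split `secret` into `num_shares` shares; any `threshold` of them
    are enough to reconstruct it (Shamir's scheme over a prime field)."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, num_shares + 1)]

def reconstruct(shares):
    """Recover the secret from at least `threshold` shares via Lagrange
    interpolation at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# Example: a secret split into 1,000 shares, any 10 of which reconstruct it
secret = 123456789
shares = make_shares(secret, threshold=10, num_shares=1000)
assert reconstruct(random.sample(shares, 10)) == secret
```

With fewer than the threshold number of shares, the secret cannot be recovered, which is why Apple can only decrypt the flagged material once an account crosses the threshold of matches.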
Once that threshold is crossed, Apple can manually decrypt the matching images to verify the contents. If the contents do relate to CSAM, the user’s account will be disabled and the issue will be reported to NCMEC, which then passes it on to law enforcement.
Addressing concerns
Even though this is all being done to combat child sexual abuse at a time when people are so dependent on technology, many would understandably feel uneasy handing over part of the surveillance to an algorithm. Many are calling for a public discussion about NeuralHash before the technology rolls out.
Despite the concerns, Apple has been trying its best to calm fears by “baking in privacy” through multiple layers of encryption, designed so that multiple steps must be cleared before a case ever reaches Apple’s final, manual review. Even at that point, Apple says the manual review serves to confirm the correct outcome before any report is made.
Most cloud services, such as Google, Dropbox, and Microsoft, already scan their users’ files for content that violates their terms of service, which includes CSAM. However, Apple has long resisted the idea of scanning user files in the cloud, owing to how seriously it takes data security; it even gives users the option of encrypting their data before it reaches Apple’s iCloud servers. So the question that needs to be asked is: why now?
Recently, there has been a lot of government pressure on tech giants like Apple to weaken or “backdoor” the encryption used to protect users’ data, so that law enforcement can investigate serious crimes. Companies have refused to do this, but their efforts to shut out government access have met with plenty of resistance. Only last year, Reuters reported that Apple dropped a plan to encrypt users’ full phone backups to iCloud after the FBI complained that it would “harm investigations”.
It remains to be seen how effective NeuralHash will be once it is implemented, and whether the concerns of experts and the general public will be addressed in detail.
NeuralHash will be packaged into both iOS 15 and macOS Monterey, which will be released in a few months. Apple has also published a document with the technical details of how the new tool works. The tech giant has said that NeuralHash will officially roll out in the United States first, though it is not clear when the technology will roll out internationally.