A couple of days ago, Apple announced that it will be rolling out new technology that allows it to scan photos uploaded to iCloud by using on-device machine learning to compare their hashes against known images of child sexual abuse material (CSAM) from the National Center for Missing and Exploited Children's (NCMEC) repository. It also stated that it would notify parents if a child under 13 years of age receives or sends sexually explicit photos.
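Apple has not published implementation code, but the basic idea of on-device hash matching can be illustrated with a simplified sketch. Everything below is a hypothetical stand-in: the hash set, the placeholder value, and the function names are invented for illustration, and a plain cryptographic hash is used where Apple's system reportedly relies on a perceptual hash and cryptographic matching on the server side.

```python
import hashlib

# Hypothetical set of fingerprints for known CSAM images (placeholder value only).
# Apple's actual system reportedly uses a perceptual hash rather than the plain
# SHA-256 equality check sketched here.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an image; a stand-in for a perceptual hash."""
    return hashlib.sha256(image_bytes).hexdigest()

def flag_before_upload(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint matches a known entry."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS
```

The point of the sketch is only to show the matching step happening on the device itself, before upload, which is what the open letter's signatories object to.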
The move has drawn a lot of attention and criticism, with an open letter protesting the matter gathering over 5,000 signatories at the time of this writing.
The open letter in question can be seen here. It is addressed directly to Apple and says that while the company's moves are well-intentioned because child exploitation is a serious issue, they create a backdoor in the ecosystem that undermines customers' fundamental privacy rights. The document further says that since the methodologies use on-device machine learning, they have the potential to break end-to-end encryption.
It also quotes several organizations and security experts to emphasize that the tech is prone to misuse and undermines privacy. An excerpt reads:
Immediately after Apple"s announcement, experts around the world sounded the alarm on how Apple"s proposed measures could turn every iPhone into a device that is continuously scanning all photos and messages that pass through it in order to report any objectionable content to law enforcement, setting a precedent where our personal devices become a radical new tool for invasive surveillance, with little oversight to prevent eventual abuse and unreasonable expansion of the scope of surveillance.
[...] The type of technology that Apple is proposing for its child protection measures depends on an expandable infrastructure that can't be monitored or technically limited. Experts have repeatedly warned that the problem isn't just privacy, but also the lack of accountability, technical barriers to expansion, and lack of analysis or even acknowledgement of the potential for errors and false positives.
In light of the above, the letter demands that Apple immediately halt its deployment of the tech and issue a statement reaffirming its commitment to user privacy. It cautions that the rollout would undermine all the work that has been done so far towards user privacy.
The open letter has currently been signed by 5,544 individuals and 31 organizations, including IVPN, Gigahost, the Freedom of the Press Foundation, and more. You can also become a signatory via GitHub here. For its part, Apple has acknowledged that people are worried about its new tech, but says this stems from misunderstandings that it will clarify over time.