Apple has decided to postpone the rollout of a child safety feature that would scan hashes of iCloud Photos uploads to determine whether users are storing child sexual abuse material (CSAM).
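To make the hash-matching idea concrete, here is a minimal sketch of comparing an image's perceptual hash against a database of known hashes. This is only an illustration of the concept: Apple's actual design matches NeuralHashes on-device against a blinded database using private set intersection with a reporting threshold, and the hash values and function names below are hypothetical.

```python
# Illustrative only: compares a 96-bit perceptual hash (as hex) against a set of
# known hashes. Not Apple's implementation; values below are made-up placeholders.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hash values."""
    return bin(a ^ b).count("1")

def matches_known_hashes(image_hash_hex: str, known_hashes_hex: set,
                         max_distance: int = 0) -> bool:
    """True if the image hash is within max_distance bits of any known hash."""
    image_hash = int(image_hash_hex, 16)
    return any(hamming_distance(image_hash, int(h, 16)) <= max_distance
               for h in known_hashes_hex)

# Hypothetical database of known hashes.
known = {"0123456789abcdef01234567", "fedcba9876543210fedcba98"}
print(matches_known_hashes("0123456789abcdef01234567", known))  # True (exact match)
```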
NeuralHash
Reddit user u/AsuharietYgvar is confident that they have uncovered Apple's NeuralHash algorithm, the perceptual hashing model behind the CSAM detection feature, buried deep in iOS' code, and has published their findings in a GitHub repo.
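Based on the steps described in that repo's write-up, the extracted model can reportedly be exported to ONNX and run to produce a 96-bit hash. The sketch below follows those described steps; the file names, preprocessing details, and seed-matrix format are assumptions taken from that write-up, not an official Apple API.

```python
# Rough sketch of computing a NeuralHash with the reportedly extracted model.
# Assumes an ONNX export of the network plus the 128x96 seed matrix file that the
# repo describes; details here are unofficial and may not match Apple's pipeline.
import numpy as np
import onnxruntime
from PIL import Image

def neural_hash(image_path: str, model_path: str = "model.onnx",
                seed_path: str = "neuralhash_128x96_seed1.dat") -> str:
    # Load the 96x128 projection matrix (the .dat reportedly has a 128-byte header).
    seed = np.frombuffer(open(seed_path, "rb").read()[128:], dtype=np.float32)
    seed = seed.reshape(96, 128)

    # Resize to 360x360 RGB, scale pixel values to [-1, 1], NCHW layout.
    img = Image.open(image_path).convert("RGB").resize((360, 360))
    arr = np.asarray(img).astype(np.float32) / 255.0 * 2.0 - 1.0
    arr = arr.transpose(2, 0, 1)[np.newaxis, ...]

    # Run the network, project its 128-dim embedding, and binarise to 96 bits.
    session = onnxruntime.InferenceSession(model_path)
    input_name = session.get_inputs()[0].name
    embedding = session.run(None, {input_name: arr})[0].flatten()
    bits = "".join("1" if v >= 0 else "0" for v in seed @ embedding)
    return f"{int(bits, 2):024x}"  # 96 bits -> 24 hex characters

# Example usage (requires the extracted model and seed files):
# print(neural_hash("photo.jpg"))
```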