Apple has decided to postpone the rollout of a child safety feature that would scan hashes of iCloud Photos uploads to determine whether users are storing child sexual abuse material (CSAM).