X has said it will hire 100 full-time employees to staff a new trust and safety office in Austin, Texas, replacing the team that was cut when Elon Musk took over.
TikTok is reportedly under investigation by the U.S. government for failing to moderate child sexual abuse material (CSAM) on its platform and for allegedly serving as a hunting ground for predators.
Apple has postponed the rollout of a child safety feature that scanned hashes of iCloud Photos uploads to determine whether users are storing child sexual abuse material (CSAM).
Reddit user u/AsuharietYgvar is confident they have uncovered Apple's NeuralHash algorithm, the technology behind its CSAM detection, deep in iOS code. A GitHub repo contains their findings.
Apple has addressed privacy concerns about its CSAM scanning by clarifying that the new feature will only flag accounts with at least 30 iCloud Photos uploads matching known child sexual abuse material.
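As an illustrative sketch only: Apple's actual NeuralHash is a proprietary neural network, and the hash function, match tolerance, and database below are hypothetical stand-ins. Conceptually, threshold-based matching of perceptual image hashes against a database of known-CSAM hashes works along these lines:

```python
# Hypothetical sketch: not Apple's real NeuralHash or matching pipeline.
from typing import Iterable, Set

MATCH_THRESHOLD = 30  # Apple's stated minimum number of matches before review

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fixed-length hashes."""
    return bin(a ^ b).count("1")

def is_match(photo_hash: int, known_hashes: Set[int], max_bits: int = 4) -> bool:
    # Perceptual hashes of near-identical images differ by only a few bits,
    # so a small Hamming-distance tolerance is used instead of exact equality.
    return any(hamming_distance(photo_hash, h) <= max_bits for h in known_hashes)

def account_flagged(photo_hashes: Iterable[int], known_hashes: Set[int]) -> bool:
    # An account is flagged for human review only once the count of matching
    # photos reaches the threshold, never on a single match.
    matches = sum(1 for p in photo_hashes if is_match(p, known_hashes))
    return matches >= MATCH_THRESHOLD
```

The threshold is the key privacy argument in Apple's clarification: isolated hash collisions on individual photos cannot by themselves trigger a review.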
Apple has provided more details about its child safety photo scanning technologies that have been drawing some fire from critics. It has also described the end-to-end flow of its review process.
An open letter demanding that Apple halt the rollout of its photo scanning tech and issue a statement to reaffirm its commitment to privacy now has signatures from over 5,000 individuals and firms.
In an internal memo, Apple's Software VP has acknowledged that people are worried about the company scanning iCloud Photos for child sex abuse material, but says that this is due to misunderstandings.
Apple has introduced new safety features for iPhone, iPad, and Mac to safeguard children from predators. It is working alongside the National Center for Missing and Exploited Children (NCMEC).
Google has announced a new AI tool as part of its Content Safety API. The tool is intended to help organisations find and remove child sexual abuse material on their platforms.