Preventing the sharing of child sexual abuse material (CSAM) has become a key priority for companies moderating digital content. Social media platform TikTok now appears to be under investigation in the U.S. over its lack of moderation in this area.
The Financial Times reports that TikTok is currently the subject of a Department of Homeland Security (DHS) investigation probing how the platform handles CSAM. Separately, the Department of Justice (DoJ) is looking into how predators are misusing a specific TikTok privacy feature.
Although TikTok employs 100,000 human moderators globally, the DHS says it remains a platform of choice for predators because of its younger audience. Between 2019 and 2021, the number of TikTok-related child exploitation investigations also increased sevenfold.
In a statement, TikTok said:
TikTok has zero-tolerance for child sexual abuse material. When we find any attempt to post, obtain or distribute [child sexual abuse material], we remove content, ban accounts and devices, immediately report to NCMEC, and engage with law enforcement as necessary.
We are deeply committed to the safety and wellbeing of minors, which is why we build youth safety into our policies, enable privacy and safety settings by default on teen accounts, and limit features by age.
As for the privacy feature under DoJ investigation, the Financial Times says it is likely the use of private accounts to trade CSAM: predators share account passwords with one another, then upload illegal content using the "Only Me" setting, which makes it visible only to those logged into the profile.
Despite these issues, TikTok continues to perform well financially; its projected ad revenue for 2022 exceeds that of Twitter and Snapchat combined. It remains to be seen how the latest investigations will affect its operations.
Source: Financial Times