The European Commission is putting pressure on major tech platforms to be more transparent about how their algorithms recommend content to users. Today, the Commission sent information requests under its new Digital Services Act (DSA) to YouTube, Snapchat, and TikTok.
The Commission wants details on the inner workings of the recommender systems these companies employ and on how they mitigate the risks associated with them. Those risks include the spread of illegal content, such as drug sales and hate speech, as well as potential harms to elections, civic discourse, and the well-being of minors.
Both YouTube and Snapchat must provide information on the standards powering their recommendation algorithms, as well as the steps they have taken to curb content "rabbit holes" that may harm users' mental health. TikTok has been asked to supply its policies for preventing coordinated inauthentic behavior from influencing elections or civic debate.
TikTok must also explain the measures it has adopted to keep malicious actors from manipulating the service and to mitigate risks to elections, media pluralism, and civic discourse that certain recommender systems may amplify.
All three companies have until November 15 to reply to the Commission's queries. The responses will help regulators determine whether the firms are compliant with DSA rules on transparency around the algorithmic amplification of risks. Under the new law, non-cooperation or incomplete answers could result in fines.
The DSA also imposes far-reaching transparency requirements on digital platforms with more than 45 million monthly active users. A senior EU official says the Commission's requests send an urgent message to the platforms to change their practices involving recommender systems.
These demands are part of the Commission's ongoing scrutiny of platform recommender systems, which has been upped a notch since the DSA came into force. Formal non-compliance proceedings against Facebook, Instagram, AliExpress, and TikTok are also underway over their failure to provide "devised recommender guidelines" and to mitigate the risks those guidelines are meant to address.