People have been sharing screenshots of a new message they’ve received from Facebook when they, or someone they know, has viewed content that Facebook deems extremist. Once a user receives the notification, they can dismiss it or tap the Get Support button, which presumably links to national anti-extremism organisations, although this is unclear as Neowin has not yet seen the notification first-hand.
um what? pic.twitter.com/H9o7HV2XZF
— Jenna Ellis (@JennaEllisEsq) July 1, 2021
Commenting on the development, a Facebook spokesperson emailed the following message to Reuters:
“This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk. We are partnering with NGOs and academic experts in this space and hope to have more to share in the future.”
Facebook said that the attempt to intervene in radicalisation is part of its commitment to the Christchurch Call to Action, a campaign formed in response to the mass shooting at the Al Noor Mosque by the fascist Brenton Harrison Tarrant. While the warnings will certainly be shown to combat far-right ideologies such as Tarrant’s, it’s unclear exactly what other types of politics Facebook considers extremist.
While Facebook does proactively remove rule-breaking content from its platform, it cautioned that it cannot always delete such content before people have seen it. With these warnings, it can alert people that they’ve viewed potentially extremist content, in an attempt to stop them from being drawn into those ideologies.
Reuters reports that the feature is currently being trialled as a small test in the United States. Once Facebook has ironed out any issues, it should roll out globally.