Social media platforms such as Facebook and Twitter have been moderating information related to COVID-19 and elections for quite some time, and other sites like YouTube are doing the same. However, pressure from the White House to "do more" has recently increased after U.S. President Joe Biden claimed that COVID-19 misinformation on social media is "killing people". To curb the spread of misinformation on topics such as COVID-19, vaccines, and elections, Twitter is now testing a process that lets users report tweets containing inaccurate information.
As reported by TechCrunch, Twitter users can now use the existing drop-down menu in the top-right corner of a tweet to report it for spreading misleading information. Misinformation in general can be reported, but there are also explicit options for "politics" and "health": the former covers election-related content while the latter focuses on COVID-19.
It is important to note that the feature is not generally available yet and is still in its testing phase. For now, it is rolling out only in the United States, Australia, and South Korea, where most users will see it. The company expects to continue the experiment for a few months before deciding whether to roll the feature out to other countries.
Twitter has indicated that not every flagged tweet will be reviewed individually, since reports feed into the existing moderation system, which combines AI and human reviewers. Flagged tweets will also be assigned a priority: tweets from accounts with large followings will be reviewed sooner, and misinformation related to COVID-19 and elections will take precedence over other topics.
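To make the described triage concrete, here is a minimal sketch of how such prioritization could work. Twitter has not published its actual ranking logic; the topic names, follower threshold, and scoring weights below are purely illustrative assumptions based on the factors mentioned in the article.

```python
from dataclasses import dataclass

@dataclass
class FlaggedTweet:
    tweet_id: str
    author_followers: int
    topic: str  # e.g. "covid-19", "elections", "other"

# Assumptions for illustration only; Twitter's real criteria are not public.
HIGH_PRIORITY_TOPICS = {"covid-19", "elections"}  # topics said to take precedence
LARGE_FOLLOWING_THRESHOLD = 100_000               # assumed cut-off for a "large following"

def review_priority(report: FlaggedTweet) -> int:
    """Return a priority score; higher scores get reviewed sooner."""
    score = 0
    if report.topic in HIGH_PRIORITY_TOPICS:
        score += 2
    if report.author_followers >= LARGE_FOLLOWING_THRESHOLD:
        score += 1
    return score

# Example: order a queue of reports so higher-priority ones are handled first.
queue = [
    FlaggedTweet("1", author_followers=250_000, topic="covid-19"),
    FlaggedTweet("2", author_followers=1_200, topic="other"),
    FlaggedTweet("3", author_followers=5_000, topic="elections"),
]
queue.sort(key=review_priority, reverse=True)
print([t.tweet_id for t in queue])  # -> ['1', '3', '2']
```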
Via: TechCrunch