YouTube has announced that it is shaking up its video recommendations to reduce the amount of recommended content that borders on violating its Community Guidelines without quite crossing the line. Once the change takes effect, users will begin seeing fewer recommendations for content that could misinform them in potentially harmful ways.
In its announcement, YouTube discussed which content users would see less of, saying:
“[Content] such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”
The firm said that the change only affects recommendations; if a video doesn't violate the Community Guidelines, it will still appear in search results and continue to be recommended to subscribers of its channel. YouTube said it hopes the change strikes a balance between preserving free speech and living up to its responsibility to users. It comes after Google reaffirmed its stance against potentially harmful challenge videos.
Google will begin filtering these recommendations in the United States, using machine learning alongside human evaluators who judge video quality against set guidelines. As the system becomes more accurate, the firm plans to roll the change out to more countries. The changes won't be noticeable immediately; instead, they will take hold gradually as the machine learning systems improve.