The amount of so-called "deepfake" images and content, especially material of an explicit nature, continues to grow on the internet. Today, Google announced new ways to keep these harmful deepfake photos and other content from showing up in its search results.
In a blog post, Google announced improvements to how people can report explicit deepfake images they find in Search. After a successful removal request, Google will not only remove the reported image but also scan for and remove any duplicates of it that appear in its search results. The company says it will additionally remove explicit search results that are similar to the original reported image.
Google is also rolling out ranking updates for its search results aimed at cutting down on this kind of content. It states:
For queries that are specifically seeking this content and include people’s names, we'll aim to surface high-quality, non-explicit content — like relevant news articles — when it’s available. The updates we’ve made this year have reduced exposure to explicit image results on these types of queries by over 70%. With these changes, people can read about the impact deepfakes are having on society, rather than see pages with actual non-consensual fake images.
Google also says it is working on improvements that will surface real, consensual explicit content, such as an actor's nude scenes from a film, rather than fake explicit images created to depict that same actor.
Finally, Google will now demote sites that host a high number of explicit deepfake images in its search results. The company says that even with these changes, it will keep working to improve its ranking systems so that more action is taken against this content, and it will also partner with other companies to combat explicit deepfakes beyond search results.