
OpenAI announces Media Manager; starts testing a deepfake image detector


OpenAI has made a couple of announcements today that are related to how its generative AI tools are used, and how images made with those tools can be detected.

The first announcement reveals that OpenAI is working on a tool called Media Manager. It is designed to let content creators and owners inform OpenAI of their ownership, and to indicate whether or not they want that content included in the training of the company's AI models.

OpenAI stated:

This will require cutting-edge machine learning research to build a first-ever tool of its kind to help us identify copyrighted text, images, audio, and video across multiple sources and reflect creator preferences.

This tool is still in its early stages and won't officially launch until 2025. OpenAI may have been spurred to build it by the many lawsuits it currently faces from news organizations, including The New York Times, which claim the company has been illegally using their content to train its AI models.

Today's second announcement revealed that OpenAI is working on another tool, this one designed to detect whether images were made with its DALL-E AI art generator. OpenAI says:

Our goal is to enable independent research that assesses the classifier's effectiveness, analyzes its real-world application, surfaces relevant considerations for such use, and explores the characteristics of AI-generated content.

OpenAI says that an early version of this currently unnamed tool correctly identified over 98 percent of images made by DALL-E, while falsely flagging non-DALL-E images as DALL-E creations less than 0.5 percent of the time. The tool is currently being privately tested by a number of "research labs and research-oriented journalism nonprofits." There's no word on when it will be generally available.

OpenAI also said in the same blog post that it is joining the Steering Committee of the Coalition for Content Provenance and Authenticity, a group that creates standards for certifying digital content. In addition, OpenAI and its major partner Microsoft are contributing $2 million to what the companies are calling the Societal Resilience Fund, which will help support a number of organizations dedicated to "AI education and understanding."
