2024's been huge for AI, with OpenAI's ChatGPT kicking off plenty of conversations about AI’s place in our lives. But not all creators are on board, especially with big tech using their content to train AI models without asking. Some are even using tools like Nightshade to stop that from happening. To address these concerns, YouTube’s giving creators more control over how third-party companies can use their content for AI training. Rob from TeamYouTube announced:
Over the next few days, we'll be rolling out an update where creators and rights holders can choose to allow third-party companies to use their content to train AI models directly in Studio Settings under 'Third-party training.'
By enabling this feature, creators grant permission for their videos to be used by companies like xAI, Apple, Amazon, Anthropic, Meta, Microsoft, Nvidia, OpenAI, and others to train their AI models. However, not all videos are eligible. To qualify, the following conditions must be met:
- The rights holders of the video permit third-party training.
- The video's privacy setting is public.
- The video complies with YouTube’s Terms of Service and Community Guidelines.
Unease about big tech using people's content to train AI isn't limited to YouTube. Bluesky users, for example, were upset when a machine learning researcher released a dataset containing one million Bluesky posts. Many of them had joined Bluesky to get away from platforms like X (formerly Twitter), where Elon Musk's xAI trains its Grok model on user posts. They thought they'd found a safer space, but the incident showed that even on Bluesky, their content could be scraped and used without their consent.
Recently, former OpenAI researcher Suchir Balaji, who had publicly raised concerns about the company's AI training practices, tragically passed away. Before his death, Balaji had been named in a major copyright lawsuit against OpenAI and Microsoft over the alleged misuse of copyrighted articles to train ChatGPT models. That lawsuit could have major implications for AI companies, potentially limiting the data available for training and exposing them to significant financial penalties.
In the UK, nearly 40 creative groups, including publishers, authors, and photographers, are urging the government to enforce copyright protections as it opens a consultation on AI and the creative industries. The Creative Rights in AI Coalition is calling for a licensing market so that creative content can only be used in generative AI with permission, ensuring creators keep control over their work and are compensated for its use.
In August 2024, U.S. artists scored a win in a landmark AI copyright case when a district judge allowed their claims against Stability AI, Midjourney, DeviantArt, and Runway AI to move forward, finding the artists had plausibly alleged that their works were used without permission to train AI models.