The non-profit research group the Center for AI and Digital Policy (CAIDP) has filed a complaint with the US Federal Trade Commission (FTC) asking the agency to investigate OpenAI and halt the commercial deployment of its large language models (LLMs), including ChatGPT, until guardrails are put in place. According to the organization, ChatGPT violates federal consumer protection law and should be made to follow FTC guidance for AI products.
According to the summary of the FTC filing, GPT-4 is biased, deceptive, and a risk to privacy and public safety. It warned that the model's outputs cannot be proven or replicated, and that no independent assessment was made prior to deployment. CAIDP highlighted the FTC's guidance on AI, which states that AI should be "transparent, explainable, fair, and empirically sound while fostering accountability." CAIDP says GPT-4 meets none of these requirements.
While the filing names GPT-4 specifically, CAIDP goes on to say that the FTC should ensure all commercial AI products in the United States are subject to independent oversight and evaluation. This would also affect other tools built on large language models, such as Bing Chat and Google Bard.
Commenting on the matter, Merve Hickok, Chair and Research Director of CAIDP, said:
“We are at a critical moment in the evolution of AI products. We recognize the opportunities and we support research. But without the necessary safeguards established to limit bias and deception, there is a serious risk to businesses, consumers, and public safety. The FTC is uniquely positioned to address this challenge.”
Anyone who has used ChatGPT or similar generative AIs has probably seen the mistakes they can blurt out from time to time. While these mistakes may lose you a few marks on the homework you're cheating on, they're probably not going to cause too much damage. With that said, there are legitimate concerns with these products, and OpenAI itself outlines some of them, so it will be interesting to see whether the FTC does start enforcing rules more rigidly.