CES 2025: FTC Commissioners Discuss Approach to AI in New Administration
On Thursday, January 9 at CES 2025, two sitting FTC Commissioners – Rebecca Slaughter (D) and Melissa Holyoak (R) – sat down for a revealing discussion moderated by two former FTC Commissioners now in the private sector – Christine Wilson (R) and Julie Brill (D) – about the FTC’s approach to AI. The discussion provided context for FTC actions related to AI over the past few years, as well as a preview of what may come in a new Administration. Here are some key highlights:
- The current Commissioners agreed that the FTC should continue to bring cases involving AI-related fraud and deception. Commissioner Holyoak emphasized that bringing fraud cases is pro-innovation and pro-growth, because it helps to build consumer trust in law-abiding companies using AI. She also noted that the FTC could bring privacy-related deception cases if a company makes misrepresentations about the collection and use of personal data in connection with AI models. Commissioner Slaughter similarly favored deception enforcement actions, and added that she favored more criminal referrals in cases involving outright fraud, in order to maximize deterrence.
- The Commissioners generally agreed that the development of general purpose AI tools that are misused by third parties, without additional involvement or knowledge by the developer, should not be subject to FTC Act liability. However, they disagreed on the level of developer involvement or knowledge that could give rise to liability. This disagreement surfaced in the FTC’s Rytr settlement, in which the FTC majority (including Commissioner Slaughter) argued that the developer of a generative AI tool that could be used to generate large numbers of deceptive reviews could be held liable, based on the facts of that case. Commissioner Holyoak dissented, arguing that the AI tool itself was neutral and that holding the developer liable would discourage innovation. She pointed instead to the recent Sitejabber settlement, which she supported and which involved a tool that was allegedly designed to generate deceptive reviews. With the change in Administration, Commissioner Holyoak will now be in the majority, and the question of where to draw the line – between a neutral tool and a product designed for misuse – will be worth watching going forward.
- The Commissioners expressed concern with voice cloning. Commissioner Slaughter noted that fraudsters had even impersonated her as part of a scam to obtain money from consumers. She pointed to the FTC’s Impersonation Rule – adopted on a bipartisan basis – as an important tool to combat impersonation scams. Commissioner Holyoak touted the FTC’s Voice Cloning Challenge – which encouraged private sector solutions to fight voice cloning fraud – as a non-enforcement approach to help address this issue.
- Finally, the Commissioners expressed concern about children’s interactions with AI – and in particular AI chatbots. Commissioner Holyoak suggested that the FTC might pursue a market study (under Section 6(b) of the FTC Act) to evaluate data collection and risks associated with kids’ interactions with AI chatbots. Commissioner Slaughter stated that the FTC had received external requests to look more closely at kids’ interactions with AI chatbots – which, she noted, present challenging issues. As we have discussed, privacy and consumer protection efforts for children generally have bipartisan support, and we will watch to see what kinds of approaches the FTC takes in the next Administration.
Overall, the discussion among current and former Commissioners was a good reminder that much of what the FTC does has bipartisan support, and that areas of disagreement often reflect differences of degree rather than a broad gulf. Both current Commissioners emphasized that they are pro-innovation – a fitting theme for CES – even if they have sometimes held different views on how to advance it. While some approaches may shift, the FTC appears likely to remain engaged on AI as we move into the next Administration.