How Will the FTC Respond to Pressures to Regulate Social Media?
The White House and Congress have been pushing the Federal Trade Commission (FTC) to get more involved in regulating social media, and the agency has been looking more closely at platform liability recently. The White House's May 28 Executive Order on Preventing Online Censorship (EO) directs the FTC to consider taking action to review social media platforms' online speech policies, whether by using its FTC Act authority or by issuing reports on speech restrictions and consumer complaints. Members of Congress have encouraged closer scrutiny of apps with large numbers of child users and of foreign-owned apps that pay consumers to use them while collecting users' data.
The FTC is an independent agency whose five-member Commission has often acted on a bipartisan basis. How will the agency respond to this tide of pressure? The team at Wiley has been engaged at the FTC and other agencies and is involved in Section 230 liability issues,[1] and my partner Megan Brown and I wanted to give blog readers a sense of what comes next, drawing on my previous experience as an FTC Assistant Director. Below I break down questions from Megan on some key issues.
What does the Executive Order direct the FTC to do?
The EO declares a "policy of the United States that large online platforms . . . should not restrict protected speech." Among many other things, it directs the FTC to "consider taking action . . . to prohibit unfair or deceptive acts or practices in or affecting commerce" that "may include practices by [platform] entities . . . that restrict speech in ways that do not align with those entities' public representations about those practices." It also directs the FTC to consider whether complaints against "large online platforms . . . allege violations of law that implicate" the EO's policy on platform speech, and to "consider developing a report describing such complaints and making the report publicly available."
Bottom line, the President is hoping that the FTC accepts his invitation to investigate the practices of social media companies and possibly bring enforcement action against them. Even short of a full investigation, the FTC could use its authority to pursue an inquiry into social media and platform practices. The FTC is authorized “to gather and compile information concerning, and to investigate from time to time the organization, business, conduct, practices, and management of any person, partnership, or corporation engaged in or whose business affects commerce…”
That sort of agency action is a big deal and could impose burdens on companies, even those not directly targeted, because of the potential reach of the FTC’s inquiry.
How does this fit with the FTC’s priorities?
It’s an awkward fit. On one hand, the FTC has reached settlements with a number of online platforms over privacy issues. On the other hand, it doesn’t generally investigate private companies’ policies about which speakers or content are allowed on their platforms.
For example, the FTC enforces the Consumer Review Fairness Act, which bars companies from using contractual clauses that restrict the ability of consumers to post reviews of a product. In that instance, the FTC is acting pursuant to specific direction from Congress, and the law relates directly to advertising in commercial transactions – a core mission of the agency.
In contrast, the agency treads carefully when its actions can affect private speech, as they could if it regulated social media platforms. Just a few years ago, an enforcement action against a marketer of a weight-loss product who appeared on “The Dr. Oz Show” generated a 3-2 Commission split over whether the amount of monetary relief would chill protected non-commercial speech. Going further back to the 1970s, the Commission was widely and famously criticized for attempting to regulate certain advertising to children – the so-called “KidVid” episode that resulted in funding cuts to the agency and a widespread overhaul of the Commission’s legal approach to its authority under Section 5 of the FTC Act.
What will the FTC do now?
The FTC has a number of options going forward. For example, it could hold a workshop and seek public comments on how to move forward, as it has done in a wide range of areas (most notably privacy). It could send out “6(b) orders,” which gather data from market participants that can then be discussed in a report. And, if it were so inclined, it could launch investigations in anticipation of potential enforcement actions, though those are normally non-public.
Of note, Commissioner Christine Wilson has called for the FTC to look more closely at issues raised in the EO, and has emphasized the role and effect of algorithms that may not be entirely transparent. This is a concern that she shares with Commissioner Chopra, and Commissioner Slaughter has separately talked about her concern with potential negative effects from algorithms. This may create common ground for the agency to look more closely at algorithms and AI used on social media platforms.
Does this mean the agency is likely to revisit platform liability for third party activities?
The digital economy has been built by online platforms making information, services, and goods available from third parties. We are seeing multiple efforts to revisit that model and impose liability for third party conduct, so retailers, online communities, and social media platforms should carefully watch the FTC’s moves here; as I observed late last year, “the FTC has suggested that it will look ‘up the chain’ at platforms to determine if they should be held liable for misconduct by” third parties. As a result, “[c]ompanies should pay close attention to their interactions with third parties that may invite scrutiny from the FTC, and recognize that this Commission’s expectations are increasing.”
What about social media apps and online services used by children?
Protecting the privacy of kids who use social media apps is already a big priority for the agency. In one recent settlement, the Chairman and Commissioner Wilson even touted that the consent order would impose obligations beyond those required by law and change the company’s business model. They acknowledged that under the statute, “third parties that … do not themselves create the content – are not responsible” but that the settlement “now makes Defendants responsible for creating a system through which content creators must self-designate if they are child-directed. This obligation exceeds what any third party in the marketplace currently is required to do.”
Given its authority under the Children’s Online Privacy Protection Act (COPPA), we can expect more interest in this area.
The potential for kids’ – or adults’ – data to be captured by a company with ties to a foreign government adds a national security layer that the FTC does not often face. That said, the FTC recently brought an enforcement action against a mobile device manufacturer that it alleged failed to protect consumer data – including by allowing a China-based third-party service provider to collect detailed personal information without consent. As the FTC focuses on whether any suspected privacy violations cause consumer harm – as opposed to being used to provide beneficial services like tailored advertising – national security concerns may well affect its calculus.
While the FTC’s status as an independent agency gives it greater leeway in responding to Executive and Congressional pressure, the focus on social media regulation – particularly coming from both sides of the political aisle – is likely to affect agency priorities over the next few months and beyond.
[1] Section 230 of the Communications Decency Act, 47 U.S.C. § 230, provides certain liability protections for online platforms and is a focus of the EO.