Baldwin Joins Legislation to Strengthen Rules, Transparency For Online Content Moderation, Hold Internet Companies Accountable

Bipartisan Internet PACT Act Requires Social Media Companies to Establish Clear Content Moderation Policies, Holds Them Accountable For Not Protecting Consumers

WASHINGTON, D.C. – U.S. Senator Tammy Baldwin (D-WI) joined Senators Brian Schatz (D-HI) and John Thune (R-SD) to introduce the Internet Platform Accountability and Consumer Transparency (Internet PACT) Act, bipartisan legislation that updates the Communications Act of 1934 by requiring social media companies to establish clear content moderation policies and holding them accountable for content that violates their own policies or is illegal. The bill is also cosponsored by U.S. Senators John Barrasso (R-WY), Ben Ray Luján (D-NM), Bill Cassidy (R-LA), John Hickenlooper (D-CO), and Shelley Moore Capito (R-WV).

“Right now, content moderation policies are confusing for users and loosely enforced by online companies, which can have serious consequences for consumers, especially our kids. Americans deserve transparent content moderation policies from online companies, and those companies must be held accountable when their platform fails to meet those standards,” said Senator Baldwin. “This commonsense, bipartisan legislation will ensure internet companies set clear standards for content moderation and enforce those rules to protect consumers online.”

There is widespread bipartisan agreement that social media platforms have inconsistent and opaque content moderation practices due to a lack of accountability. To address this, the Internet PACT Act creates more transparency by:

  • Requiring online platforms to explain their content moderation practices in an acceptable use policy that is easily accessible to consumers;
  • Implementing biannual reporting requirements for online platforms that include disaggregated statistics on content that has been removed, demonetized, or deprioritized; and
  • Promoting open collaboration and sharing of industry best practices and guidelines through a National Institute of Standards and Technology-led voluntary framework.

The Internet PACT Act holds platforms accountable by:

  • Requiring large online platforms to provide due process protections to consumers through a defined complaint system that processes reports and notifies users of moderation decisions within twenty-one days, and allows consumers to appeal online platforms’ content moderation decisions;
  • Amending Section 230 to require that large online platforms remove court-determined illegal content and activity within four days; and
  • Allowing smaller online platforms to have more flexibility in responding to user complaints, removing illegal content, and acting on illegal activity, based on their size and capacity.

The Internet PACT Act protects consumers by:

  • Exempting the enforcement of federal civil laws from Section 230 so that online platforms cannot use it as a defense when federal regulators, like the Department of Justice or Federal Trade Commission (FTC), pursue civil actions online;
  • Allowing state attorneys general to enforce federal civil laws against online platforms; and
  • Requiring the Government Accountability Office to study and report on the viability of an FTC-administered whistleblower program for employees or contractors of online platforms.