May 30, 2024

First Amendment and Internet Law Scholars Urge the Ninth Circuit to Protect the Right to Know How Social Media Companies Moderate Content


By Rebecca Delaney (‘25) and Sophie Liao (‘24)

How do social media platforms decide which posts to remove and which users to ban? How do they determine what counts as disinformation or misinformation? How do platforms define “hate speech,” and how much of that material gets removed? Under a new California law that recently went into effect, most social media companies that do business in the state and gross at least $100 million a year must answer these questions or face legal penalties.

One social media company says this goes too far. X, formerly known as Twitter, has challenged the law in federal court, arguing that it violates the company’s First Amendment free speech rights. The law, AB 587, codified at California Business and Professions Code sections 22675–22681, requires the largest social media companies to publish their terms of service and to file a publicly available report with the California Attorney General disclosing their existing content moderation policies and practices. But a district judge concluded that these transparency measures are constitutional, and X has appealed to the Ninth Circuit.

Writing to defend platform transparency mandates like AB 587, NYU Law’s Technology Law & Policy Clinic filed an amicus brief on behalf of First Amendment and internet law scholars supporting the State of California in X Corp. v. Bonta before the Ninth Circuit.

“The key objective here was simple: Persuade the Ninth Circuit not to make precedent that deals a blow to platform transparency legislation,” said clinic student Rebecca Delaney.

X Corp. v. Bonta may not only decide the fate of AB 587; it also has the potential to determine the survivability of future transparency measures contemplated by Congress and other states. That is because the court is likely to decide what level of First Amendment scrutiny applies to measures of this kind, which compel platforms to disclose information about content moderation. In their amicus brief, the First Amendment and internet law scholars stress that a lower level of scrutiny, known as Zauderer review, should apply to AB 587 and laws like it, because compelled disclosures about whether a social media company has content moderation policies, and what those policies are, are at most factual and uncontroversial commercial speech.

To satisfy Zauderer scrutiny, a regulation of commercial speech must be reasonably related to a substantial state interest. So the brief begins by articulating three such interests: consumer protection, democratic self-governance, and public health and safety. The scholars argue that platform transparency mandates serve these interests because they enable consumers to make informed choices in the social media marketplace, keep citizens apprised of platforms’ impacts on democracy, public discourse, and society, and can alert users to risks stemming from social media use.

“These substantial state interests underlie not only AB 587 but much of the regulatory state,” said Sophie Liao, the other clinic student behind the amicus brief. “Undermining states’ abilities to protect these public interests when it comes to social media platforms would be incredibly dangerous.”