The Supreme Court Will Decide if Texas Is Allowed to Kill the Internet

Giving Texas what it wants may spell the beginning of the end of the internet as we currently know it. Photo illustration by Slate. Photo by Getty Images Plus.

This is part of Opening Arguments, Slate’s coverage of the start of the latest Supreme Court term.

When social media platforms like Facebook and YouTube moderate content, are they engaged in protected speech? Or are they engaged in an invidious form of censorship? The answer, which lies at the heart of a pair of cases the Supreme Court agreed to hear on Friday, could fundamentally alter the nature and operation of social media platforms and the internet itself.

Reacting to complaints from the political right that large social media platforms including Facebook and YouTube actively censor conservative views, Texas and Florida enacted laws prohibiting the platforms from removing, deleting, or deplatforming speech or speakers based on viewpoint. The laws differ in some respects, but both create a legal cause of action against social media platforms that engage in any of the laws’ defined methods of “censorship.” They also require that platforms provide an explanation for any posts “censored” and publicly disclose their guidelines for removing speech or speakers from the platforms.

The U.S. Court of Appeals for the 11th Circuit upheld an injunction against the Florida law’s content-moderation provisions, concluding they violated the platforms’ First Amendment right to determine what content to display and which users to ban or temporarily exclude. The U.S. Court of Appeals for the 5th Circuit came to the opposite conclusion regarding the Texas law, repeatedly characterizing social media content moderation as “censorship” and finding the platforms have no First Amendment “right to muzzle speech.”

When it comes to newer media, courts and lawyers often struggle to fit contemporary problems into preexisting First Amendment decisions and doctrines. The briefs and arguments in the cases will lean heavily on analogies from prior Supreme Court precedents. For example, the platforms will argue they are like newspapers, which the court has held have an established First Amendment right to engage in editorial judgment when deciding what content to publish. The states will counter that unlike newspapers, the platforms review almost none of what they allow users to post, either before or after publication. The states will argue the platforms are more like large public malls, which the court has held can be required by law to host some expressive activity. The platforms will respond they are like parade organizers, which the court has held have a First Amendment right to determine who marches in their inherently expressive events. Judge Andrew Oldham concluded in his 5th Circuit opinion that the platforms are more like “common carriers,” including electricity providers and trucking companies, which are prohibited from denying service based on the user’s viewpoints.

If you think none of these examples fits perfectly, you are in good company. As Judge Leslie Southwick wrote in a separate opinion in the 5th Circuit case: “We are in a new arena, a very extensive one, for speakers and for those who would moderate their speech. None of the precedents fit seamlessly.”

So much is at stake in these cases—for the platforms, their users, and the public. Platforms require members of their communities to accept terms of service that include, among other restrictions, content moderation rules. By moderating obscenity, hate speech, public health misinformation, and other content, platforms enforce site-specific community standards and define their online communities. They append disclaimers to certain posts and publish their own content. The platforms also respond to threats—to individual users, the online community, and the public.

The Texas and Florida laws would substantially undermine these prerogatives. If the 5th Circuit is correct, platforms that allow user posts or videos that are anti–white supremacy, anti-misogyny, and anti–domestic terrorism would be legally compelled to provide space for pro–white supremacy, pro-misogyny, and pro–domestic terrorism speech. Efforts to combat disinformation and misinformation, whether about elections, public health, or other subjects, would also in many cases lead to legal jeopardy for the platforms—or mire them in onerous lawsuits filed by disgruntled users who insist on the right to a platform for their speech. Governments could also chill the platforms’ right to host content they actively support—out of fear they will have to allow its antithesis.

Judge Southwick is correct that there is no perfect analogy. But as he concluded, the platforms do engage in editorial functions when they curate and collate content. As the 11th Circuit observed, the platforms aren’t just “dumb pipes.” They exercise editorial judgment over what content users see when they visit the site. The fact that they do not edit in the same manner as newspapers, which among other things have only so many columns to fill, should not be considered dispositive. The fundamental point is that they edit, or moderate, content.

Critically, a Supreme Court decision upholding these social media laws would be contrary to several significant First Amendment trends—all initiated and embraced by conservative justices. First, the court has recognized and protected corporate expression in electoral and other regulatory contexts. Consider, for example, Citizens United, which protected corporate electioneering. If Mark Zuckerberg has the right to donate unlimited amounts of his own money to a super PAC backing a candidate he supports, then the platforms in which he holds a controlling stake should be able to decide what appears on them. Second, the court has been keen to protect the rights of speakers to exclude or refuse service to those with whom they disagree or do not want to associate. In fact, just last term, the court held that a website designer could not be compelled to design a custom wedding website for gay customers, notwithstanding laws that forbid discrimination based on sexual orientation. The court has also upheld the rights of parade organizers, the Boy Scouts, and other speakers to exclude speakers and speech with which they disagreed. Third, the court has described the internet as a kind of vast library and social media platforms as “the modern public square.” Its decisions have warned lawmakers and regulators to tread very lightly, lest they chill expression and interfere with the development of a robust cyber-marketplace of ideas.

Giving governments the power to compel large social media platforms to host all manner of speakers and speech offends well-established First Amendment principles. It may also spell the beginning of the end of the internet as we currently know it. Right now, platforms can take down vile and harmful content when it violates their terms of service. But if the Texas and Florida laws stand, the platforms would become a virtual free-for-all. White supremacists, terrorists, and other harmful speakers would gain a legal right to communicate on the platforms. These and other speakers could effectively shut down the platforms by forcing them to defend countless lawsuits under the state laws.

It can be hard to muster sympathy for social media platforms and their principals, who have made inconsistent statements about their relationship to user content and have not always moderated responsibly. But the alternative offered by Texas and Florida—robbing the platforms of their editorial power—threatens mischief all out of proportion to the supposed evil those states have identified. The First Amendment does not allow the government to bar private speakers from deciding what messages to disseminate, or to restrict their speech in order to level the playing field against what Florida Gov. Ron DeSantis has referred to as “Silicon Valley elites.”

Hopefully, the Supreme Court will accept this reality and enforce its own precedents.