The U.S. Supreme Court weighs in on social media companies banning users for political speech

The Supreme Court is seen at sundown in Washington, on Nov. 6, 2020. | J. Scott Applewhite, Associated Press

The U.S. Supreme Court heard arguments Monday to determine whether states can prohibit social media companies from banning users for political speech, even when that speech violates a platform’s policies.

The justices are grappling with how the First Amendment should apply to social media platforms in two cases: Moody v. NetChoice, out of Florida, and NetChoice v. Paxton, out of Texas.

The Florida law would stop social media companies from banning the accounts of political candidates and media publications, while the Texas law says platforms can’t moderate content based on a person’s point of view.

The justices are expected to hand down a decision on both cases this spring. The decision could have implications for further legislation around social media companies.

Henry Whitaker, Florida’s solicitor general, argued the courts should treat these companies like phone carriers.

Whitaker said “the design of the First Amendment is to prevent the suppression of speech, not to enable it. That is why the telephone company and the delivery service have no First Amendment right to use their services as a choke point to silence those they disfavor.”

Paul Clement, an attorney representing NetChoice, a social media industry group, argued Florida’s law interfered “with editorial discretion.” Citing Miami Herald Publishing Co. v. Tornillo, a 1974 case where the court decided newspapers are not required to publish replies from candidates to editorials, Clement said, “Indeed, given the vast amount of material on the internet in general and on these websites in particular, exercising editorial discretion is absolutely necessary to make the websites useful for users and advertisers.”

One of the basic questions at hand is whether the court will treat social media companies more like phone carriers or more like newspapers.

This question has First Amendment implications — both Whitaker and Clement cited the right to free speech in their arguments.

On behalf of Florida, Whitaker argued that since social media companies transmit speech like telephone companies, they do not have a First Amendment right to censor and deplatform users inconsistently.

Clement’s argument centered on whether lawmakers should be able to regulate the speech of private tech companies. “If you are telling the websites that you are — that they can’t censor speakers, you can’t turn around and say you’re not regulating expressive activity,” he said.

Aaron Nielson, Texas solicitor general and BYU law professor, argued on behalf of Texas in NetChoice v. Paxton. He based his argument on the idea that social media platforms are the online public square.

“Everyone is online. The modern public square,” Nielson said. “Yet, if platforms that passively host the speech of billions of people are themselves the speakers and can discriminate, there will be no public square to speak of.”

Nielson referenced a brief filed by Lawrence Lessig, Zephyr Teachout and Tim Wu, which argued that “not just states, but Congress may be powerless to address the social media crisis devastating the lives of kids” if Texas’ law is struck down.

“HB20 is a modest effort to regulate such power in the context of viewpoint discrimination. Platforms can say anything they want under HB20 about anything. There’s no limit. They can say anything they want. Users can block anything they don’t want. There’s no limit on that,” Nielson continued.

Clement argued it is “absolutely vital” for editors and speakers to have the ability to “engage in viewpoint discrimination, that is their First Amendment right.”

“It is also absolutely vital to the operation of these websites because, if you have to be viewpoint-neutral, that means that if you have materials that are involved in suicide prevention, you also have to have materials that advocate suicide promotion.” This type of viewpoint-neutrality is unattractive to users and advertisers on the platform, Clement said.

What the Supreme Court justices said

Some justices said the First Amendment applies to the government, not private companies.

“The First Amendment doesn’t apply to them,” Chief Justice John Roberts said during arguments for NetChoice v. Paxton. “The First Amendment restricts what the government can do.”

Justice Brett Kavanaugh said, “When a private individual or private entity makes decisions about what to include and what to exclude, that’s protected generally editorial discretion, even though you could view the private entity’s decision to exclude something as ‘private censorship.’”

Some justices also contended that Florida’s law might be too broad.

Justice Sonia Sotomayor said, “This is so, so broad, it’s covering almost everything. But the one thing I know about the internet is that its variety — variety is infinite.” She asked whether, if the law is so unspecific, the state must “bear the burden” of delineating which cases it would cover and which it would not.

There was also concern about what a decision would mean for Section 230, which has historically shielded social media companies from lawsuits over content moderation. Justice Amy Coney Barrett contended that if the court sided with the states, there might be implications for Section 230 as well.

“But I also think there are a bunch of land mines. And if that’s a land mine, if what we say about this is that this is speech that’s entitled to First Amendment protection, I do think then that has Section 230 implications for another case, and so it’s always tricky to write an opinion when you know there might be land mines that would affect things later,” Barrett said.

Justice Samuel Alito questioned whether social media companies were being consistent.

“If you were a newspaper, and you published the content that appears in every single one of the videos on YouTube that you allow to be included, you would be liable, potentially, for the content of that material,” Alito said. “And I don’t understand the rationale for 230, if it wasn’t that you can’t be held responsible for that because this is really not your message.”

“Either it’s your message or it’s not your message. I don’t understand how it can be both,” Alito said.

When NetChoice v. Paxton briefly came before the court in 2022, the justices blocked Texas from enforcing its law, and some expressed concern over how prior precedents should be interpreted. Alito wrote in dissent at the time, “It is not at all obvious how our existing precedents, which predate the age of the internet, should apply to large social media companies.”

What could this mean?

One legal scholar has argued that if the court strikes down Texas’ law, it’s possible other states’ laws around protecting teens from the harms of social media could be impacted.

“State legislatures across the country have introduced or passed bills designed to protect teenagers from the worst effects of social media,” Teachout, a self-described progressive legal scholar and activist, wrote in The Atlantic. “Many of them would regulate content moderation directly. Some would require platforms to mitigate harms to children; others would prohibit them from using algorithms to recommend content.”

If the court strikes down Texas’ law, it may become easier to argue that these other kinds of social media laws should be blocked as well.

“The laws in question are bad, and if upheld, will have bad consequences,” Teachout, one of the professors who submitted a brief in favor of Texas, wrote. “But a broad constitutional ruling against them — a ruling that holds that the government cannot prohibit dominant platforms from unfairly discriminating against certain users — would be even worse.”

Utah Attorney General Sean Reyes joined a coalition of attorneys general from 19 other states, including Arizona and Alabama, on an amicus brief to the Supreme Court in Moody v. NetChoice. The brief argues states do “have authority to prohibit mass communication platforms from censoring speech.”

“The States have a vital interest in hearing the speech of their citizens, especially political speech,” the brief says. “That is necessary for states to be democratically responsive.”

Another brief filed by a bipartisan group of 22 attorneys general says, “Among other things, states have a compelling interest in protecting minors from the risks of social media, including social media’s negative effect on minors’ psychological well-being and the danger that minors will fall victim to scams and other hazards on social media.”

The brief later states, “Instead of adopting NetChoice’s approach, which would effectively immunize platforms from regulation, the Court should carefully consider the unique regulatory challenge posed by social media platforms and adopt a nuanced approach that acknowledges that there is no ‘one size fits all’ First Amendment analysis of state regulation of social media platforms.”

It’s not just states’ laws that could be impacted. “The nonprofit that runs Wikipedia and individual Reddit moderators have worried that they might need to fundamentally change how they operate or face new legal threats,” Lauren Feiner reported for The Verge. “More traditional publishers have warned that a ruling in the states’ favor could undercut their First Amendment rights as well.”

What are the Florida and Texas social media laws?

Florida and Texas each passed a law in 2021 addressing how social media platforms handle political speech.

“This session, we took action to ensure that ‘We the People’ — real Floridians across the Sunshine State — are guaranteed protection against the Silicon Valley elites. ... If Big Tech censors enforce rules inconsistently, to discriminate in favor of the dominant Silicon Valley ideology, they will now be held accountable,” Gov. Ron DeSantis said when signing the bill.

Florida’s law prohibits companies from suspending or banning political candidates’ accounts. It also creates a private right of action for users who claim their accounts were unfairly banned.

Texas’ law would make it illegal for social media platforms to “block, ban, remove, deplatform, demonetize, de-boost, restrict, deny equal access or visibility to” a user based on that person’s expression of a viewpoint.

Industry groups NetChoice and the Computer and Communications Industry Association sued Florida and Texas after the laws were passed.