Twitter, Google win big at Supreme Court

The Supreme Court has passed up a closely watched opportunity to clarify the scope of the federal liability shield known as Section 230 that protects internet companies from most legal claims over content posted by users.

In a pair of rulings Thursday morning, the justices rejected lawsuits seeking to hold tech giants like Google and Twitter liable for terrorism-promoting content on their platforms. And the court nixed the suits without issuing any sweeping pronouncements on the immunity provision that has come under increasing fire from Republicans and Democrats.

The cases mark the first time the high court dealt with Section 230 of the Communications Decency Act, the 1996 law that broadly protects tech companies from being sued over hosting most third-party content on their websites and decisions to remove violative material.

The two decisions mark a major win for the tech industry, which has argued that narrowing Section 230 could be disastrous for the internet if platforms could be sued over content-moderation decisions. But the resolution leaves the door open to future showdowns — potentially in Congress — over the breadth of the legal protection the internet firms enjoy.

The tech industry celebrated the rulings. “This is a huge win for free speech on the internet,” Chris Marchese, the litigation center director for NetChoice, a tech trade group representing Twitter and Google, said in a statement.

“Even with the best moderation systems available, a service like Twitter alone cannot screen every single piece of user-generated content with 100% accuracy,” Marchese said. “Imposing liability on such services for harmful content that unintentionally falls through the crack would have disincentivized them from hosting any user generated content.”

In the first case, Twitter v. Taamneh, the Supreme Court unanimously rejected a lawsuit seeking to hold Twitter, Google and Facebook responsible for an ISIS nightclub attack in Turkey in 2017 due to recruiting videos posted on their sites.

Then, the justices used the ruling in that case to wriggle out of a clear-cut decision in Gonzalez v. Google, a lawsuit from the family of a California college student who was killed in a 2015 terrorist attack in Paris. The family alleged that Google’s YouTube algorithms promoted ISIS recruitment videos and thereby contributed to the attack.

The high court disposed of the Google case in a three-page, unsigned opinion that said the Section 230 issues were not ripe for decision.

“We … decline to address the application of §230 to a complaint that appears to state little, if any, plausible claim for relief,” the court’s opinion said.

While the justices sent the Google suit back to the 9th Circuit Court of Appeals for further consideration and left open the theoretical possibility the case could be refiled, the high court’s dismissive language offered the plaintiffs little hope.

Apart from the Section 230 issues, the high court’s emphatic rejection of the suit over the Turkey nightclub attack represented a major victory for tech firms who feared that courts or juries might impose massive liability on the companies by concluding that their systems for removing terrorist propaganda were insufficiently robust.

In an opinion written by Justice Clarence Thomas, the court said those sorts of claims of insufficient vetting or policing of a platform were not enough to make the companies liable under the Justice Against Sponsors of Terrorism Act, a law Congress enacted in 2016 over President Barack Obama’s veto.

Thomas said the concept of aiding and abetting terrorism wasn’t always easy to define, but imperfect efforts to get terrorist content off a site were not the same as assisting in a terrorist act.

“The point of aiding and abetting is to impose liability on those who consciously and culpably participated in the tort at issue,” Thomas wrote. “When there is a direct nexus between the defendant’s acts and the tort, courts may more easily infer such culpable assistance. But, the more attenuated the nexus, the more courts should demand that plaintiffs show culpable participation through intentional aid that substantially furthered the tort.”

Thomas acknowledged that there are few applicable “crisp, bright-line distinctions” to identify aiding and abetting, but he expressed discomfort with the suit’s theory, which could expose social media companies to liability for “every single terrorist act committed by ISIS.”

The decision from Thomas endorsing freewheeling interaction on social media, despite potential abuses by bad actors, marked something of a pivot from his previous stances on Section 230. He had earlier written two dissenting opinions calling for the court to review and narrow the legal shield, questioning what he viewed as lower courts’ overly broad interpretations of the law in favor of tech companies.