“The 360” shows you diverse perspectives on the day’s top stories and debates.
The Supreme Court this week announced it will take up a pair of cases that could fundamentally change the legal foundations of the internet.
Both cases ask the justices to consider how far protections that shield websites and social media companies from legal liability over what users post to their platforms should go. Those protections were created in a portion of the Communications Decency Act of 1996 known as Section 230 — a provision that has been called “the twenty-six words that created the internet.” Section 230 did two crucial things. It established that companies operating websites or social media platforms could not be held legally responsible if their users post content that breaks the law. It also granted them the right to curate, edit and delete user content as they see fit.
For the past 26 years, Section 230 has undergirded nearly everything about how the internet functions. Experts widely agree that Big Tech giants like Google, Facebook and Twitter would not exist in their current forms without the legal armor they receive from Section 230.
In recent years, however, Section 230 has become the target of intense criticism from members of both political parties, though for different reasons. Many Republicans say it allows Big Tech companies to suppress conservative viewpoints and censor prominent voices on the right — most notably former President Donald Trump, who is currently banned from Facebook and Twitter. Democrats and some activists on the left argue that Section 230 means social media companies don’t face any consequences when they allow misinformation, violent rhetoric and harassment to exist on their platforms. A number of bills have been put forward to amend Section 230, but the political divide over what the solution should look like has meant none of these bills have come close to passing.
Neither of the two cases going to the Supreme Court fits neatly into one side of this partisan debate. Both concern lawsuits brought by family members of people killed in terrorist attacks, who believe tech companies — Google in one case, Twitter in the other — should be held responsible for failing to stop extremist groups from operating on their platforms.
Why there’s debate
For all the complaints about Section 230, there’s still a lot of concern about how a ruling that significantly alters, or even eliminates, its protections would change the online world we’re so dependent on today.
Many communications law experts fear that a decision throwing out Section 230 would create chaos in one of the world’s most important industries, as companies attempt to quickly react to a sudden and drastic change in the legal landscape. They argue that, because few companies would be able to endure the new financial risk of lawsuits over user-generated posts, venues for free speech online would rapidly erode or even disappear. Other sites might go the opposite direction and eschew moderation altogether, which would create space for their platforms to turn into cesspools of objectionable content.
Some experts say that even smaller changes could disrupt the countless algorithms and automated systems that allow much of the internet to function effectively.
But others argue it’s long past time for new laws to govern online speech, and with Congress unable to pass anything sensible, the courts provide the best opportunity to create them. Some conservatives are hopeful that the upcoming ruling might make Big Tech less willing to censor right-wing content. Some legal scholars say the risk of a ruling that would “break the internet” is overblown. It’s much more likely, they argue, that the court will issue a decision that narrowly alters the law in ways that force companies to accept more accountability for things like content recommendations, promotions and search results, but leaves them otherwise protected against users’ misdeeds.
Rulings in these two cases, which are expected next year, may not be the only time the court weighs in on Section 230 this term. The justices have also been asked to consider cases regarding recently passed laws in Texas and Florida that bar social media companies from removing posts based on political ideology.
The fundamental structure of the internet could change dramatically
“Its rulings could be the start of a new reality on the internet, one where platforms are much more cautious about the content they decide to push out to billions of people each day. Alternatively, the court could also create a situation in which tech companies have little power to moderate what users post, rolling back years of efforts to limit the reach of misinformation, abuse and hate speech. The result could make parts of the internet unrecognizable, as certain voices get louder or quieter and information spreads in different ways.” — David Ingram, NBC News
Even a narrow ruling would scramble online life
“It’s not that the Supreme Court is expected to issue an opinion saying ‘platforms are fully liable for everything on them, immediately and irrevocably’ or something like that. Little changes make a big difference, and if the court simply ruled that Section 230 did not protect Google in this case, every lawyer in the country would be rushing to apply that new definition of the law to policies, behaviors, features, everything.” — Devin Coldewey, TechCrunch
Subtle changes are needed to update laws to fit the modern internet
“What we’re trying to do is balance the needs of platforms to not be held immediately responsible for what gets posted to their platform and also to provide platforms with the proper incentives to police their site against known harmful content. The question is whether we got that balance right in 1996, and I think you could make a very good argument that we might want to rebalance that.” — Michael Smith, information technology researcher, to CNBC
The average user isn’t likely to see much change
“I don't think it would change much, actually. Platforms already have tremendous ability to control how content is promoted. They will have to make wiser decisions and be held accountable for those decisions.” — Adam Candeub, communications law expert, to ABC News
Current laws give Big Tech far too much leeway to censor conservative views
“The censorship of conservative voices by Big Tech is well-documented. Some on Capitol Hill have gone so far as to claim that Big Tech practically ‘owns’ the government — a claim that seems increasingly well-founded considering the government’s documented efforts to control messaging during both the pandemic, and on politically inconvenient stories.” — Sarah Parshall Perry, Washington Examiner
The courts may soon render the internet completely unworkable
“It is entirely possible that next year the Supreme Court may rule that (1) websites are liable for failing to remove certain content (in these two cases) and (2) websites can be forced to carry all content. It’ll be a blast figuring out how to make all that work. Though, some of us will probably have to do that figuring out off the internet, since it’s not clear how the internet will actually work at that point.” — Mike Masnick, Techdirt
Conservatives would face the biggest consequences if Section 230 were thrown out
“If a company like Twitter suddenly finds that it is held liable for each post on its site, the company says that its options would become limited to either folding entirely or conducting extreme amounts of vetting and content moderation, much more than already goes on. This, of course, isn’t exactly what conservatives want.” — Kyle Barr, Gizmodo
Everyone will benefit from having clearly defined rules about what tech companies can and can’t do
“The two cases, taken together, will give the Court an opportunity to clarify the ground rules for when the platforms can be sued for doing too little to censor content, or too much to promote it.” — Dan McLaughlin, National Review
Congress's failure to pass sensible reforms has left the fate of the internet in the hands of the Supreme Court
“If the United States had a more dynamic Congress, lawmakers could study the question of how to maintain the economic and social benefits of online algorithms, while preventing them from serving up ISIS recruitment videos and racist conspiracies, and potentially write a law that strikes the appropriate balance. But litigants go to court with the laws we have, not the laws we might want.” — Ian Millhiser, Vox
Is there a topic you’d like to see covered in “The 360”? Send your suggestions to firstname.lastname@example.org.
Photo illustration: Yahoo News; photos: Getty Images