What Is Facebook?

Cards on the table: Although I use social-media products such as Facebook and Twitter to promote my work and blow off steam, I basically think these companies have a noxious effect on our society. In general I think social media has a degrading effect similar to that of traditional television, only intensified, and for the same reason: the need to sell ads. Further, I suspect that Google and Facebook are close to having a level of power over communication that at least raises uncomfortable questions for a democratic and sovereign republic like the United States.

But even if you don’t agree with me on that, there is still a case for reexamining how these entities are treated in the law. I’m grateful Senator Josh Hawley has provoked this conversation, even if I’m not always convinced of his approach.

Right now the debate has been about the difference between platforms and publishers. National Review Online is a publication. Our comments section, though moderated, is a platform. A publication is legally liable if it prints libels or other forms of unprotected speech (a small category in America). A platform is not; only the individual posting to the platform is. (Feel free to libel me in the comments below.)

In the 1990s, Congress passed the Communications Decency Act, in part to allow Internet services such as Prodigy (RIP) to block pornography. The idea was that open Internet platforms where smut prevailed seemed to enjoy more protection in law than those entrepreneurial outfits that wanted to create family-friendly zones. The latter, because they edited and moderated user-posted content for obscenity, were at that time treated in the law as publishers and thus were suddenly liable for whatever libels and slanders users posted on them. It was a more conservative America, and Congress thought this result was perverse.

Most of the parts of that law that tried to directly combat the spread of pornography were struck down by the courts. But section 230 of the law, which allows proprietors of websites and forums to set standards — to edit and moderate their content without becoming publishers of it — is now the legal shield under which social-media giants shadow-ban, block, and censor conservative speech. Many conservatives, and even some of my colleagues, defend this practice as part of Facebook’s free speech.

I can understand a layman who looks at this situation and starts to agree with the right-wing critique that conservatives fail to conserve. There is something frankly masochistic about conservatives cheering, “This is what freedom is all about,” as their views are expunged from the public square by the emerging monopolies dominating it, under a statute written to combat pornographers.

There are two conservative principles in tension here. On the one side, there is freedom. If you think Facebook simply is a platform — no different from a rented stage, or a web server, or a single post on an Internet forum — then the argument for the status quo is very straightforward. The user is the speaker, and if that user says something that is legally actionable, it is the user and not Facebook that is responsible. Facebook should be free to reject and moderate its content the way that forum hosts or stage companies can reject a speaker outright, but should not be in any way considered an endorser of those speech acts.

But on the other side is responsibility. Facebook may operate within the existing conventions of speech law. But its revenue is generated in a strange legal zone where it is able to profit from media-like content while disclaiming all the difficult parts of traditional-media judgment. In this way, it offends the primordial conservative conviction that right and responsibility go together. The result, I think, is calamitous to our public square.

Facebook’s explosive growth as a media company is due to several factors. The first is its genuine innovation. The kind of web forums that existed when the CDA was written tended to be governed by extremely simple algorithms: a chronology sorted by date, or topics listed from A to Z. Facebook’s innovation was to use every new trick of web programming and ever-larger shares of data to dynamically reorder and transform user-submitted content. That is, Facebook found a way to automate and simulate the functions associated with traditional publishing and apply these techniques to the content that users generate themselves.

Because it can collect so much data on users, and its algorithm can respond with a kind of half-baked intelligence to their behavior, Facebook’s code in many ways does the job of a much larger media oligopoly. Users are marketed to in segments, they are split up by advertisers, their data is sold away for profit, etc.

Facebook also succeeded in large part because of its legal invincibility when it comes to speech issues. It profits handsomely off the tattle, libel, and trash that its servers and algorithms circulate across the site. A normal forum such as a stage, or even a samizdat newspaper made on photocopies, would get a bad public reputation for trafficking in titillation and trash. And advertisers would flee.

If there were a real-life message board on your main street on which people signed their names to desperate pleas for attention, vague threats of suicide, or arrangements for affairs, the advertiser that sponsored it would be viewed as a vampire or social pariah. This is especially true if that message board had a camera that tracked your movements around town and sold that information to third parties.

But because Facebook’s content is dynamically published by algorithm to every individual user based on user behavior and the population of their friend group, it is strangely immune to the normal social criticism that would greet a traditional platform or a traditional publication, let alone a kind of spy agency. To publicly despise Facebook can feel like despising your friends, or even yourself. I make a practice of despising myself, so I find it rather easy to hate Facebook.

This ability to traffic in publication-quality nonsense, combined with powerful tools of corporate surveillance, has contributed to the economic calamity visited upon traditional publishers. Magazines and newspapers that, for all their faults, were vital institutions in a democratic society, organizations that embodied the long judgment of their professions, were held to a higher level of responsibility by the law and society than Facebook is now. A decentralized system of independent publications that produced a product for public consumption (rather than customized for individuals) allowed us to litigate ideas in public in a civil way. Everyone can judge the New York Times’s bias based on the product it puts out daily. Facebook’s biases are the subject mostly of composite rumor, of millions of people deciphering their own individualized experiences together.

Social media is far more antisocial. It atomizes its users and then is able to encode its biases and enact them on the public’s expression, with none of the accountability or visibility that National Review or the New York Times works under.

Objectors to the above will respond in a few ways. First, they’ll say that it would be impossible for Facebook to actually monitor its content and intelligently weed out the libel and slander that appear on its site. It would require hiring too many people.

Well, of course that’s true. Having been liberated from this responsibility, Facebook was able to contribute to the destruction of many jobs across traditional media that exercised responsibility in publishing and production. It’s not the job of the government to make sure that any possibly profitable enterprise remains that way.

Second, they’ll say that Facebook’s algorithms are just “free speech.” In the case of Bernstein v. Department of Justice, U.S. District Court judge Marilyn Patel found: “This court can find no meaningful difference between computer language, particularly high-level languages as defined above, and German or French. . . . Like music and mathematical equations, computer language is just that, language, and it communicates information either to a computer or to those who can read it.”

This ruling may inform jurisprudence, but it is not only an incoherent statement about computer languages, it is a dangerous precedent.

It confuses the fact that code is discussed in abstractions and governed by rules with the claim that code is equivalent to a natural language. German, French, and English are not self-executing. A whole crowd of excited people can shout “Lock her up,” but the woman in question remains free. Language is something shared between human beings with free will; instructions and sentiments are not automatically effected in the world as if they were spells.

A piece of code doesn’t work the way human speech does. It doesn’t just communicate information to a computer; it can cause the computer to act, usually on more code, but often enough on code that affects the real world. A general in the Army can say, “I love Bruce Willis fans,” and he’s engaging in free expression, just the way someone writing some HTML code publishing the same statement does. But if a general commands a subordinate to torture a prisoner, it’s not free speech; he’s giving an illegal order. (Or at least, he used to be.)

I’m not sure that simply threatening to reclassify Facebook as a publisher, as Hawley’s proposed legislation would do, is the correct approach. It could, as its detractors say, lead to more stifling of speech on the platform rather than less. Though, again, this would merely be the consequence of saying that there would be more responsibility attached to speech, especially to the entity that profits the most from it.

But Facebook doesn’t seem to me much like a traditional platform as we knew them in the 1990s. Users typed into those platforms, and other users had to go specifically to that forum conversation to find them. Most of those platforms were hobbyist money-losing projects. Facebook dynamically republishes information to individual users in ways that are truly novel and truly frightening when you consider the amount of personal data that informs this publication. And it’s one of the most profitable companies in the world. It has contributed to the destruction of a valuable ecosystem of publications that embodied better judgment and more ethical behavior than Facebook. And so we should all welcome new thinking on it.
