The Move to Reclaim Privacy

At the very least, it can feel like you’re being spied on by Silicon Valley. Sometimes the ads served to us on our phones have a spooky quality that makes it seem as if advertisers are tracking and even listening to us, though the latter is hotly denied and may be infeasible. This weekend, my wife, the kids, and I spent time at my in-laws’ for Father’s Day celebrations. My children played on a little outdoor toy roller coaster that their grandmother had bought for them. The next day, Amazon advertised the same product to us. We know Amazon can see the connection between my in-laws’ household and our own. It knows we have kids who are the right age to play on it. But was it actually serving us this ad based on a good guess, drawn from our location data and my in-laws’ purchase history, that we might have enjoyed this toy?

Late on Sunday night, we were drinking a bottle of Argentine wine. For some reason, it made me think of Australian wine, and I asked my wife whether she would like to go back and live in Australia, as she did for just three months in 2006. Even though she lived then in the central business district of Sydney, she said she would prefer Melbourne. Within an hour, Facebook showed her a viral article about how Melbourne is the happiest city. Very likely it was a coincidence, but it didn’t feel that way.

It doesn’t feel like a coincidence because we know that Facebook, Google, and other tech companies do things that are, if anything, more insidious than responding to what we say in private. The scope of their surveillance is genuinely difficult to describe, but the tech entrepreneur Maciej Ceglowski tried, in an important essay on the loss of “ambient privacy” in the world Silicon Valley is creating. He notes that the calls for regulation coming from the CEOs of Google and Facebook reflect their self-interest. They are happy to be told by the government how to protect the “privacy” of data generated by their surveillance, so long as they can profit from it. Ultimately their interest is in seeing the creation of a “world with no ambient privacy and strong data protections.”

That is, Facebook and Google will still catalogue and analyze your behavior all across the Internet in ways you barely realize. The regulation they desire is merely a way of shifting responsibility for their decisions off themselves while imposing huge overhead costs on potential competitors. Most forms of “consent” to this surveillance turn out to be superficial. And the sheer amount of data Google and Facebook collect about the world sheds enough light for them to make accurate predictions even about the people who never use their services.

The tech giants know what sites you visit, what images you linger on, the identity of your friends and family. They can even infer secrets about your desires that you hardly admit to yourself. Facebook buys publicly available data sets to improve the value of its own data. It creates shadow profiles, compiling dossiers on people who have never created a Facebook account. Amazon was able to correctly guess and disqualify reviews of my book from people who attend the same parish that I do. If you described the surveillance and data-analytics capacity of Google or Facebook and attributed it to a foreign intelligence service, everyone would immediately recognize it as a dire threat to national security. But then you have to ask yourself: Is it the owner of the data that makes it dangerous, or the mere existence of such a trove? Data leaks at Equifax and Facebook can be massively damaging.

There are lots of questions that politicians are not sophisticated enough to ask of the social-media giants. So far, the forays of Facebook and Google into political lobbying and public relations have been rather crude. But after the massive criticism social media received merely for being powerful channels of communication among the older voters who endorsed Brexit and Donald Trump, has Facebook changed its ways? Obviously, social-media companies are doing more politically motivated banning and censoring. But have they tried anything more sophisticated?

Does Facebook know how to change its newsfeed algorithm to demoralize one set of voters and motivate another? How would it use this power? Can Facebook’s algorithm predict a user’s slide into depression? How much money does Facebook make advertising depression remedies to its users? How does that compare to its investment in making Facebook itself addictive? Can Facebook, Google, or Amazon nudge people into or out of worldviews that are more or less profitable to them? That is, can they build algorithms that would respond to a user’s interest in asceticism or charity and slowly prod that user back toward conspicuous consumption? Would they build that algorithm if they could? Would their corporate clients demand it? Would their boards demand it?

Many of these questions were part of the debates during the rise of mass media. If media are going to form our citizens, should enduring public interests govern the creation of that media, or at least part of it? The same questions matter to social media even more, because social media are more intimate even as they are more ubiquitous. And because everyone’s social-media experience is individualized, the social criticism it receives is almost necessarily more diffuse. One can interrogate a director or a writer’s room or a television studio about what they want to inspire with their movies and shows, what their intended effect on our souls is. Can we interrogate programmers the same way? Perhaps the worst feature of the Internet age is that skepticism and criticism of advertising’s moral and political effects on society have gone into abeyance, even as advertising has become ever more central to the business model of digital media.

Just as it would not console me that nuclear waste was deposited in the environment by a private firm, it is no consolation to me that Facebook is a private company rather than the government. There are no real opt-outs. Facebook and Google’s snooping pixels operate across nearly all news sites, most shopping sites, and beyond. Even if I stopped using their services, which is hard to do anyway, the companies would still copy or buy publicly available data about me. And these platforms insinuate themselves into whole communities: my child’s kindergarten has a private Facebook group, and it is considered good manners to participate there.

We have been warned by Silicon Valley’s critics that if you aren’t paying for the product, you are the product. Fine. We aren’t paying money for the surveillance society that Silicon Valley has created, but what is the cost to our society at large?
