Americans Have Always Felt Guilty About TV Watching

With Thanksgiving, the holidays have begun in earnest, ushering in an uneasy season of guilt born partly of over-consumption: too much shopping, too much pie (if such a thing even exists), and too much entertainment.

It’s difficult to overstate how bloated television gets from Turkey Day onward. On top of the Macy’s Thanksgiving Day Parade, broadcast networks are serving up the obligatory Thanksgiving specials, all-day football, 12-hour marathons of NCIS and House, Sleepless in Seattle re-runs ad nauseam, the all-you-can-watch episodes of Pit Bulls and Parolees that no one asked for …

In 1996, the British Film Institute’s Audience Tracking Study asked subjects to comment on any feelings of guilt while watching TV—their responses may shed light on the attitudes of viewers across the Atlantic Ocean. People said they felt bad for the following reasons: wasting time, watching daytime TV, neglecting other tasks, having trashy program tastes, and forcing one’s show choices on others.


Television’s association with guilt has persisted since the medium’s early days, but the nature and tenor of that guilt have evolved under the influence of several factors, including demographics, time period, technology, program type, and intellectual stimulation.

But first, a quick background: In 1938, television was still "making strides" and "experimental," but the following year regularly scheduled broadcasting began in the United States. During World War II, most broadcasting in the U.S. came to a halt before returning with full vigor. The 1950s saw the introduction of color TV sets, along with hit shows such as I Love Lucy, The Twilight Zone, Leave It to Beaver, and Bonanza.

Then in 1961, Federal Communications Commission chairman Newton Minow famously called out television as a “vast wasteland,” filled with formulaic programs (“comedies about totally unbelievable families, blood and thunder, mayhem, violence”), awful commercials, and “most of all, boredom.” The bureaucrat aimed his comments at the quality of available programming at the time (“when television is bad, nothing is worse”). But a show’s perceived pedigree isn’t the only element that shapes guilt.

The television historian and former network executive Tim Brooks has been studying the issue of television and guilt for decades. When Brooks conducted surveys and spoke with focus groups in the 1970s, he found that participants would consistently underestimate how much TV they watched. They also tended to overestimate how much educational programming, like that on PBS, they viewed.

"Generations in the 70s remembered when TV had begun and how it took over their lives," Brooks said. But after them came a cohort of viewers for whom television always existed, even if it changed. From the 70s to the 90s, TV sets got bigger, allowing people to sit farther away (also, enter the remote control). The number of channels slowly crept upward. TVs got cheaper, and by the mid-90s, roughly 70 percent of U.S. households had at least two sets, up from 2 percent in the mid-50s. VCRs allowed for the recording of multiple shows, and TV-watching as a solo endeavor emerged.

In the late 1980s, research began to indicate the existence of consumer guilt and its useful role in capitalism (terms like “guilt market” were coined). Networks recognized their stake in easing their viewers’ collective guilt and adapted accordingly. According to Brooks, viewers tend to feel better about watching TV if they feel there’s a mentally stimulating component—hence the eventual meteoric rise of franchises like Law & Order and CSI, with their seemingly infinite capacity to generate spinoffs.

“The police procedural mixes science with crime-solving, so you get the police lab and how they figure out the hairs and DNA or the little clues—and all of that gives it a patina of science,” Brooks said. “And so the viewer thinks, ‘Maybe this isn’t a waste of time; I’m learning something from this.’”

In the 1990s and early aughts, a surge of channels like Lifetime that catered to a particular demographic meant viewers no longer had to worry about wasting time on material they didn’t care about. People also began to identify more with their favorite shows, Brooks said. This idea that “you are what you watch” accordingly pays off for advertisers.

Some things haven't changed since the 70s. A few years ago, a study by the Council for Research Excellence found that subjects still underestimated how much traditional TV they watched. But a new finding came to light: Participants also overestimated how much time they spent on their mobile phones, a huge new source for streaming shows and videos.


How much does digital technology have to do with TV guilt today? When homes had just one TV set with a limited number of channels, didn’t sitting down at night for a show with the family mean connectedness and good feelings all around? Contrast that with today, with laptops, tablets, and smartphones feeding binge-watching and solitary viewing—both of which must be guilt-inducing. Right?

Yes and no, Brooks explained. On one hand, mobile streaming—usually through YouTube or services like Netflix—has allowed for the further decentralization of the viewing experience from the living room to, say, the doctor’s waiting room. But technology has also helped TV-watching habits come full circle, back to their communal roots. Most mobile viewing, surprisingly, takes place at home, meaning that a kid could be on her Android, watching a web series, while her parents watch CSI 10 feet away. In other words, digital technology has also engendered a new kind of togetherness.

On the demographic side, younger viewers in college and high school who have fewer responsibilities are less inclined to begrudge themselves a couple of hours of Parks and Recreation. But research indicates that once the responsibilities begin to pile up—marriage, work, kids—the burden of justification for sneaking in an hour-long episode of Mad Men grows that much heavier.

This guilt, warranted or not, has consequences. TV, alongside its similarly maligned cousin video games, can have restorative, even salubrious effects, but mostly for those who already view it as a healthy way to unwind at the end of a busy day, according to a June study in the Journal of Communication. Otherwise, people feel like failures, unable to control their own TV consumption.

More so than with other forms of entertainment, indulgence in TV is readily pathologized. For decades, researchers have noted the hazards linked to television: Depictions of guns and blood spatter can cause aggressive behavior; too much Nickelodeon can fatten up and dumb down children (even if it’s in the background); too much time in front of the set will numb your brain; you’ll get addicted. Television will make your eyes square; kids might forget what libraries are.

But TV shows have long since internalized, even embraced, the casual, long-simmering suspicion of the medium. Broadly meta TV abounds. TV openly disses itself, knowingly, winkingly, ironically.

Whatever guilt we dedicated viewers feel from time to time can be transmuted, or neutralized, by television itself. The show will, as they say, go on. Chances are if we’re still watching, we’ll keep watching.
