Consumer Reports has no financial relationship with advertisers on this site.
Advertisers can use Facebook and Instagram to target ads at teenagers who the social media giant thinks are interested in alcohol, tobacco, and extreme weight loss, two new studies have found.
To test Facebook’s system for targeting consumers with online ads, the studies’ authors created Facebook ads featuring cocktail recipes, dating app offers, and weight-loss “tips” that promoted an eating disorder.
The researchers used Facebook’s tools to target the ads at users between 13 and 17 years old who were categorized as being interested in those topics.
Facebook gave the green light to all of those ads, plus several more that promoted gambling and other adult activities, even though the company officially prohibits targeting teens with such ads.
None of the ads were shown to the public. The two groups—an American nonprofit research organization called Tech Transparency Project and Reset Australia, a technology watchdog—canceled the advertisements after they were approved but before they appeared in Facebook users’ feeds.
“Facebook—or any company—should not be allowing this sort of advertising to teenagers,” says Ariel Fox Johnson, senior counsel for global policy at Common Sense Media, a research and advocacy organization that studies how children use media and technology.
“Teens are more susceptible to targeted ad techniques than adults,” Johnson says. “And all the problems with advertising to teens are exacerbated when you’re targeting teens based on their unique insecurities and problems.”
Aiming Ads at Teens
These rule-breaking ads illustrate one of the enduring challenges of maintaining an enormous social media platform: Even if you write sensible-sounding policies—like Facebook’s bans on advertising alcohol or weight-loss products to teens—those rules are just words on a screen unless they’re strictly enforced. And researchers and journalists have repeatedly found that Facebook’s filters can allow harmful content to slip through onto its platforms.
Keeping posts that break the rules off a platform like Facebook isn’t always easy. Hundreds of millions of Facebook users are constantly bombarding the site with text updates, photos, and videos. Facebook says that it takes down the lion’s share of harmful posts using artificial intelligence tools and a 35,000-person workforce that combs the site for violations—but that it can’t remove every policy-violating post.
In this case, however, the ads weren’t full of slippery innuendo and cheeky code words. Facebook’s tools for targeting ads appear to allow advertisers to openly direct inappropriate ads at 13- to 17-year-olds who might be interested in dangerous topics. Facebook does not allow kids under 13 to use its products.
The Tech Transparency Project designed an ad that offered teens “pro-ana” tips—that’s common slang for pro-anorexia—on a background image of a woman’s narrow waist. The ad was pitched at teens whom Facebook has identified as interested in weight loss.
One of Reset’s ads encouraged users to “TRY YOUR LUCK” and “WIN PRIZES!” alongside icons of dice and a poker chip, and was targeted at teens who Facebook thinks might be interested in gambling. Both groups also created ads featuring cocktails and aimed them at teens Facebook had tagged as interested in alcohol.
These “interests” are among the hundreds of categories that Facebook uses to sort its users into groups, based on their activity both on and off Facebook and Instagram, which the company has owned since 2012. Advertisers pay to access these categories of users to get their pitches in front of specific audiences.
The platform gives advertisers an estimate of how many people they can access in each group. The Tech Transparency Project found that Facebook shows a “potential reach” of 910,000 teens under 17 interested in “alcoholic beverages,” 300,000 teens under 17 interested in “gambling,” and 140,000 teens under 17 interested in “weight loss.”
“That’s a flaw right off the bat,” says Katie Paul, executive director of the Tech Transparency Project. “Facebook shouldn’t even allow you to pick those interests when you’re targeting kids under 18.”
A Facebook spokesperson tells Consumer Reports that the company is “investigating why some of these violating ads were not detected.”
“We prohibit ads about alcohol, weight-loss products, and certain other topics from being shown to people under the age of 18, and we have age restriction tools so that businesses can better control who sees their content,” the spokesperson says. “We also may re-review ads after they are live.”
All six of the ads that Paul scheduled were approved within half a day. Some of Reset’s ads, including two that showed young women exhaling smoke, were rejected, but Facebook accepted a version of the ad with a young woman holding an e-cigarette. A badge on the image said, “Cool girl.”
Dylan Williams, who leads advocacy campaigns at Reset Australia, says he was taken aback that so many of the ads his group created got through the filter. “We were really trying our hardest to get the ads rejected,” he says.
Avoiding Targeted Advertising
Both groups’ conclusions square with the results of an experiment that Consumer Reports performed last year. We set up seven paid ads with a range of COVID-related falsehoods, including one that encouraged users to drink bleach. The ads, which violated Facebook’s policies against coronavirus misinformation, sailed through the approval process. CR canceled the ads before they were shown to Facebook users.
“I am appalled that Facebook still hasn’t fixed its ad-approval process, a year after your original experiment,” says Nathalie Maréchal, a researcher at Ranking Digital Rights, a nonprofit that grades tech companies on factors including their privacy and content moderation practices. “Enforcing its own rules for advertising is the bare minimum Facebook should be doing: They write those rules, the volume of ads is much smaller than user content, and they don’t have to weigh the same kinds of free-expression issues for ads as they do for user content.”
Teens aren’t just the unwitting subjects of targeted advertising—many say they’re also creeped out by it. When Reset surveyed 400 16- and 17-year-olds about data collection and profiling last month, 79 percent of them said they were concerned about Facebook’s data gathering practices. They were particularly worried about being pegged for an interest in weight loss, gambling, or cigarettes.
What can you do to keep unwanted ads out of your digital diet? It’s hard to avoid targeted advertising online—these ads are the financial engine of some of the biggest tech companies, including Facebook and Google. But there are a few ways to make it less personal and intrusive.
Facebook’s ad preferences allow you to see what information about you is made available to advertisers, and the interest categories that Facebook has assigned to you. You can remove the categories individually if you don’t want advertisers to use them. You can also tell Facebook to stop using information about you that it gathered from other companies in order to target ads. However, you’ll have to follow separate instructions to tell Instagram to do the same thing.
If you come across a Facebook ad that shouldn’t be there—if it appears to be targeting teens with alcohol ads, for example—you can report it to Facebook.
You can also try to reduce the amount of targeted advertising you see elsewhere on the internet. Google allows you to turn off personalized ads, and a new feature on Apple devices like iPhones and iPads allows users to tell apps to stop tracking them.
But Johnson at Common Sense says that teens shouldn’t have to jump through such hoops to avoid targeted advertising—they should be off-limits entirely.
“We do not consider it in the best interest of children to show them ads based on profiling,” she says. Some countries, like the U.K. and Ireland, have recently proposed or published guidelines that discourage targeted advertising to kids. “Hopefully, companies are going to start to listen.”