Report reveals major e-commerce sites profit from selling extremist merch

A person wears a QAnon sweatshirt during a pro-Trump rally on Oct. 3, 2020, in the borough of Staten Island in New York City. (Stephanie Keith/Getty Images)

A new report released Monday highlights how major e-commerce platforms continue to profit from selling T-shirts and other items with racist, antisemitic or otherwise offensive slogans — in many cases in direct violation of companies’ policies.

These items include T-shirts featuring antisemitic caricatures, tote bags adorned with swastikas and stickers promoting QAnon conspiracies.

The report, by the Institute for Strategic Dialogue (ISD), a London-based nonprofit that studies extremism and disinformation, examined five major e-commerce platforms — Redbubble, Etsy, Zazzle, Teepublic and Teespring — all of which generate millions of dollars each year by providing an online marketplace for independent artists and vendors.

The report’s findings, which were shared exclusively with Yahoo News ahead of publication, highlight the difficulty of enforcing content policies on platforms where hundreds of thousands of people create and upload their own designs, often with coded language and memes that might disguise offensive material.

But many of the items being sold are shockingly blatant, and some of the companies have also faced public scrutiny in the past for selling hateful merchandise. All of them have policies prohibiting the sale of harmful products, to varying degrees of specificity.

“This is, once again, a case of platforms setting guidelines and then just not enforcing them,” said Tim Squirrell, ISD’s head of communications and co-author of the report. “So much of the stuff that we found is just rooted in hate, rooted in discrimination, rooted in the vilification of various groups.”

While these websites do seem to be removing the most blatantly bigoted items — or at least making them slightly harder to find — ISD’s researchers found that “it is still extremely simple to find and purchase hateful products across the full range of these platforms.”

Figure 17: Search results for "WLM" (white lives matter) on Redbubble. Screenshot via ISD.

‘Egregious content’

Though ISD’s researchers found a wide range of concerning items across all five of the platforms they examined, they report that “egregious content was most readily accessible on Redbubble.”

Like most of the companies highlighted in the ISD report, Australia-based Redbubble is an online marketplace for print-on-demand products — meaning it provides an infrastructure for artists to upload and sell their own designs, which buyers can choose to print on a variety of products, such as T-shirts, stickers, mugs and tote bags.

Redbubble also has a history of controversy over offensive products discovered on its site. In 2011, Jewish groups condemned Redbubble for selling products with images from a satirical webcomic called “Hipster Hitler.” A year later, the retailer came under fire again following the 2012 killing of Trayvon Martin for selling a hoodie that featured a “Neighborhood Watch” sign warning that “We immediately murder all suspicious persons.” In 2019, Redbubble apologized for selling miniskirts and throw pillows depicting Auschwitz, the notorious Nazi death camp.

Redbubble has established a lengthy set of guidelines prohibiting a wide range of content, including anything that “glorifies or trivializes violence or human suffering,” “racist content or behavior designed to incite racism,” and the “promotion of organizations, groups or people who have a history of violence and/or an agenda of hate.”

The marketplace’s guidelines also explicitly prohibit “harmful misinformation,” which it defines as “misleading or false information that harms or significantly threatens public health and safety, or where the intent is to cause fear and suspicion about a topic that can cause real-world harm.”

Enforcing these policies has proven difficult, however.

Among the most egregious items that ISD’s researchers discovered for sale on Redbubble were a tote bag containing images of a swastika and nooses alongside the words “coming cleansing” in Russian ($23.37); a “Big League Jew Bubble Gum” T-shirt ($22.66) and matching sticker ($2.57) featuring an antisemitic caricature of a hooknosed baseball player; and a “White Lives Matter” T-shirt ($23.56). There was also a whole host of items displaying images and slogans associated with the Pizzagate and QAnon conspiracy theories, as well as various products promoting misinformation related to COVID-19 and vaccines.

Figure 12: Search results on Redbubble for "pizzagate" referencing the QAnon-adjacent #SaveTheChildren conspiracy theory. Screenshot via ISD.

All of the listings described above have since been removed from the Redbubble website after they were flagged by Yahoo News in a request for comment. The company also issued a statement highlighting its moderation efforts while admitting the challenge of policing such a large volume of products.

“Redbubble is the world’s largest marketplace for independent artists; more than one million small creators upload tens of thousands of designs every day. In order to keep our marketplace safe and inclusive, we have a dedicated content safety team that proactively monitors for content that violates our guidelines, including designs that promote harmful misinformation, violence or racism,” Redbubble said.

“Thousands of images are removed for content safety reasons each month, normally within 1-2 days of upload, but the volume of uploads, technical limitations, and user circumvention means we won’t catch everything. A small percentage of images that do not comply with our policies may be present on our platform at any given time, such as those identified in your query, which have been removed. We go to great lengths to keep this type of content off the Redbubble marketplace and are investigating how these specific designs were not detected, and also updating our processes to prevent similar ones from remaining undetected in the future.”

The company added that it had not yet seen the ISD report, but it hoped “that it will help all online platforms improve in this challenging area.”

Coded language

While most of the platforms appear to have been successful at moderating and banning listings that use well-known key words or phrases associated with hate and extremism, analysts found that moderation tools tended to overlook slightly more esoteric or coded terms, allowing content that would otherwise be prohibited to evade detection. For example, searching for the phrase “white lives matter” on Redbubble produced innocuous results, but a search for “WLM” yielded T-shirts and stickers with the slogan on the first page.

This especially seemed to be the case on Etsy, which, unlike the other print-on-demand-focused platforms, largely caters to vendors who sell handmade and vintage items.

Like Redbubble’s, Etsy’s content guidelines are among the most stringent of the platforms highlighted by ISD. Yet, despite its “relatively exhaustive list of prohibited products,” including an explicit ban on symbols associated with Nazis and neo-Nazis, “many of the items ISD found on Etsy very clearly violate these guidelines.”

In particular, the report states that “finding neo-Nazi content was extremely easy on Etsy.”

Figure 21: Black Sun merchandise on Etsy. Screenshot via ISD.

Searching the term “swastika” on Etsy led analysts to just one explicitly pro-Nazi item, which was later removed. But the first page of results for the search term “1488,” a popular neo-Nazi and white supremacist slogan, contained both Nazi memorabilia and a variety of items emblazoned with the black sun, a popular neo-Nazi symbol.

The abundance of black sun products on Etsy previously came under scrutiny after the symbol was prominently featured in a racist manifesto published by the shooter who slaughtered Black shoppers at a Buffalo supermarket earlier this year.

Though these items are often labeled as “Celtic” or “Viking” in their Etsy listings, the ISD report argues that “if they appear when searching for e.g. explicitly neo-Nazi code terms, then they are clearly not innocuous.”

ISD recommends that “Etsy specifically should seek to understand how coded language relates to symbols supportive of hate groups.”

A spokesperson for Etsy was unable to provide a comment ahead of this story’s publication.

Figure 37: T-shirts from Teespring's "Conspiracy Theories" section. Screenshot via ISD.

Harmful narratives

Overall, the ISD report states that merchandise promoting harmful conspiracy theories was the most heavily represented category of concerning content found across all five sites.

Teespring, which rebranded as Spring in 2021, has had its own series of controversies and public apologies over its sale of offensive products, and even has an entire section dedicated to conspiracy theory T-shirts, which feature slogans like “Red Pilled” and “a man can change the world with a [bullet] in the right place.” One shirt found on Spring promotes the “crisis actor” conspiracy theory associated with Alex Jones, who has falsely claimed that deadly mass shootings, including the 2012 attack at Sandy Hook Elementary School, were staged.

Squirrell said that while a lot of the conspiracy content found on Spring and other platforms may be framed as a “harmless joke,” it should not be dismissed as such, since “we know that so much of that stuff is actually the gateway towards much more serious concerning belief sets.”

In fact, many of the conspiracy theories promoted on shirts and stickers found across all five of the platforms have been linked to real-world harms, from QAnon to Pizzagate to 2020 election denialism and COVID-19 misinformation.

Squirrell said he was particularly distressed by the number of items being sold on these platforms that explicitly advocate violence against alleged pedophiles, as well as many others that seem to implicitly promote harmful narratives that falsely portray LGBTQ+ people and those who support them as “groomers” and “pedophiles.”

Figure 46: A T-shirt reading "Hey, Groomer! Leave those kids alone!" According to the product description, 10% of profits will be donated to Gays Against Groomers. Screenshot via ISD.

For example, ISD found one Spring store selling a variety of “SPECIAL EDITION — Gays Against Groomers” items, including T-shirts, tote bags and mugs. The description of these items states that 10% of the profits from their sale will be donated to Gays Against Groomers, a prominent anti-transgender group.

Advocates have attributed a substantial uptick in threats and attacks targeting the LGBTQ+ community to the aggressive promotion of such narratives over the past year, including by Republican elected officials in several states.

Squirrell argued that, in light of this context, the volume of T-shirts and stickers emblazoned with messages like “dead pedophiles don’t reoffend” and “pedophiles aren’t people, throw ’em all in the woodchipper” is cause for alarm.

“I completely get the vilification of people who engage in child sexual abuse, obviously it’s a horrible act, which should be condemned in the strongest possible terms,” said Squirrell.

“The things which advocate the straight-up violence against people, particularly when you consider them in the context of the ‘groomer’ moral panic which we’re in the midst of right now, where queer identity is being conflated with pedophilia, I think is really quite concerning.”

Spring did not respond to requests for comment from Yahoo News, nor did Zazzle or Teepublic.

A larger challenge

E-commerce platforms aren’t the only internet forums that have struggled to moderate and prevent the spread of harmful content online.

Social media sites have constantly struggled to rein in hate speech and the advocacy of violence. Conspiracy theories and bigotry have found their way into almost every corner of the internet, from the comments sections on cooking recipes to content giants like YouTube.

The researchers at ISD concede that it “will never be possible to remove every piece of objectionable content from these sites, and in many cases, there are clear trade-offs to be made between freedom of expression and preventing the peddling of extremism, hate and misinformation.”

But they concluded that “the ease with which this material can be found and the lack of clear statements from platforms on where the borders of acceptability are indicates that policy in this area has not been thought through nearly well enough.”