Trippy video breaks down how YouTube demonetizes LGBTQ creators for using words like 'gay'

It's long been rumored that YouTube's machine learning algorithm specifically demonetizes LGBTQ content — despite YouTube's denials.

But the allegations got a boost over the weekend when two creators published videos pointing out how they think YouTube goes about blocking LGBTQ content from making money.

YouTube Analyzed fed more than 15,000 words, used in video titles, to YouTube's bots to figure out what the platform deems too scandalous for advertisers. Words that triggered demonetization in his tests included "gay" and "homosexual."

Another channel, Nerd City, then broke down YouTube Analyzed's data in the acid trip of a video seen above. Before tackling YouTube's shadowy demonetization policy, Nerd City was known for diving into Jake Paul's insidious marketing schemes, exposing family vlogging channels as exploitative, and highlighting how obviously edited Instagram photos damage young followers' self-esteem.

Nerd City explains that YouTube Analyzed rated the tested words on a color scale: green denotes "safe" words, while yellow words trigger demonetization.

On this scale, "straight" and "heterosexual" are solidly in the green. More obviously alarming phrases, like "fuck a duck" and "gore," are yellow. But creators who add "lesbian" to their video tags and descriptions lose advertising revenue.

As Nerd City notes, the content doesn't have to be sexual to get spotted by the demonetization bots: A video title that mentioned lesbian couples getting married was flagged, but one about "happy" couples wasn't. 
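For readers curious what that kind of test produces, here's a minimal sketch of how word-by-word results could be tallied once exported; the rows, field names, and statuses below are hypothetical stand-ins, not YouTube Analyzed's actual spreadsheet or anything pulled from YouTube's systems.

```python
from collections import Counter

# Hypothetical export of a word-by-word test: one entry per word placed in a
# video title, plus the monetization icon observed afterward.
SAMPLE_ROWS = [
    {"word": "straight", "status": "green"},
    {"word": "heterosexual", "status": "green"},
    {"word": "gay", "status": "yellow"},
    {"word": "lesbian", "status": "yellow"},
    {"word": "gore", "status": "yellow"},
]

def tally(rows):
    """Count how many tested words landed in each monetization bucket."""
    return Counter(row["status"] for row in rows)

def flagged_words(rows):
    """Return the words that drew the yellow (limited ads) icon."""
    return [row["word"] for row in rows if row["status"] == "yellow"]

print(tally(SAMPLE_ROWS))          # Counter({'yellow': 3, 'green': 2})
print(flagged_words(SAMPLE_ROWS))  # ['gay', 'lesbian', 'gore']
```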

Other creators are also losing income to YouTube's demonetization system. The channel Armchair Historian quit making educational videos because his history lessons, which he says don't cover controversial topics, were repeatedly flagged for demonetization. Some YouTubers have partnered with a trade union to call for transparency, but YouTube says it can't reveal more about how the system works without giving bad actors a way to game it.

A YouTube spokesperson told Mashable that its machine learning system isn't perfect, but that the platform doesn't have a list of "forbidden" LGBTQ-related words.

The spokesperson also pushed back on the publicly available spreadsheet of trigger words created by YouTube Analyzed, reiterating that YouTube doesn't maintain a list of words that automatically trigger demonetization.

Nerd City counters that party line with the following logic: even if YouTube doesn't keep an explicit list of trigger words, its human reviewers have flagged so many videos with those words in their descriptions that the algorithm, trained on their decisions, has developed its own watch list, so to speak.
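To see how that can happen without any explicit list, here's a toy sketch in Python using scikit-learn: train a simple text classifier on made-up titles with made-up human "demonetize" labels, and word-level weights emerge on their own. None of this is YouTube's actual code, data, or model; it only illustrates the mechanism Nerd City describes.

```python
# Toy illustration: a classifier trained on human "demonetize / keep ads"
# decisions learns word-level weights even though nobody gave it a word list.
# All titles and labels are invented for this example.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

titles = [
    "my gay wedding vlog",               # imagined reviewer decision: demonetize
    "coming out as a lesbian",           # demonetize
    "gay couples react to comments",     # demonetize
    "our happy wedding vlog",            # keep ads
    "meet my straight best friend",      # keep ads
    "happy couples react to comments",   # keep ads
]
labels = [1, 1, 1, 0, 0, 0]  # 1 = demonetized by a human reviewer

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(titles)
model = LogisticRegression().fit(X, labels)

# The learned coefficients act as a de facto watch list: words that co-occur
# with demonetization decisions end up with positive weights.
weights = sorted(
    zip(vectorizer.get_feature_names_out(), model.coef_[0]),
    key=lambda pair: pair[1],
    reverse=True,
)
for word, weight in weights[:5]:
    print(f"{word:12s} {weight:+.2f}")
```

In a toy setup like this, "gay" gets the largest positive weight simply because it only appears in the demonetized examples; scale that dynamic up to millions of human decisions and you get the kind of unwritten blocklist Nerd City is describing.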

It's a confusing, frustrating system, and one that needs to be fixed.