YouTube's algorithm is hurting America far more than Russian trolls ever could

Remember when we used to think the greatest threat we faced from Artificial Intelligence was that it would become Skynet and launch our entire nuclear arsenal, wiping out the human race? 

Good times. 

But on the evidence of YouTube's latest mess, AI doesn't even need to bother blowing us up — it can just dumb us down and push endless conspiracy theory videos on us. That way we'll get so confused about the truth, we won't take any action to stop killing ourselves.


What YouTube mess do I mean, exactly? Oh, just the fact that the Google-owned video company has pushed videos about the Parkland shooting survivors — specifically, a clip created to support the outrageous lie that one student is a "crisis actor" because he showed up in a news segment in California some months back — to the very top of its influential Trending section. 

Sure, Russian trolls are doing their level best to sow discord and doubt among Americans over this horrific school shooting, but they couldn't do half the damage that YouTube's algorithm and its lack of meaningful human oversight have already done to U.S. politics, not to mention life well beyond the political realm.

The company later took the video out of Trending and apologized: "Because the video contained footage from an authoritative news source, our system misclassified it," it said in a statement.

Yet the video remains active on the service; at the time of writing, it had garnered more than 200,000 views. It has also inserted itself into the political conversation: an aide to a Florida representative emailed the video to at least one journalist as proof of the rep's bogus "crisis actor" claims. (That aide was later fired.)

If this were an isolated incident, it might not be such a big deal. But you need only search for the student's name, David Hogg, on YouTube to see that the service is shot through with such problems. Take the first example to appear high up in my search results: an account from someone calling himself "Jake the Asshole" posts a badly shot CNN interview with the student, calls him a crisis actor, and gets a quick 17,000 views for his trouble.

By the way, Jake the Asshole also posts videos about how the moon landing was faked, the Earth is flat, and North Korea doesn't actually exist. 

Of course, in the YouTube ecosystem, Jake is just a minnow. Compare him to InfoWars founder and conspiracy theorist Alex Jones, who's infamous for claiming the Sandy Hook school shooting was a "false flag" and that its grieving parents were actors. As I write this, Jones is hosting a live YouTube show on how the Florida kids have been "groomed for major anti-gun propaganda push." He has 2.2 million subscribers.

But talk to a random employee at YouTube's San Bruno, Calif. headquarters, and you'll likely find someone who hates Jones' noise. The problem isn't personal taste; it's the algorithm pushing the equivalent of empty calories on the viewing public because the AI thinks that's what we want.

The company's original sin is the same as that of Google, Facebook, and Twitter: It doesn't believe that it's a media company as well as a technology company. If I had a nickel for every executive at these companies who, over the years, has simply smiled and insisted to me that they don't have a journalist's obligation to check their content, I'd have enough to hire them a fact-checker.

But while Facebook and Twitter have taken most of the heat for unwittingly hosting Russian bots and trolls during the 2016 election, YouTube arguably did more to swing the vote. 

Just take a look at the viewing figures on fake news videos in this recent Guardian study and try not to feel queasy. You know how YouTube queues up videos for you to watch next? Of the 643 recommended videos with a clear partisan slant served to people watching politics content in 2016, 551 were conspiracy-laden videos favoring Trump, while just 92 favored Clinton, a ratio of roughly six to one.

So much for the liberal media. A reminder: Trump's electoral college victory hinged on fewer than 80,000 votes across three states. These videos had millions of views apiece. America is a supremely visual culture.

YouTube's job, as it sees it, is to get as many eyeballs on as many videos as possible. It's as if a media tycoon founded a newspaper, invited every conspiracy theorist to contribute, and blithely waved away the notion that there should be any ethical responsibility to put forth the verifiable truth — because selling ads was all that mattered.


Oh, the right noises are made at the right times. After the Logan Paul suicide video fiasco, YouTube CEO Susan Wojcicki wrote in a blog post that she aimed to bring "the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018." What did that mean, exactly? Nobody knows, and Wojcicki hasn't offered any updates. 

But as YouTube's own apology for the Parkland conspiracy theory video makes clear, humans still aren't in charge of the situation. There doesn't even appear to be a team looking at the No. 1 trending video around the clock; surely that would take rather less than 10,000 people. But no, San Bruno still worships at the altar of the algorithm.

The result is the cesspool that much of YouTube has become. Creators find that the algorithm rewards them for shocking and unusual content. And many of them find there's nothing more shocking and unusual than a good conspiracy theory. Millions of viewers are waiting to lap up an easy explanation of the world that, for example, helps them avoid thinking about the tragedy of children being shot by assault weapons in schools. 

Most network news directors would have a hard time sleeping if they filled the public airwaves with this nonsense. For YouTube, it's just another revenue stream. Until the company actually clamps down on the bullshit that streams forth from Alex Jones and his ilk — if you don't want to ban the content altogether, perhaps a big red "fake news" tag? — we shouldn't believe a word they say to the contrary. 

Because it isn't really executives like Wojcicki who speak for YouTube as it stands. Its real spokesperson is an out-of-control AI, ever hungry for eyeballs, looking to get its hooks into your brain. Now there's a conspiracy theory that also has the benefit of being true. 
