Is Facebook’s political ad stance too little, too late?

Facebook CEO Mark Zuckerberg announced changes the social media giant will make as the 2020 presidential election approaches. Facebook won't accept new political ads in the week before the election, and it will monitor content that may 'delegitimize' the election. The Final Round panel breaks down the details.

Video Transcript

MYLES UDLAND: All right, let's turn our attention now to what's going on in the world of Facebook-- a stock that's down 4.6% today-- off more than the market. But as we mentioned just a few minutes ago, today is exactly two months from the presidential election. And certainly, folks would like to know what Facebook is doing to prevent a repeat of 2016.

Now, they've outlined a series of steps, including their pledge not to accept new political ads in the week before the election. They're going to be adding labels to content that they are deeming-- they're not using the term "fake news," but basically things that are untruthful, but framed through either an advertising or a news media type lens.

And, Melody Hahm, it's not-- I guess I'm always a little bit confused or surprised when we see these sorts of announcements from Facebook, because it's like, one, well, what have you been doing in the interim? And two, I thought they were already doing this kind of stuff, but apparently not. And I'm also not sure what the one-week-before-the-election demarcation really means. I mean, so now it's fine if a candidate for any office wants to say whatever they want, but seven weeks from now, the firehose shuts down.

MELODY HAHM: Oh, Myles, it does absolutely nothing. Pandora's box has long been opened. We've talked about this before. Zuckerberg has been aware of this sort of unsavory misinformation and disinformation circulating on all of the company's platforms, including Facebook, Messenger, WhatsApp, and Instagram. And to your point, this is such a Band-Aid. We needed a full-body operation.

I think what's most disturbing to me is that in a blog post, Zuckerberg wrote, "This is not business as usual." When was it business as usual? Since the Cambridge Analytica scandal, it shouldn't have been business as usual. We knew the 2020 election was coming. This has been the number one calendar event over the last four years. So for this to be this kind of a paltry response, in my mind, is pretty pathetic. I'll just lay it out there.

I think one thing that is interesting, in addition to, you know, the minor labels and the sort of reminders and redirecting people to the official sites, which are all sort of cosmetic additions, right-- it's a very superficial approach. People who are getting their information and their news on Facebook are not likely going to be clicking that link to redirect themselves to that official site. They would have gone to that official site in the first place if getting factual information was their goal.

So one thing that is particularly interesting to me is that they've seen a lot of resurgence in this idea that you shouldn't go vote because you can catch COVID. And that's been kind of an ongoing trend with all these posts, a lot of people saying, you know, you should avoid going there. Perhaps that is echoing some of Trump's sentiment. And Zuckerberg and his team did say that they would be pulling down all of those posts as well.

But I think the larger issue at hand is we know they brought in an influx of content moderators. Of course, their treatment and their workloads have been exhausting and mentally excruciating. But that's not even nearly enough. The algorithm, the AI system that they had heralded as kind of revolutionary in pulling down bad content-- it's very clear it hasn't worked. And even combined with the human side of things, it's clearly not enough support to really stop this kind of insidious misinformation campaign that's been happening on the platform.

MYLES UDLAND: Yeah, I think it's interesting if we kind of go forward to, let's say, 2024, right-- so 2016, Facebook is basically, like, anything can happen on the platform. We don't even know what goes on here. It's all fine. 2020, they say, well, we're going to label some things, and we've got the moderators and the AI, and so now we're not responsible for the content, but we are trying to clean up the content.

I feel like 2024 is this moment where Facebook either says there will be no news on our platform of any kind-- no sharing of third party data-- or we are just going to have to be solely responsible for everything that does get posted. And you, normal citizen, cannot post news information or whatever it is anymore. All you can do is share from, let's say, a third party verified account, so on and so forth.

Because, clearly, whatever has happened in the last four years is essentially changing nothing about how the platform looks, feels, and behaves ahead of an event like this. And I'm sure that, you know, Zuckerberg knows that, but he's just very good at saying things that don't really mean anything, but sounding like they do. But hey, you know, you can either be liked or you can be worth $100 billion. You probably can't-- you probably can't have both.
