Facebook, Twitter defend election policies

Yahoo Finance’s Brian Sozzi, Myles Udland, and Julie Hyman speak with Prevailion CEO Karim Hijazi about yesterday's big tech hearing.

Video Transcript

MYLES UDLAND: Yesterday we saw Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Capitol Hill testifying before lawmakers on election integrity, as well as facing some questions about Section 230 regulation. Joining us now to continue that conversation is Karim Hijazi, CEO of the cybersecurity firm Prevailion. Karim, great to speak with you this morning.

So yesterday, kind of a jumbled sort of hearing. I know it was about election security, and Section 230 came up there as well. Let's maybe take election concerns and kind of the broad integrity, I suppose we could say, of the state of the internet in the US, if that's a way to think about things. Are you as concerned as some lawmakers are about the kinds of interference that we've seen come up in elections, the ways in which regular Americans are finding their internet experience essentially manipulated by nefarious actors, be they based in the US or elsewhere?

KARIM HIJAZI: Yeah, it's certainly a serious concern. I think we've been talking about this all year, you know, prepping for the actual election and for when the rubber really meets the road, so to speak. And I think that it's not changing. I mean, the reality is that social media is a fact of life now. You know, I don't think it's going away, and I think with all the talk of regulating it in some ways, if anything, that will foster other challenges.

So, for example, you know, one of the AI suggestions people have made about this is to say, hey, let's take this out of the hands of human analysts who have a bias and give it to an AI apparatus. That has its inherent challenges too, because those systems can be manipulated. It's all about what you give it. And if you educate it with something that's biased by volume, it will slant in that direction.

So I think what the adversaries have capitalized on, you know, foreign or domestic, is that they've realized how these systems work. And if you are able to work the curation systems that these social-media platforms need for their revenue generation, that's really what's so complicated here: the protection components are directly related in a lot of ways to the curation engines that they have to use to make money.

So to just get rid of it altogether is untenable. It's a really complicated answer, and unfortunately it's just fodder for the adversary in a big way.

MYLES UDLAND: Yeah, and I guess, Karim, maybe I should ask it this way. I think in kind of the media and political circles, right, everyone says election interference, right? That would seem to be the primary concern. But, I mean, you're talking about the way the algorithm actually works on the site, the way it's actually coded and the loops it's actually designed to respond to. As someone in the space, what are you most concerned about, I suppose, with the systems that are in place, be they on social-media platforms or just the kinds of things that we probably encounter in our regular internet usage that we don't really understand or we don't even know that we're sort of being--

KARIM HIJAZI: Sure.

MYLES UDLAND: --programmed for certain behaviors?

KARIM HIJAZI: Yeah, definitely, Myles. So it's absolutely multifaceted. That's the hardest part about this. You know, whatever your feelings are about, you know, Chris Krebs, for example, CISA had a really hard job. You know, they continue to have a very hard job because the vectors of attack are numerous and wide open. So to your point, general internet access, people working from home, social-media influencing campaigns, hacking actual environments and fundamentally changing things are all, you know, there for the taking right now. So you're absolutely right. I mean, the issue is this sort of combined, collective effort.

The harder ones to navigate are the ones that seem like they're using the systems that are in place, by design, to work that way, right? So it's hard to differentiate between what looks like it might be fabricated versus what's real. And if you're building a curation engine to feed someone something and they're using the system to its own demise, that's the hard part.

The hacking stuff, which is really where we live, is a little bit more black and white, a little more binary, right? We can say, hey, this was manipulated, this was fundamentally changed, and therefore it's easy to kind of call things out.

The ones that are sort of using the system against itself, those are the harder ones here that, you know, even those of us in the industry have a hard time figuring out a solution for.

BRIAN SOZZI: Karim, do you think that regulation of any kind against these social-media companies would ultimately stifle innovation to the point where fixing misinformation on the platforms just simply won't get done?

KARIM HIJAZI: Absolutely, because the subjective nature of this is the problem. The provenance of where the data comes from is really the key here, and that's at the heart of this whole Section 230 debate. There's really no clear understanding, I think, about how to figure out what the provenance of some of this is, and the amount of effort and research that goes into that is really difficult, because you're having to go and determine whether what is being said is accurate and factual versus an opinion or slant on something. And then, you know, how do you decide whether something gets taken down when only part of it is incorrect?

And then do you sort of say-- you know, there are algorithms for that, but they're going to err toward curating it and giving it to the viewership that has been looking for that type of information anyway. So to carve it out entirely does add a layer of censorship that is beyond anything we've ever seen, and it probably could disrupt the revenue streams of these organizations, which is why they're pushing back.