Hawaii legislators target deepfake political messaging

Jan. 30—In an effort to keep artificial intelligence — or deepfake — messaging out of Hawaii elections, two bills would ban false information about a candidate or party, and a third would make it a petty misdemeanor to distribute — or conspire to distribute — fake political messages.

Senate Bill 2687 would create a new petty misdemeanor for "distributing, or entering into an agreement with another person to distribute, materially deceptive media unless the media contains a disclaimer" that could rise to a Class C felony for anyone who intends "to cause violence or bodily harm."

House Bill 1766 and its companion, Senate Bill 2396, would leave investigations and fines up to the Hawaii State Ethics Commission, while the courts also could award damages to anyone harmed by fake messaging distributed within 90 days of a primary or general election.

Political deepfake messages are already banned in California, Michigan, Minnesota, Texas, Washington and Wisconsin, according to HB 1766 and SB 2396.

State Rep. Trisha La Chica (D, Waipio-Mililani) authored HB 1766 to keep deepfakes out of Hawaii's elections, which in November's general election will likely include a rematch between President Joe Biden and former President Donald Trump.

Especially for small House and Senate races, "we don't want deepfakes being politically weaponized in our elections," La Chica told the Honolulu Star-Advertiser on Monday.

A winning House or Senate candidate often needs only 3,000 votes or so.

And the outcome could be determined by "a handful of votes that could very easily be swayed by deepfake messaging in the critical hours before the vote," La Chica said. "There's no place in Hawaii for that."

State Sen. Karl Rhoads (D, Nuuanu-Downtown-Iwilei) introduced SB 2687 and SB 2396 out of concern for the sophistication of "the manipulations you can do with AI these days that can make it convincingly look like someone is saying something they never said or doing something they never did. There's got to be some penalty if you get caught."

Especially for the November presidential election, Russia has a vested interest in Trump winning because America's support for Ukraine remains at stake, Rhoads said.

"The Russians have every incentive to work that angle as strongly as they can," Rhoads said. "If Trump wins, support for Ukraine will likely diminish to zero. They have a lot of stake and are willing to put a lot into that effort."

Some campaigns in Europe and on the mainland have included false messaging generated by artificial intelligence and fake images depicting events that never happened, such as Florida Gov. Ron DeSantis' use of a fake video of Trump hugging former White House medical adviser Anthony Fauci, said Colin Moore, who teaches public policy at the University of Hawaii and is an associate professor at the University of Hawaii Economic Research Organization.

In Slovakia, Moore said, a deepfake video depicted a party leader purportedly talking about rigging an election.

So he called HB 1766 "a good bill."

"Policymakers are just starting to understand how deep fakes and synthetics are used in electioneering to manipulate the messaging," Moore said. "We have a responsibility ... to protect the integrity of elections."

For national and other high-profile elections conducted in Hawaii, Moore said, HB 1766 and SB 2396 would mean "having a law on the books would make any mainstream political organization or group very, very cautious about using deepfakes."

SB 2687, which would criminalize distribution of deepfake political messaging, states that artificial intelligence has "dangerous consequences if applied maliciously. For example, the use of deepfakes or generative AI in elections can be a powerful tool used to spread disinformation and misinformation, which can increase political tensions and result in electoral-related conflict and violence."

Exceptions would be made for deepfake political messaging that "includes a disclaimer (running throughout the entirety of a video) informing the viewer that the media has been manipulated by technical means and depicts appearance, speech, or conduct that did not occur."

Spreading lies and misinformation is hardly new in American politics "and dates back to the very first elections in the country," Moore said. "Politics being a rough sport, it's nothing new. But now it's the volume of it, and the technology has made it more widespread."

As the 2024 election year heats up — especially with Trump and Biden likely to square off again — Moore said there "absolutely ... will be tremendous amounts of misinformation."

"This is just the world we live in now," he said. "Every election from here on out is going to be full of misinformation and disinformation on social media."

But voters have more control over the spread of political misinformation than they believe, Moore said.

"If you see it, think very carefully before you like or retweet something," Moore said. "People don't think about how that's how it spreads."