Elon Musk reposted a deepfake video of Kamala Harris on X — it may violate his own platform's policy

  • Elon Musk reposted a deepfake video of Kamala Harris on X on Friday night.

  • The video, a parody of Harris' campaign ad, appears to have been digitally altered.

  • Musk's repost lacked context, potentially breaching X's rules on synthetic and manipulated media.

On Friday evening, Elon Musk reposted a deepfake video of Vice President Kamala Harris on X — a move that may violate his own platform's policy on synthetic and manipulated media, The New York Times reported.

The video was originally posted by the user @MrReaganUSA, who described the clip as a "parody" of Harris' first campaign ad since she became the presumptive Democratic Party nominee for the 2024 presidential election.

The clip appears to have been digitally altered to add a new voice-over that sounds like Harris.

In the video, the edited voice-over says, "I was selected because I am the ultimate diversity hire. I'm both a woman and a person of color, so if you criticize anything I say, you're both sexist and racist."

The deceptive voice-over also calls President Joe Biden senile and says Harris and Biden are "deep state" puppets.

In his repost of the clip, which has been viewed more than 117 million times, Musk failed to note that the video had been edited, writing only: "This is amazing 😂."

And that may run afoul of X's policy on synthetic and manipulated media, which states: "You may not share synthetic, manipulated, or out-of-context media that may deceive or confuse people and lead to harm ("misleading media")."

X says that for the company to take action and remove or label a post under that policy, the post must contain media that is "significantly and deceptively altered," that is "shared in a deceptive manner or with false context," or that is likely to cause "widespread confusion on public issues."

The company says it will consider factors including whether "any visual or auditory information (such as new video frames, overdubbed audio, or modified subtitles) has been added, edited, or removed that fundamentally changes the understanding, meaning, or context of the media."

The deepfake boom

Deepfakes use artificial intelligence to replace a person's likeness with that of someone else in video or audio footage.

Audio deepfakes are relatively simple to create but are difficult to detect, studies have found.

A number of politicians have already fallen victim to the technology, highlighting its potential to wreak havoc around elections.

In one clip that circulated on social media last year, Hillary Clinton appeared to give a surprise endorsement of Florida Governor Ron DeSantis. The clip was later revealed to be AI-generated, Reuters reported.

Biden was also the target of a deepfake following his announcement that he was dropping out of the 2024 presidential race.

A video on social media appeared to show the president hitting out at his critics and cursing them. But again, the footage was a deepfake, per the AFP news agency.
