Predators are using AI to sexually exploit children, FBI says. Here’s what we know

As the use of artificial intelligence grows, officials across the world are expressing their concerns about its use in the creation of child sex abuse material.

On Sept. 5, 54 attorneys general sent a letter to the U.S. Congress asking members to intervene.

“As Attorneys General of our respective States and territories, we have a deep and grave concern for the safety of the children within our respective jurisdictions,” the letter reads.

In June, the Federal Bureau of Investigation issued a public warning stating predators have been “creating synthetic content (commonly referred to as ‘deepfakes’) by manipulating benign photographs or videos to target victims.”

The warning says predators often take content from social media or other websites, then alter it to depict sexual acts.

“The FBI continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content. The photos or videos are then publicly circulated on social media or pornographic websites, for the purpose of harassing victims or sextortion schemes,” the FBI says.

The National Crime Agency, a law enforcement agency in the United Kingdom, says up to 830,000 adults in the U.K. pose “a sexual risk to children.”

“We assess that the viewing of these images – whether real or AI generated – materially increases the risk of offenders moving on to sexually abusing children themselves,” NCA Director General Graeme Biggar said in a report.

Biggar said the agency has begun to see hyper-realistic images and videos entirely created through artificial intelligence.

“The use of AI for child sexual abuse will make it harder for us to identify real children who need protecting, and further normalize abuse,” Biggar said in the report.

“One day in the near future, a child molester will be able to use AI to generate a deepfake video of the child down the street performing a sex act of their choosing,” Ohio Attorney General Dave Yost said in a news release. “The time to prevent this is now, before it happens.”

In April, a Canadian man was sentenced to more than three years in prison after being accused of using AI to create synthetic child pornography videos, according to CBC.

“The use of deepfake technology in criminal hands is chilling. The type of software allows crimes to be committed that could involve virtually every child in our communities,” Judge Benoit Gagnon said, CBC reported.

Officials are urging people to be careful with the content they post on social media and online.

“Although seemingly innocuous when posted or shared, the images and videos can provide malicious actors an abundant supply of content to exploit for criminal activity,” the FBI said in its report.

To help protect yourself and your child, the agency suggests regularly running online searches for your children's information, using privacy settings on social media, enabling multifactor authentication for security, and avoiding conversations with people you don't know online or whose accounts appear to have been hacked.

“A simple video excerpt of a child available on social media, or a video of children taken in a public place, could turn them into potential victims of child pornography,” Gagnon said, according to CBC.

The letter first asks Congress to establish an expert commission to study how AI is used to generate this content.

“Second, after considering the expert commission’s recommendations, Congress should act to deter and address child exploitation, such as by expanding existing restrictions on (child sexual abuse material) to explicitly cover AI-generated CSAM. This will ensure prosecutors have the tools they need to protect our children,” the letter says.
