Law enforcement using AI to solve crimes, including 2020 Nashville bombing

NASHVILLE, Tenn. (WKRN) — As technology evolves, so do the criminals who use it to exploit people. Artificial intelligence is no different: criminals are using it. But so is law enforcement, in an effort to stay one step ahead.

It was during the quiet early hours of Christmas morning 2020 when a bomb went off downtown, killing the bomber and devastating buildings along Second Avenue and Commerce Street. In the initial days and weeks, artificial intelligence played a critical role in the investigation.

“One of the great advantages of AI for us is video review,” said FBI Supervisory Special Agent Kevin Varpness.

Varpness works in the Nashville FBI office. In past investigations, his team would have had to go through thousands of hours of security camera video from a case like that bombing, carefully combing it for clues about the suspect and their motivations. Not so with AI.


“Now we can have machines go through it like that,” said Varpness, snapping his fingers. “So it ingests large volumes of this video and is able to pick out certain parts more relevant than others. That saves vast swaths of time.”

This form of AI has helped the Feds investigate the Las Vegas mass shooting, the deadliest in American history, as well as the Austin serial bombings, which terrorized Texas for weeks. But it's not just criminals at home who can exploit AI; Varpness says he's deeply concerned about foreign actors.

“I’m highly concerned about China’s use of AI. What they’re currently doing even with their surveillance system, and what I know they’re trying to do with big data.”

He says China has a number of tactics for acquiring American artificial intelligence. One is as simple as doing business in the U.S.: a Chinese company can purchase an American tech startup, obtain its AI technology, and that whole transaction is perfectly legal.

“The People’s Republic of China is the greatest threat to the United States right now.”


As open-source AI platforms expand, the fear is that criminals can also use them to target individuals, including children. Take sextortion: a criminal uses a nude picture or video of a child to extract money from that young victim. With a new breed of AI-generated deepfakes, that photograph doesn't even have to be real.

“If they get your picture, they can put it on another person’s body and the AI could make that look real. But, now they even have AI out there that can artificially generate a body, so it’s definitely a real problem,” said Varpness. “It’s a terrible situation to be in.”

Here's a website you'll want to know if you or a loved one is ever a victim: www.IC3.gov. Submit the details of the crime, and law enforcement will review it. This portal also helps investigators connect the dots and see whether they have a larger case.
