Is artificial intelligence making racial profiling worse?

REVERB is a new documentary series from CBSN Originals. Watch the first episode, "Racial Profiling 2.0," in the video player above.

Throughout its history, the LAPD has found itself embroiled in controversy over racially biased policing. In 1992, outrage over police violence and the acquittal of four officers who had beaten black motorist Rodney King culminated in riots that killed more than 50 people. Many reforms have been instituted in the decades since, but racial bias in LA law enforcement continues to raise concerns. A 2019 report found that the LAPD pulled over black drivers four times as often as white drivers, and Latino drivers three times as often as whites, even though white drivers were more likely to be found with weapons, drugs or other contraband.

New technological tools employed by the department could be aggravating the problem. In an effort to further reduce crime, the LAPD has turned to big data.

Traditionally, police have stepped in to enforce the law after a crime has occurred, but advances in artificial intelligence have helped create what are called "predictive policing" programs. These algorithm-driven systems analyze crime data to find patterns, aiming to predict where crimes will be committed, or even by whom. The idea is to stop crime before it happens by directing police to the locations or people the data flags, on the premise that hard numbers are free of human bias. In the last decade, some of the largest police departments in the country have turned to predictive policing to reduce crime in their communities, and the LAPD has helped pioneer the trend.
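To make the concept concrete, here is a minimal, purely hypothetical sketch of how a place-based prediction system can work. It is not the LAPD's software or any vendor's product; the grid cells, incident list, and decay weight are invented for illustration. The toy model scores each area by its recent recorded incidents and ranks the highest-scoring areas for extra patrols.

```python
# Illustrative sketch only: a toy, place-based "predictive policing" scorer.
# This is NOT any department's actual system; the grid cells, incidents,
# and decay rate below are hypothetical.
from collections import defaultdict

# Hypothetical historical incidents: (grid_cell, days_ago)
past_incidents = [
    ("cell_A", 1), ("cell_A", 3), ("cell_A", 10),
    ("cell_B", 2), ("cell_B", 30),
    ("cell_C", 45),
]

def score_cells(incidents, decay=0.9):
    """Score each grid cell by exponentially decayed incident counts,
    so recent incidents count more than older ones."""
    scores = defaultdict(float)
    for cell, days_ago in incidents:
        scores[cell] += decay ** days_ago
    return scores

def rank_for_patrol(incidents, top_n=2):
    """Return the top-N cells the toy model would flag for extra patrols."""
    scores = score_cells(incidents)
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

if __name__ == "__main__":
    print(rank_for_patrol(past_incidents))  # ['cell_A', 'cell_B']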
