Courts use a ‘Minority Report’ crime prediction algorithm, and it’s incredibly racist

I was somewhat surprised to learn that courts use software to predict the likelihood of criminals reoffending. But I was far less surprised to learn that the computer, much like the system it serves, seems to hate black people.

ProPublica has a new report that shines a light on the system used by Broward County, Florida. Those courts use a system made by Northpointe, a for-profit company. Various factors about a defendant are fed into an algorithm, which spits out a score reflecting that person's chance of reoffending within two years.


Those scores are then used by judges to help with everything from bond amounts to sentencing. It's kind of like a credit score, only worse-informed, and used to make decisions about people's liberty, rather than car insurance. It's currently used in states including Arizona, Colorado, Delaware, Kentucky, Louisiana, Oklahoma, Virginia, Washington, and Wisconsin.

To measure the effectiveness of Northpointe's algorithm in the real world, ProPublica obtained the risk scores of 7,000 people arrested in Broward County, and tracked them for the next two years.

Surprise result! The computer was "remarkably unreliable" in predicting violent crimes: only 20 percent of people predicted to commit violent crimes actually did so. That figure only rises to 61 percent when considering all crimes.
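For the curious, that kind of check is simple to express in code. The sketch below is a minimal Python illustration of the idea, not ProPublica's actual analysis; the file name, column names, and the cutoff used for "high risk" are all assumptions made up for this example.

```python
# Sketch: how often do "high risk" predictions actually pan out?
# Hypothetical file and column names; the real dataset and methodology differ in detail.
import pandas as pd

df = pd.read_csv("risk_scores.csv")  # one row per defendant, tracked for two years

# Treat a risk score of 5 or higher (on a 1-10 scale) as a "high risk" prediction
flagged = df[df["decile_score"] >= 5]

# Precision: of everyone flagged as high risk, what fraction actually reoffended?
precision = flagged["reoffended_within_two_years"].mean()
print(f"{precision:.0%} of people flagged as high risk actually reoffended")
```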

What makes the report worse -- and yes, there is something worse than computers using flawed methodology to lock people up -- is the racial bias. ProPublica found that the algorithm falsely flagged black defendants as likely future criminals at nearly twice the rate it did white defendants.

On the flip side, white defendants were mistakenly labelled as "low risk" more often than black defendants.
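Those two error rates are what the bias finding hangs on: how often each group is wrongly flagged as high risk, and how often each group is wrongly waved through as low risk. Here's a rough sketch of how that comparison could be computed, again using invented column names rather than the real dataset's schema:

```python
# Sketch: comparing error rates across groups, in the spirit of ProPublica's analysis.
# Column names and the high-risk cutoff are assumptions for illustration only.
import pandas as pd

df = pd.read_csv("risk_scores.csv")
df["flagged_high_risk"] = df["decile_score"] >= 5

for race, group in df.groupby("race"):
    did_not_reoffend = group[group["reoffended_within_two_years"] == 0]
    reoffended = group[group["reoffended_within_two_years"] == 1]

    # False positive rate: labelled high risk, but did not reoffend
    fpr = did_not_reoffend["flagged_high_risk"].mean()
    # False negative rate: labelled low risk, but did reoffend
    fnr = (~reoffended["flagged_high_risk"]).mean()

    print(f"{race}: false positive rate {fpr:.0%}, false negative rate {fnr:.0%}")
```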

Northpointe disputed ProPublica's findings and wouldn't release the exact algorithm it uses to compute risk scores. So, in conclusion: a computer is incorrectly classifying people as high or low risk, using a formula its maker won't disclose, and doing so in a way that skews sharply against black defendants. And courts are still using those scores to influence judges' decisions. Right.


This article was originally published on BGR.com