AI could exacerbate biases in criminal justice, hiring, NJIT conference warns

Artificial intelligence is transforming the world, but scientists at a New Jersey Institute of Technology conference on Friday warned that it's not immune to historic biases against women and people of color.

Speakers at the 2023 Women Designing the Future Conference, all of them women, examined everything from the likelihood of AI becoming an overpowering force that takes over the world to more subtle concerns about fairness, accountability, and transparency.

In particular, they argued, faulty technology has already led to discrimination.

Julia Stoyanovich, an associate professor of computer science, told the NJIT conference about efforts to ensure that AI hiring software is not biased.

The daylong conference, organized by the university's Murray Center for Women in Technology, had as its central theme, "Artificial Intelligence/Real Human Lives: making technology work for all of us."

Facial recognition and faulty convictions

Renee Cummings, an artificial intelligence ethicist and professor at the University of Virginia's School of Data Science, said communities of color have been harmed by over-policing based on AI that's informed by faulty and discriminatory data.

She cited the case of Paterson resident Nijeer Parks, who sued Woodbridge Township in 2020 alleging that the town used faulty facial recognition technology that led to a warrant for his arrest. Parks, whose case is still pending, said he'd never visited Woodbridge.

AI systems are trained on past data, but "historic datasets" affected by past discrimination continue "to create a lot of the trauma, a lot of the tears, a lot of the pain and a lot of the problems that we are seeing with facial recognition technology and with the technologies that are being employed," Cummings said.

Rebecca Brown, director of policy at the Innocence Project, started off by showing a video of some of the wrongly convicted people the nonprofit group has helped to exonerate. She said researchers have found that about 4% of people in U.S. prisons are innocent. The Innocence Project knows of five men, all Black, who have been misidentified and arrested due to facial recognition technology, she added.

"What happens when the entry point into a wrongful conviction is not misconduct, per se, but instead is flawed technology?" Brown said.


Could biased AI cost you a job?

Julia Stoyanovich, an associate professor at New York University who specializes in computer science, engineering, and data science, laid out how artificial intelligence used by companies for hiring employees can be problematic when algorithms reflect bias against minorities and women.

Employers have embraced automated hiring software for efficiency, and supporters argue it can promote fairness by taking initial decisions on who gets interviewed out of the hands of humans. But skeptics worry AI systems based on past hiring practices will simply automate old racial and cultural biases.

Stoyanovich said there are efforts to fight back against this, pointing to a New York City law passed in 2021 that requires job candidates and employees to be notified when AI tools are used in hiring or promotion decisions. A similar proposal is working its way through New Jersey's Legislature.

Stoyanovich cautioned: "We need to keep our heads as humans when we think about what we allow AI to do that's too tricky for us."


Bringing communities of color into the conversation

Sarah Chu, a senior adviser on forensic science policy at the Innocence Project, said the group has been seeking a moratorium on surveillance technology in the criminal legal system to limit the harm to innocent people. She also said that communities affected by this technology must be included in the discussions held by the agencies that use it.

"Think about whose agenda is represented at the table when technology is being purchased, when it is being implemented. It is certainly not the communities that are impacted," Chu said.

Ricardo Kaulessar is a culture reporter for the USA TODAY Network's Atlantic Region How We Live team.

Email: kaulessar@northjersey.com

Twitter: @ricardokaul

This article originally appeared on NorthJersey.com: Artificial intelligence, race, gender subject of NJIT 2023 conference