A college security system flagged theater props as firearms. Does AI gun detection work?

Earlier this year, in an office building thousands of miles away from Rochester, a technician with the security firm ZeroEyes received an alert: There were two guns inside an auditorium on a college campus in Rochester.

Artificial intelligence software connected to St. John Fisher University's security cameras was scanning video feeds second by second, looking for firearms. It flagged the weapons as a threat and sent a screenshot to the technician to verify.

The tech saw a person with a gun and dispatched law enforcement to the university.

Inside Cleary Family Auditorium at Kearney Hall, community members were rehearsing for an upcoming play. The guns they were holding were theater props.

ZeroEyes software didn’t know the difference.

St. John Fisher University went into a brief lockdown June 18 after AI software in the school's surveillance system mistook prop guns being used in a theater rehearsal for real guns. A view of Kearney Hall, the school's original building.

Technology is changing the way our communities approach public safety. Drones act as eyes in the sky, in some areas scoping out crime scenes before law enforcement arrives. New mapping technology is helping police connect surveillance cameras from across the county into a single cloud-based grid for easy access. AI software can be trained to read license plates or search through camera feeds for various physical descriptors.

And it is forcing humans to change the way they respond to that technology in turn.


St. John Fisher University goes into lockdown over prop guns

“If we have a call for someone with a gun on campus, we’re going to respond in force,” Monroe County Sheriff’s Office spokesman Deputy Brendan Hurley said. “We’re going to treat it as if it’s real until we can verify that it wasn’t.”

In this case, university officials were able to make contact with the auditorium and identify the fake guns before police arrived ― defusing what could otherwise have been a dangerous confrontation.

The software company ZeroEyes uses artificial intelligence to detect firearms in security or surveillance feeds at schools, government buildings and commercial businesses. After AI flags a gun, a ZeroEyes employee must verify the detection before alerting police.

St. John Fisher declined to make its security personnel available for an interview about what protocols are in place to assess an alert from the AI technology before they respond.

“Although it turned out that it was not a lethal threat, we operate with a cautious approach to gun detection,” a spokesperson said in an email. “Our partner’s quick notification, actionable intelligence, and situational awareness is crucial when faced with an image that so closely resembles a real gun.”

The university went into lockdown for about 15 minutes.

Deputies from the sheriff’s office still responded to clear the scene. Hurley said it was the police agency’s first time interacting with ZeroEyes and that communication with St. John Fisher was crucial in preparing its response.

“We prepare for the worst-case scenario,” he said. “But again, we always have to remember in the back of our heads, this may not be what it’s called in as. That’s the delicate balance of policing. We’re trying to respond ready, in case it’s something very bad, but also don’t overreact.”

What is ZeroEyes?

An official from ZeroEyes said the human verification piece of its program was intentionally designed to prevent overreliance on artificial intelligence technology.

The AI software cannot send an alert to the company’s customers on its own. Technicians, who are military veterans and former law enforcement personnel, must review a screenshot of whatever image triggered the program before clicking a button to broadcast the number of purported guns and their location.

The company claims the entire process ― AI detection to human verification to notification ― happens within three to five seconds.
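For readers curious how such a human-in-the-loop workflow fits together, here is a minimal, hypothetical sketch in Python. It is not ZeroEyes' actual software; every name and step below is invented to mirror the process described above, where the model can only flag a frame and a person must confirm it before any alert goes out.

```python
# Illustrative only: a hypothetical human-in-the-loop gun-detection pipeline.
# None of these names come from ZeroEyes; they simply mirror the workflow the
# article describes: AI flags a frame, a human reviews a screenshot, and only
# the human's confirmation triggers an alert to the customer and police.

from dataclasses import dataclass
import time


@dataclass
class Detection:
    camera_id: str
    screenshot_path: str      # frame that triggered the model
    suspected_gun_count: int  # how many firearms the model thinks it sees
    timestamp: float


def model_flags_frame(camera_id: str, frame_path: str, gun_count: int) -> Detection:
    """Stand-in for the AI step: the model cannot alert anyone on its own."""
    return Detection(camera_id, frame_path, gun_count, time.time())


def human_review(detection: Detection) -> bool:
    """Stand-in for the technician step: a person inspects the screenshot.

    In a real deployment this would be a review console, not a function call.
    Here we simply simulate the technician confirming the detection.
    """
    print(f"Reviewing {detection.screenshot_path} from camera {detection.camera_id}...")
    return True  # technician clicks "confirm"


def notify_customer(detection: Detection) -> None:
    """Runs only after human confirmation -- the model never reaches this directly."""
    elapsed = time.time() - detection.timestamp
    print(
        f"ALERT: {detection.suspected_gun_count} suspected firearm(s) near camera "
        f"{detection.camera_id} ({elapsed:.1f}s after detection)"
    )


if __name__ == "__main__":
    detection = model_flags_frame("auditorium-2", "frame_1042.jpg", gun_count=2)
    if human_review(detection):          # human verification gates the alert
        notify_customer(detection)
```

The design choice the company emphasizes is visible in the structure: the notification step is reachable only through the human confirmation, never directly from the model.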

St. John Fisher University went into a brief lockdown June 18 after AI software in the school’s surveillance system mistook prop guns being used in a theater rehearsal for real guns. A security camera sits above the corner of a residence hall.

Sam Alaimo, one of the company’s founders, said ZeroEyes does not consider what happened at St. John Fisher a false positive.

“Two prop guns were painted black and they looked exactly like real guns,” Alaimo said. “When we have a situation where there is a gun in hand, we are going to dispatch it. We err on the side of safety.”

The AI tool was trained on more than one million images of firearms in different environments. But it cannot differentiate between real guns and fake ones ― and the company doesn’t want it to. Some real guns look like toy guns, Alaimo said.

The technology cannot detect firearms that are concealed or in a holster.

Two SEPTA Transit Police officers patrol a Center City regional stop. SEPTA plans to implement the ZeroEyes AI gun detection system on its Broad Street and Market-Frankford lines.

Does AI gun detection work?

ZeroEyes was founded by a group of veterans after the 2018 mass shooting at Marjory Stoneman Douglas High School in Parkland, Florida. It’s now operating in 42 states, inside schools, government buildings, religious institutions and private businesses.

As mass shootings proliferate across the country, ZeroEyes pitches itself as a proactive solution.

Several states have started to offer grants to public schools to implement the software. At least four districts in Western New York have partnered with ZeroEyes, mostly in the Buffalo area.

But there is little to no peer-reviewed research showing that AI gun detection is effective at preventing shootings. And other programs that use AI to scan backpacks and purses for weapons have missed them entirely or mistaken binders and spiral notebooks for guns and knives.

Jake Wiener, an attorney with the Electronic Privacy Information Center who specializes in surveillance oversight, said when AI weapon detection programs raise false alarms, they end up causing harm rather than preventing it.

"In that case, the system itself is creating an incredible amount of danger," he said. "Priming police for an active-shooter situation that does not exist and subjecting students and the public to unnecessary fear, trauma and the risk of police shootings."

Wiener called these programs "security theater exploiting legitimate fears," taxing already cash-strapped school systems and public venues without providing true solutions. At best, he said systems like ZeroEyes might facilitate a faster response to a scene and provide some sort of harm reduction ― but they shouldn't be seen as a tool to prevent violence in the first place.

"None of these systems actually prevent shootings by addressing the root causes of harm like emotional distress and easy access to firearms," he said.

Jason Stoddard, chairperson of the National Council of School Safety Directors, told The Associated Press earlier this year the money put toward the pricey technology may be better spent on other school safety efforts, like electronic door locks, shatter-resistant windows, communications systems and staff.

“The artificial-intelligence-driven weapons detection is absolutely wonderful,” he said. “But it’s probably not the priority that 95% of the schools in the United States need right now.”

What's next in AI surveillance?

ZeroEyes argues it is not a surveillance system. It does not include many of the features that commonly raise concerns with privacy advocates: Technicians don’t have access to a live video feed, the software is not trained in facial recognition and ZeroEyes cannot search by biometric data.

“We just want to know when there’s a gun,” Alaimo said.

The ZeroEyes cofounder said the company can’t quantify exactly how many shootings it was able to prevent, but he challenged skeptics to consider the alternative.

“We’ve had thousands of detections, many of them real, many of them detections like this which appear to be guns but are, in fact, not actual guns loaded with ammunition,” Alaimo said. “So, I understand the concern there with the dispatch of what ended up not being a real gun. But I do encourage those who ask about that to think about the real scenarios where we do get the real gun.”

But Wiener, the attorney from EPIC, believes as more AI tools emerge, the next phase of surveillance will involve combining disparate data from these different sources into a single framework that attempts to "make more predictions about how you might behave and whether you are 'dangerous.'"

"The advantage that AI offers in terms of surveillance isn't in accuracy," Wiener said. "It's in ease of use. It's now easier to analyze large amounts of data and pull out predictions, analyses, etc., but there's no real guarantee that the quality of predictions will get better... The upshot of these systems is continued harm and discrimination with a veneer of objective analysis."

— The Democrat and Chronicle is examining surveillance efforts in western New York as part of an investigative project called "Eyes on Us." Do you have questions about police or government surveillance? Email us at kcanne@gannett.com and we will try to answer them in an upcoming series of stories.

— Kayla Canne reports on community justice and safety efforts for the Democrat and Chronicle. Follow her on Twitter @kaylacanne and @bykaylacanne on Instagram. Get in touch at kcanne@gannett.com.

This article originally appeared on Rochester Democrat and Chronicle: Does AI gun detection work? College goes into lockdown over prop guns