Will New York’s AI Weapons Detector Evolv Solve Problems or Create New Ones?

31 Jul 2024

The City of New York recently announced it would begin deploying AI weapons detectors made by Evolv in the city’s subways to screen passengers.

“I am telling them they need to evolve and get it out, because I think it’s good technology,” Mayor Eric Adams said at his weekly press availability on Tuesday. “I’m really excited about this, just the potentiality that we could identify someone carrying a gun before they enter the system.”

But the city’s residents say they still don’t feel safe.

As described in an NBC report, the scanners, about 6 feet tall, feature the logo of the city's police department and a multicolor light display. When a weapon is detected, an alert is sent to a tablet monitored by a pair of NYPD officers. The system is not supposed to alert on everyday items such as phones and laptops, though a reporter's iPad case set it off on Friday.

Civil liberties advocates also protested against the scanners. The New York Civil Liberties Union and the Legal Aid Society said they would sue the city if the technology is rolled out on a wide scale.

The NYCLU believes the technology will subject subway riders to unconstitutional searches by the police.

So far, the system has been installed in only a handful of locations.

But with 472 subway stations throughout New York City, the most of any system in the world, such sparse coverage means the vast majority of the network goes unscreened, undercutting the claim that the detectors are keeping the subways safe.

Though there have been high-profile incidents, like a 2022 shooting on a Brooklyn train that left 10 people wounded, crime in the New York City subway system has fallen in recent years.

In other words, violent crime in the system is rare overall, and New York’s subway is generally about as safe as any other public place.

As of July 21, subway crime is down 8%, compared with the same period in 2023, according to police data. Last year, there were five killings in the subway, down from 10 in 2022, according to police.

Supporters argue that AI weapons detectors can identify potential threats more quickly and accurately than human personnel: unlike human guards, the systems do not suffer from fatigue, distraction, or bias, promising consistent performance and vigilance.

On the flip side, the same detectors can misidentify harmless objects as weapons (false positives) or fail to detect actual threats (false negatives), potentially causing security lapses or unnecessary panic.
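There is a base-rate problem lurking in that tradeoff: when genuine threats are rare, even a modest false-positive rate means nearly all alerts are false alarms. The back-of-the-envelope sketch below uses assumed, illustrative numbers only, not Evolv's published performance figures or actual MTA ridership:

```python
# Illustrative base-rate arithmetic for a rare-threat screening system.
# ALL numbers below are assumptions chosen for the example.

daily_riders = 1_000_000      # assumed daily screenings
armed_riders = 10             # assumed riders actually carrying a weapon
false_positive_rate = 0.01    # assumed: 1% of harmless riders get flagged
detection_rate = 0.95         # assumed: 95% of real weapons are caught

false_alarms = (daily_riders - armed_riders) * false_positive_rate
true_hits = armed_riders * detection_rate

# Fraction of all alerts that actually point at a weapon
precision = true_hits / (true_hits + false_alarms)

print(f"False alarms per day: {false_alarms:.0f}")
print(f"Real detections per day: {true_hits:.1f}")
print(f"Share of alerts that are real threats: {precision:.2%}")
```

Under these assumed numbers, officers would field roughly ten thousand false alarms a day against a handful of real detections, so well under one percent of alerts would involve an actual weapon. The exact figures depend entirely on the assumed rates, but the shape of the problem does not.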

Critics say Evolv is problematic from a legal point of view.

The detectors may also be susceptible to hacking and other cyberattacks, which could compromise the system’s effectiveness.

In addition, the initial setup and deployment of the detectors will likely be expensive, requiring significant investment in technology and infrastructure, all of it paid out of a city budget funded by taxpayers whether they like it or not.

As NYCLU notes, the use of AI in security settings raises ethical and legal questions about accountability, especially in cases where the system makes errors or is involved in incidents resulting in harm.

Public acceptance of AI weapons detectors is likely to be mixed, with some riders uneasy about relying on machines for safety and security decisions.

Hopefully, crime rates stay low and security levels remain high – with or without Evolv.