New gun-detection system uses Wi-Fi to sense concealed weapons
After testing the technology last spring, a Pittsburgh-area school district this week began deploying a weapons-detection system across all of its buildings that uses artificial intelligence and cutting-edge Wi-Fi sensing to spot guns, even when they are concealed.
Administrators at the Chartiers Valley School District, southwest of Pittsburgh, said the system will help keep students safe as they come through school entrances. District Superintendent Daniel Castagna said his district is the first to have this type of technology on campus after signing a five-year contract with the company last month.
The technology is being installed across the district’s four school buildings: two elementary schools, one middle school and one high school. The district plans to monitor two entrances per building, for a total of eight devices.
But civil liberties groups warn the new technology could normalize routine surveillance and do the opposite of protecting students. They also worry the software could misidentify threats and expose data without consent.
How it works
The technology, which the company refers to as Wi-AI (a combination of Wi-Fi and AI), was developed by CurvePoint.ai, a startup spun out of Carnegie Mellon University. The company’s proprietary device hooks into existing Wi-Fi networks and, in addition to sending out its own Wi-Fi signals, uses the radio waves to create a 3D field. The waves interact differently with different objects, allowing the technology to “see” through jackets, bags and other materials by penetrating them in search of metal items.
The district hopes the system will produce fewer false alarms than traditional metal detectors. The company has layered an AI algorithm into the technology that analyzes the 3D field, ingests how the Wi-Fi signals bounce off or refract through people and objects, and infers the shapes and materials of the items it detects.
“We build our devices that are meant to send Wi-Fi signals in the direction of where we’re trying to detect something, and so we try to measure all the different iterations of it, bouncing off different surfaces and going through the person, through the bag,” said Devin Ulam, CurvePoint.ai’s chief technology officer. “Then we can, depending on what’s in there, we can train a model to understand what a gun looks like in different circumstances. … And so putting all that together, we get to a solution that ultimately can determine if there’s a threat or a concealed threat within the perspective of the device.”
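CurvePoint.ai has not published technical details of its model. As a rough illustration of the general approach Ulam describes, Wi-Fi sensing systems typically treat the measured channel response (how signal strength varies across the channel's subcarriers after reflecting off people and objects) as a feature vector and classify it against patterns learned from labeled examples. The sketch below is hypothetical and simplified: the synthetic data, the "metal boosts reflected energy" assumption and the nearest-centroid classifier are stand-ins, not the company's method.

```python
# Hypothetical sketch of Wi-Fi-sensing classification -- NOT CurvePoint.ai's
# actual system. Channel measurements (one amplitude per Wi-Fi subcarrier)
# are treated as feature vectors and matched against learned class signatures.
import numpy as np

rng = np.random.default_rng(0)
N_SUBCARRIERS = 64  # typical channel-measurement width for a 20 MHz Wi-Fi channel


def synth_frame(metal_present: bool, n: int) -> np.ndarray:
    """Synthesize toy channel-amplitude vectors. Metal reflects strongly,
    modeled here as extra energy on a band of subcarriers (an assumption)."""
    base = rng.normal(1.0, 0.1, size=(n, N_SUBCARRIERS))
    if metal_present:
        base[:, 20:30] += rng.normal(0.8, 0.1, size=(n, 10))
    return base


# "Training": store the mean signature per class (a nearest-centroid model).
centroids = {
    "clear": synth_frame(False, 200).mean(axis=0),
    "threat": synth_frame(True, 200).mean(axis=0),
}


def classify(frame: np.ndarray) -> str:
    """Label one measurement frame by its closest class centroid."""
    return min(centroids, key=lambda k: np.linalg.norm(frame - centroids[k]))


print(classify(synth_frame(True, 1)[0]))   # expected: threat
print(classify(synth_frame(False, 1)[0]))  # expected: clear
```

A production system would replace the toy centroids with a model trained on large volumes of real measurements, across many body positions, bag types and room layouts; that training burden is what the company's data-collection effort addresses.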
The 3D field can also “see” things or people that are out of the device’s line of sight, such as behind a desk or a wall. So far, the company said, the model has been trained on a large volume of data to distinguish the reflected signal patterns of people, guns and other items.
“What’s remarkable is that this particular frequency, this particular transmission medium, is out in the world already,” said Skip Smith, CEO of CurvePoint.ai. “That’s a big thing for us, so it’s ubiquitous. We’re already all being bombarded all the time with Wi-Fi signals, so that’s what makes it really, really exciting for us.”
While the technology could be deployed in a number of areas throughout a school, Smith said, it so far focuses on monitoring entryways. And though it’s currently optimized to detect people and guns, the company is working to sharpen its detection of smaller or less-distinct items, like knives or vapes.
‘Abysmal failure’
The company claims the technology does not capture images or personal identifiers, unlike camera-based systems, and instead only gathers whether a threat is detected. The company said this mechanism helps protect student privacy, which is a common concern raised about “smart” threat-detection systems used in some schools.
Chad Marlow, a senior policy counsel at the American Civil Liberties Union, said it’s not the risks to student privacy that concern him, but that weapons-detection software generally has not worked as promised in a number of recent, high-profile instances.
“This is just another attempt by a surveillance company to fleece our schools by capitalizing on their fear of an incident in schools when, in fact, they really can’t do much about it,” Marlow told StateScoop. “They, in fact, create a problem where one doesn’t exist.”
In January, an AI-powered gun detection system from the company Omnilert failed to stop a school shooting in Nashville, Tennessee. Investigators determined that the system missed the handgun the 17-year-old shooter carried into the school building because of where cameras were located. A 16-year-old girl was shot to death and another 17-year-old was injured before the shooter took his own life.
And then just last month, an Omnilert system installed at a school in Baltimore County, Maryland, falsely warned school officials and law enforcement that a student had a gun when the student only had a bag of chips. The false alert propelled local leaders to call for a review of the technology, noting that the false alarm had traumatized students.
“Weapon detection software has been an abysmal failure. It’s an abysmal failure because of its false positives,” Marlow said. “It puts students at risk of someone being told they have a weapon on them and are an immediate danger to everyone around them, which can elicit a response from the police that could put not only that kid, but other kids in the school in danger. On the other side, it creates a sense of false security, that a weapon will be detected when it often isn’t, and even if it is detected, there’s not enough time to do anything about it.”
CurvePoint.ai’s Smith said that his company’s technology is not meant to be a “silver bullet” to solve the problems underlying school shootings. But, he said, the Wi-AI system should be thought of as an added layer to existing school security infrastructure, such as cameras and “smart” door locks.
As for the concerns with a potentially traumatizing law enforcement response, he said the system “quietly” alerts select staff, allowing the school to discreetly respond and intervene without making the environment feel like a fortress. Castagna, the district superintendent, said in his district’s case, the system would notify school resource officers, himself and another designated custodian.
The future of Wi-AI
Beyond the Chartiers Valley School District, Smith said CurvePoint.ai is in talks with about a dozen other schools across three or four districts, and estimated they’ll be testing by the end of this year. And the more schools that try it out, he said, the better the technology will become at detecting firearms, as it ingests more “frames” — or 3D fields — of data.
“This initial group or cohort, we’ve committed that we’ll be 95% accurate on weapon detection. … I think it’s less than a 4% false-positive rate, but we think the model can get to 99% effective. It just needs more training,” Smith said. “What we’re doing with our initial cohort of schools is we’re saying, listen, sign a contract for us and let’s set the acceptance criteria. If we get there, that’s wonderful. You can bring our system online in your school, we’ll protect your kids. If we don’t get there, it’s no risk to you. … We’re searching for schools that want to be innovators in school safety and want to work with some new Carnegie Mellon-inspired artificial intelligence technology.”
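The false-positive rate Smith quotes matters more than it may sound, because nearly everyone screened is not a threat. Some back-of-envelope arithmetic shows why: the 95% detection and 4% false-positive figures below are the company's quoted targets, while the daily screening volume is a hypothetical stand-in.

```python
# Back-of-envelope arithmetic on the company's quoted rates. The daily
# screening volume (2,000 entries) is hypothetical, not from the district.

def expected_alarms(n_screened, p_threat, detection_rate, false_positive_rate):
    """Return (expected true alarms, expected false alarms) per n_screened scans."""
    threats = n_screened * p_threat
    non_threats = n_screened - threats
    return threats * detection_rate, non_threats * false_positive_rate


# With essentially zero real threats on an ordinary day:
true_a, false_a = expected_alarms(2000, 0.0, 0.95, 0.04)
print(false_a)  # 80.0 -- a 4% rate yields ~80 false alarms/day at this volume
```

In other words, at a 4% false-positive rate virtually every alarm on an ordinary day would be a false one, which is the operational context for the false-alarm concerns critics raise.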
Marlow, at the ACLU, said a 99% efficacy rate is “literally not possible.”
“I would love, love, love, love to see an independent examiner conclude they are 95% accurate. I would love to see that,” Marlow said. “But I’ll say this to them, and I’ll make this challenge right now: you’re trying to sell a product to the general market, so invest some money. … Find someone outside of Carnegie Mellon who’s not looking to make money off of this product to independently evaluate, put all your data out there for the public, and let’s test it and see your 95%.”
Smith said that while the technology has not been independently audited, as it is still in the testing phase, the company would “enjoy the opportunity to discuss an independent review in April after we have the system online in March.”
For the district superintendent, the urgency in preventing another school shooting outweighed questions of accuracy or vetting.
“I’ve been a superintendent since 2011 and have lived through many school shootings across the country, right? And every time a school shooting happens, we are inundated with sales calls, with people trying to sell us either door locks or cameras or some alert system, etc.,” Castagna said. “This was the only school security system that was ever proposed that was proactive and actually stops violence from happening. … Nothing talks about stopping a mass shooter before it happens.
“One of the things that [CurvePoint.ai] also presented to us when they were pitching this idea, they exposed to us that the research shows that 90% of mass shootings, the mass shooters have admitted to having the weapon on campus multiple times before they carried out the shooting. And so there would have been multiple opportunities for this system to pick something up before something would happen.”