Security cameras are often the first witnesses to school shooters — yet their insights are typically sought only after a tragedy occurs, since most schools don’t have anyone monitoring them in real time.
But an AI system could soon be peering through cameras to watch students in some Utah schools, scanning for brandished guns.
The Utah State Board of Education recently awarded a $3 million contract to AEGIX Global, a Salt Lake City-based security software company that is the statewide reseller for ZeroEyes, headquartered near Philadelphia.
Utah public schools will be invited to seek grants to install ZeroEyes’ AI software, which detects visible guns.
The software works by layering on top of a school’s existing cameras. If the AI software detects a possible gun, the company says, images are instantly shared with a ZeroEyes operations center; the company runs one outside Philadelphia and one in Hawaii, both staffed 24/7 by U.S. military and law enforcement veterans.
If they determine a threat is valid, they dispatch alerts and information to local staff and law enforcement in three to five seconds, including a visual description, gun type and last known location, according to ZeroEyes.
State school board officials declined to speak with The Salt Lake Tribune about how the grant program will work until the state’s “procurement processes are finalized.” That should be sometime in the fall.
The $3 million in funding is enough to cover every school in the state, a spokesperson for AEGIX said. That company will install the ZeroEyes system and provide training and support to schools.
Who is behind ZeroEyes?
In 2018, frustrated by what they felt was a lack of response to repeated school shootings across the country, four former Navy SEALs quit their jobs and pooled their money to create ZeroEyes.
“We started this company after Parkland,” said Rob Huberty, co-founder and chief operating officer of ZeroEyes.
On Feb. 14, 2018, a shooter killed 17 people at Marjory Stoneman Douglas High School in Parkland, Florida. It was one of the deadliest high school shootings in U.S. history.
“We watched that scenario play out like the rest of America did, and it’s one of those things: Why does this keep happening? The same thing again and again?” he said.
With the Parkland shooting, Huberty said, police and media shared more video footage than was typical.
“It was heartbreaking to see that the police were looking at video cameras that were rewound 20 minutes,” Huberty said. “It was heartbreaking to see that he was walking around with a gun and could have been caught had somebody been looking at [the cameras].”
ZeroEyes has grown from its first beta testing site at Rancocas Valley Regional High School in New Jersey to hundreds of schools across 37 states, according to the company. Huberty said he did not want to disclose the exact number of schools for security reasons.
“Our goal is to really be in every school in America that we can possibly be in,” he said, to protect students and staff.
The question of competition
Offering firearm detection software for security cameras is a new initiative for the state. The program was established by HB61, which passed during the most recent legislative session. Among other measures, it authorized the Utah State Board of Education to administer $72 million in grants for school safety efforts.
It also allocated an additional $3 million specifically for software to detect “the presence of visible, unholstered firearms” that has been designated as anti-terrorism technology under the federal SAFETY Act.
The SAFETY Act, which stands for Support Anti-Terrorism by Fostering Effective Technologies, was passed in 2002 following the 9/11 attacks. It provides liability protections to organizations and individuals involved in the development, deployment and use of qualified anti-terrorism technologies.
ZeroEyes touts its status as the only AI-based gun detection video analytics software to hold this designation — causing competitors to question the fairness of Utah’s law.
Mark Franken, vice president of marketing for Omnilert, a ZeroEyes competitor based in Leesburg, Virginia, said “competition is always good, especially when we’re talking about school safety. Having an open, fair competitive process should ultimately bring about the best technology and the best solution.”
Omnilert notes that Utah native Chad Green developed its visual AI gun detection technology; his cousin’s daughter, Emilie Parker, was killed at Sandy Hook Elementary in Newtown, Connecticut, and was buried by her family in Ogden.
Franken said the company is working to become SAFETY Act designated, but it is a process that can take years. The liability protection it would provide “has a benefit to us,” he said. “But in terms of, does that say one solution is better than another in terms of the customer? It really has no bearing.”
A spokesperson for AEGIX said the company could not comment on how lawmakers worded the bill, but provided a statement about its designation.
That process, it said, “required over two years of pressure testing by the [Department of Homeland Security] to ensure that ZeroEyes’ technology is sound, reliable, and delivers what they promise: to detect illegally brandished guns and dispatch alerts to local law enforcement in as fast as 3 to 5 seconds from detection.”
What about child privacy?
With any AI video software, there are questions about privacy. In schools in particular, that means the privacy of hundreds of thousands of minors.
Omnilert points out that its system differs from ZeroEyes’ in that schools can choose whether images are reviewed by Omnilert operators or by school personnel.
Huberty, with ZeroEyes, explained that its operations centers “never get to see a camera at any moment other than when the AI thinks it’s found a gun. It’s like a snapshot in time.”
Detected images are stored in an encrypted format, the images are not personally identifiable, and they are never sold, he said.
ZeroEyes must adhere to the federal Family Educational Rights and Privacy Act, or FERPA, which protects the privacy of student records. Videos recorded by schools are considered educational records and under most circumstances may not be released to the public.
“We are FERPA compliant,” Huberty said. “We want to protect everybody’s data, everybody’s information.”
False identifications do happen
The AI software can “see” as well as the human eye, but it can’t always distinguish real guns from fake ones, Huberty said. That’s why staffers review the images; if the AI mistakes a leaf blower for a gun, they can disregard it before any calls to authorities are made.
The software has so far identified BB guns, Airsoft guns and water guns at schools. Airsoft guns typically, but not always, have orange tips to distinguish them from real firearms. Guns on school resource officers and dummy guns used by the Junior Reserve Officers’ Training Corps are also frequently detected. However, operators can easily determine these scenarios are not threats, Huberty said.
Huberty said it’s rare that operators are not able to determine if an image shows a real gun, but in those cases, they will err on the side of caution and let school officials know.
This spring, five Utah schools were the target of false shooting reports, leaving students, teachers and communities traumatized. Calling in a hoax threat could soon be considered a felony, Utah lawmakers announced last week.
In several situations where police arrived at a school in response to a false threat, ZeroEyes has been able to provide clarity, Huberty said. “We’ve been able to probably de-escalate a handful of those scenarios,” he said, “because we don’t see any other guns other than police guns.”