Earlier today Bearing Arms contributor Kerry Sloane wrote about some of the problems with using AI systems to detect guns on school campuses. Kerry's argument is even more compelling after yet another botched effort by one of those systems that relies on artificial intelligence to determine whether someone is carrying a gun.
The latest incident happened in Oviedo, Florida on Tuesday morning, when Lawton Chiles Middle School went into lockdown after the school's automated weapons detection system spotted what it determined to be a student holding a rifle. In reality, the kid was carrying nothing more than a clarinet.
In a message to families, school Principal Dr. Melissa Laudani said, “We have multiple layers of school safety, including an automated system that detects potential threats. A student was walking in the hallway, holding a musical instrument as if it were a weapon, which triggered the Code Red to activate.”
The lockdown was later lifted and all students and staff are safe.
“While there was no threat to campus, I’d like to ask you to speak with your student about the dangers of pretending to have a weapon on a school campus,” Laudani said.
I don't even know what Laudani meant by her description of the student holding the clarinet "as if it were a weapon." Did he have the clarinet up near his shoulder so he could look down the barrel of the instrument? Was he carrying the clarinet at low ready?
Regardless, a clarinet looks as much like a rifle as a bag of Doritos looks like a handgun. A Bb clarinet is about 26 inches long, which is the same length as some rifles, and has a tubular design that vaguely resembles a rifle barrel (if you're squinting, anyway), but that's where the similarities end.
As it turns out, false positives like these are more common than I realized. A report from WBAL in Baltimore last month noted that Baltimore County school officials get false positives every single day that school is in session, but rarely review those incidents.
In Baltimore County, the Omnilert system detects objects and provides photos and videos to six people, including school resource officers and school administrators, who ultimately make the call on whether to respond.
"There's a variety of people, and even that being said, we still take a look at our processes on a regular basis and see if there's an opportunity to make sure that everyone is clear on the reminder and practice and things of that nature," Baltimore County Public Schools Superintendent Myriam Rogers said.
Eight officers responded to the Kenwood incident, only to discover that the alleged gun was actually a bag of chips. School officials said police responded to Kenwood because the principal didn't know the alert was canceled. The principal then contacted an SRO, who then called police.
A Baltimore County Councilman audaciously claimed that the Omnilert system "worked" when it flagged the Doritos bag as a gun, arguing that "the issue is tightening up on or making sure all the humans in the system know their job and what they are supposed to do."
Yes, any AI weapons detection system needs to have humans overseeing the system to flag those false positives, but the sheer number of these false positives shouldn't be overlooked or dismissed with a wave of the hand. As the president of the Maryland Fraternal Order of Police told Fox 45 in Baltimore in the wake of the Doritos debacle, "there's a difference between feeling safe and actually being safe.”
“That technology is so new, that we shouldn't be spending money, relying on a technology that hasn't proven itself to be as accurate as we need it to be,” stated [Sergeant Clyde] Boatwright.
False positives aren't the only issue with these systems. Earlier this year an AI weapon detection system at a Nashville school failed to send an alert about shots being fired, which resulted in a student being killed. And systems like Omnilert are looking for visible firearms, so any gun in a backpack or concealed on a student's body isn't likely to be flagged at all, which is a huge problem.
AI is no substitute for human eyes and ears, and I'm in agreement with Boatwright that these weapon detection systems aren't ready for primetime. They're far too likely to issue alerts when no gun is present, which in turn can result in needlessly hostile and potentially dangerous interactions between police and the student suspected of carrying a firearm. And given that systems like Omnilert can't identify weapons that are hidden away, I'd also argue they can lead to a false sense of security that can be exploited by students who do want to cause harm to staffers and their classmates.