
Kansas Contracting AI System to Detect Guns. That Might Be a Bad Idea


Technology offers a lot of opportunities. However, no technology is without its teething problems. It's one reason I'm not generally what people might call an "early adopter."

But some states are full of early adopters, especially when they can point to technology as a way to claim they're taking safety seriously.

And, to be fair, maybe they are. They may well believe that new artificial intelligence-powered gun detection systems are viable and that the technology is good enough to be of benefit.

Either way, lawmakers in Kansas are looking to drop a bundle on just such a system.

The Kansas Legislature’s special budget committee revealed the attorney general’s office continues to work toward awarding a $10 million contract to a private vendor for deployment of artificial intelligence and use of security cameras to spot gun-wielding intruders in public schools. 

Gov. Laura Kelly signed a budget bill in April that included funding for technology that promised to identify unconcealed firearms in school buildings so alerts could be forwarded to school staff and law enforcement officers. In October, three companies submitted bids through the Kansas Department of Administration. On Nov. 26, a spokeswoman for Attorney General Kris Kobach said a contract hadn’t been issued.

Under state law, the attorney general’s office was directed to make certain the system was installed in schools by the end of December and operational at the close of February.

Sen. Pat Pettey, a Democrat from Kansas City, Kan., said the initiative was embraced by the Legislature at the same time a popular program offering state-funded school security grants to districts was dropped.

“We no longer have funding in the Department of Education for safe and secure schools,” Pettey said. “But we do have this funding source … but nothing has been appropriated? Schools would like to have safe and secure funds, but they don’t have any option now except through this program.”

Bids on the Kansas firearm-identification contract were submitted by Gades Sales Co. of Wichita; CIS Data Services of Springfield, Mo.; and ZeroEyes of Conshohocken, Penn. Financial details of those company proposals haven’t been made public.

The upside here is that none of these companies is the outfit that turned the police out on a Maryland kid with a package of Doritos.

Another upside is that now there's at least some competition. In 2024, when all of this first started, ZeroEyes was the only company that even met the requirements to bid, and a sole bidder is never a good thing.

However, the issue I have is that even though none of these companies was involved in the Maryland incident, artificial intelligence still has plenty of problems. How many more cases like that one will we hear about before those problems are resolved?

I don't know. I'm open to anything that can help keep kids safe without interfering with our right to keep and bear arms, but AI tools like these identification systems need better safeguards than the one in Maryland had. A person needs to be in the loop, at least for the foreseeable future, just to make sure we don't see a police response to a false positive.

Of course, we still don't know how many false negatives, if any, there are. Do these systems detect guns 100 percent of the time when one is present? Obviously, the companies vying for these contracts will say they do, or at least come close to it for legal reasons, but is that accurate?

My problem is that people tend to rely too much on technology once it's available, even when it's still not really dialed in. Look at the reports of students relying on AI to help them with schoolwork. I can ask three different AI programs a question, get very different answers, and gaslight all of them into telling me what I want to hear.

While that's less of an issue with gun detection, I still have concerns, and the Doritos incident just made them that much more pronounced.
