Is Gun Detection Tech A Privacy Concern Or Just Hype?

On Friday in Sacramento, California, two people were shot and killed inside a shopping mall. One of my biggest fears is a mass shooting on Black Friday, and it was almost realized. The suspect gunned down two brothers, then ran. He managed to evade police for the weekend before being arrested.

Unfortunately, this is just how things go. California’s plethora of gun control laws was insufficient to stop this attack, which isn’t overly surprising.

However, a company claims its technology could have helped prevent the attack.

Dave Fraser, CEO of Omnilert, talked with FOX40 about Gun Detect, a technology that might have warned mall security staff of a dangerous situation before any bullets were fired.

“It’s the industry’s first visual gun-detecting system, and it works by artificial intelligence. It takes an existing surveillance system … and it simply monitors it with artificial intelligence instead of human beings. It’s capable of recognizing all sorts of different guns,” Fraser explained.

With 3D printing technology, a gun could be plastic, but Fraser said Gun Detect also detects whether an object moves like a gun.

This is so objects such as water hose attachments and cellphones are not mistaken for guns.

“We go through a three-step process,” Fraser said, explaining that the company trains the technology to look for nearby people, for an object that both looks and moves like a gun, and for a person wielding one.

The technology takes about four seconds to detect a gun, and the resulting alert is sent to a safety team for “immediate action,” he added.
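Omnilert hasn’t published implementation details, so the flow Fraser describes can only be sketched in broad strokes. Here is a minimal illustration of one plausible design, assuming a per-frame classifier score and a sustained-detection window before alerting; every name, threshold, and number below is a hypothetical stand-in, not Omnilert’s actual design:

```python
# Hypothetical sketch of a detect-then-alert flow like the one Fraser
# describes. All thresholds and logic here are assumptions.

FRAME_RATE = 5               # frames analyzed per second (assumed)
CONFIDENCE_THRESHOLD = 0.8   # per-frame "looks like a gun" score cutoff (assumed)
SUSTAIN_SECONDS = 4          # loosely mirrors the "about four seconds" figure

def should_alert(frame_scores):
    """Return True once a gun-like object has been seen continuously for
    SUSTAIN_SECONDS. Requiring a sustained streak, rather than a single
    frame, is one way a system could avoid flagging phones, hose
    attachments, and other gun-shaped objects glimpsed for an instant."""
    needed = FRAME_RATE * SUSTAIN_SECONDS
    streak = 0
    for score in frame_scores:
        # Extend the streak on a confident frame; reset it otherwise.
        streak = streak + 1 if score >= CONFIDENCE_THRESHOLD else 0
        if streak >= needed:
            return True  # hand off to the human safety team
    return False

# A brief two-frame blip (say, a cellphone held oddly) does not trigger:
assert not should_alert([0.9, 0.95] + [0.1] * 30)
# Twenty consecutive confident frames (4 seconds at 5 fps) do:
assert should_alert([0.9] * 25)
```

The point of the sketch is only that any such system must trade detection speed against false alarms; where that line is drawn decides how often security ends up confronting someone who was never a threat.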

There are questions, though. For one, does this technology actually work?

So far, I haven’t been able to find anything that doesn’t look like marketing hype amplified by a hoplophobic media, so who actually knows? It might work, or it might not.

If it doesn’t really work, then it’s a waste of money that won’t actually accomplish much of anything.

If it does, though, then it might represent something very different. It might represent a privacy concern.

Look, I’m not talking about people using it in places that are off-limits for lawful firearm carry. While I disagree with the idea, there’s at least an argument to be made for such a venue deploying this kind of software as part of its security efforts.

The problem is that nothing limits such software to being used only in places like that.

As such, other places may start using this and begin treating law-abiding gun owners as criminals simply because they opted to exercise their Second Amendment rights. If the software flags someone as carrying a gun, it can’t tell you whether that person has a permit. That means security may start treating anyone lawfully carrying a gun as if they’re a threat.

Then think about what it might be like for black men and women carrying a firearm. Studies tell us they already get more scrutiny from security than white men and women. Couple that with lawfully carrying a gun and I can’t imagine it’ll be a fun experience for anyone involved.

See, the problem with this technology is that it’s predicated on the idea that a gun is somehow the real problem, that anyone carrying a firearm should be considered a threat. That’s really not the case, but too few people seem to understand that.

So, let’s just hope this is all hype and that the system isn’t really that viable. For all our sakes.

Dec 04, 2021 11:30 AM ET