How cute: techies think "smart guns" will solve problems instead of creating them.

One of the more interesting recurring characters in the post-apocalyptic television show Revolution is Aaron Pittman, portrayed by actor Zak Orth. Pittman was a soft and sensitive computer whiz who blew through MIT on his way to becoming incredibly rich as a top executive at Google… and then the lights went out.


With the electric grid crippled, Pittman discovers that all of his money, knowledge, skills, and technology are useless. He’s a “Silicon Valley guy” who never found a problem he couldn’t program his way out of, until he was forced out of his cocoon into a world he didn’t understand.

I’ve got a gut feeling that tech investor Ron Conway and entrepreneur Jim Pitkow have a lot in common with Pittman.

The two men are backing an initiative to create “smart guns” to reduce gun violence. Yes, they actually seem to think it will work:

In response to the Sandy Hook Elementary School shooting that left 20 dead, several prominent tech investors have launched the Smart Tech Challenges Foundation, awarding entrepreneurs $1 million to develop gun safety technology in weapons.

“Technology has been proven to solve today’s greatest social challenges, and curbing gun violence in this country is one of the greatest challenges we face,” investor Ron Conway said.

Conway, who has invested in a number of technology companies, including Facebook and Twitter, joined long-term entrepreneur Jim Pitkow to launch the initiative for safer firearms. The group is accepting proposals for better safeguards through the end of March.

A panel of judges will award funding to the teams with the best idea. The foundation will also provide resources and guidance to the winning teams.

Pitkow made it clear the initiative would focus on gun safety rather than gun control.

“In no way do our efforts challenge the right to bear arms,” he said.

Most of the developers who have entered the competition are looking to put biometric technology on weapons, such as voice recognition or palm print scanners, which would only allow authorized people to fire them.


I admire the stated philosophical goal of the Smart Tech Challenges Foundation, if the actual goal is to create a safer world.

The problems they face, however, are legion.

Biometric technology, which seems to be the preferred avenue of those submitting designs, works well in a sedate, lab-like environment where conditions are carefully controlled and results can be replicated. The real world doesn't operate like that fantasy.

Voice-activated biometric locks calibrated under normal conditions will be useless under high stress. One thing learned from pre-trial testimony in the George Zimmerman case is that experts could not determine who was screaming for help on the recorded 911 calls. High stress changes how people sound in unpredictable ways, typically driving voices much higher in pitch (ever heard someone mocked for "screaming like a little girl"?) and rendering some people absolutely mute with fear. "Smart guns" using voice biometric locks calibrated during non-stress events will fail, and they will get people killed.

Palm-print and fingerprint scanners are interesting in theory… and all but useless in practice.

They are compromised by winter weather anywhere north of Atlanta, by the coldest stretches of fall and spring, and by any other conditions cold enough that people wear gloves.

The technology is useless in both high-risk and routine law enforcement work. This includes events as innocuous as routine vehicle searches (where officers typically wear cut-proof and stab-proof gloves to avoid being stuck by hidden hypodermic needles or knives) and any known high-risk scenario involving SWAT/ERT units. It is standard operating procedure for SWAT/ERT units to wear full-body uniforms, including gloves designed for impact and/or flash/heat/cut protection, as shown in the following photos.


[Photos: SWAT/ERT officers in full protective gear, including gloves]

Such technology relies upon a clean, dry, stationary, and bare palm; a person who is wearing gloves, who falls in dirt or mud, or who is cut, wet, or moving (which never happens when a bad guy is trying to kill you with his own weapon) will go for "bang" and likely get a "click." This is mildly disconcerting if the aggressor is not similarly encumbered by a "smart gun" of his own.

There are the problems of supplying power to such firearms ("I'm sorry, Mrs. Officer Friendly, but your husband died because the battery died in his Goonblaster 4000 when he needed it most"), of introducing complex and fragile technologies where they aren't needed ("I'm sorry, Mrs. Smith, but your daughter was raped and murdered because a control chip manufactured by a nine-year-old in a Chinese sweatshop failed"), and, of course, the very high likelihood that any weapon using such technology could be hacked by unscrupulous, lawless governments and rendered useless with the flip of a switch ("I'm sorry, Sgt. Smith, but the control chip in your M4, manufactured by a nine-year-old in a Chinese sweatshop, was switched off by their government prior to the invasion. I hope your bayonet is sharp").

There is, of course, an entire stampeding herd of other problems with the theory, foremost among them the problem of acceptance. Versions of "smart gun" technology have been attempted for 50 years and have been universally rejected by the military, law enforcement, and the general public. They simply are not wanted, except by those who don't actually have to depend upon firearms to defend their lives and the lives of those they know.
