June 3, 2013

Military robots not the world’s biggest worry

One of my daughters once proposed that my T-shirt should read: “I don’t support war, but war supports me.” And it’s true.

So you might assume that I would leap into action when I learned that almost 3,000 “researchers, experts and entrepreneurs” have signed an open letter calling for a ban on developing artificial intelligence (AI) for “lethal autonomous weapons systems” (LAWS), or military robots for short. Instead, I yawned.

The people who signed the letter included celebrities of the science and high-tech worlds like Tesla’s Elon Musk, Apple co-founder Steve Wozniak, cosmologist Stephen Hawking, Skype co-founder Jaan Tallinn, and Noam Chomsky. They presented their letter in late July to the International Joint Conference on Artificial Intelligence, meeting in Buenos Aires.

They were quite clear about what worried them: “The key question for humanity today is whether to start a global AI arms race or to prevent it from starting. If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow.”

“Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce. It will only be a matter of time until they appear… in the hands of terrorists, dictators wishing to better control their populations, warlords wishing to perpetrate ethnic cleansing, etc.”

“Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity.”

Well, no, it wouldn’t be beneficial for humanity. Few arms races are. But are autonomous weapons really “the key question for humanity today”? Probably not.

We have a few other things on our plate that feel a lot more “key”: climate change, nine civil wars in the Muslim parts of the world and, of course, nuclear weapons.

We don’t really need yet another high-tech way to kill people. But autonomous weapons of the sort currently under development are not going to change the world drastically. They are not “the third revolution in warfare, after gunpowder and nuclear arms,” as one military pundit breathlessly described them. They are just another nasty weapons system.

What drives the campaign is a conflation of two different ideas: weapons that kill people without a human being in the decision-making loop, and true AI. The latter certainly would change the world, since we would then have to share it, for good or ill, with non-human intelligences. But almost all the people active in the field say that human-level AI is still a long way off in the future, if it is possible at all.

As for weapons that kill people without a human being choosing the victims, those we have in abundance already. From land mines to nuclear-tipped missiles, there are all sorts of weapons that kill people without discrimination. We also have a wide variety of weapons that will kill specific individuals (guns, for example), and we already know how to “selectively kill a particular ethnic group,” too.

The thing about autonomous weapons that really appeals to the major military powers is that, like remotely piloted drones, they can be used with impunity in poor countries. Moreover, like drones, they don’t put the lives of rich-country soldiers at risk. That’s a good reason to oppose them, and, if poor countries realize what they are in for, a good opportunity to organize a strong diplomatic coalition that wants to ban them.