It's going to be standing room only at the United Nations Convention on Certain Conventional Weapons (CCW) in Geneva today. On the agenda: killer robots, or as the U.N. calls them, "lethal autonomous weapons systems." It will be an informative meeting with specialists on lethal autonomous weapons. The goal will be for U.N. members to learn as much as possible about the potential threats ... and then ban them, at least temporarily.

The CCW, and this meeting in particular, exists to "ban or restrict the use of specific types of weapons that are considered to cause unnecessary or unjustifiable suffering to combatants or to affect civilians indiscriminately." The weapons covered include explosive remnants of war, blinding lasers, mines, booby traps and, yes, autonomous robots.

The move to ban robot soldiers is not especially surprising, as there are major ethical and technical issues with having a massive piece of machinery operate either on its own or through a series of machines in a war zone. Noel Sharkey, a robotics professor at the University of Sheffield, is aiming to keep robots out of war completely. He told NBC News: "My biggest concern is that when every nation has this technology, we will start seeing the full automation of warfare."

Sharkey is also worried about the inequality of war: though such inequality already exists in terms of weapons, killer robot soldiers would take it to the next level. "Some people propose that this technology will save our soldiers' lives because we will send in machines to do our fighting for us," he said. "But that only works if the other side does not have machines, because they will send them in to kill our soldiers." This inequality could open the door to a new arms race, with every country aiming to build the biggest, baddest and most lethal robot as quickly as possible.

In the event of a robot arms race, we could be left with a variety of different, contradictory and competing systems. Sharkey reminds us that machines are not perfect; they can be unpredictable. With robotic competition, "Nobody knows really at all how these different systems will interact with each other," he said.

The International Committee of the Red Cross has also weighed in on the issue, noting the ethical problems with robot soldiers: "There was a sense of deep discomfort with the idea of allowing machines to make life-and-death decisions on the battlefield with little or no human involvement."

While many are in support of banning robots outright, Professor Ronald C. Arkin of the Georgia Institute of Technology believes they could reduce casualties if used correctly. Arkin is adamant that these systems do not have "free will or moral agency" (at least not yet), so they will be able to "potentially comply with international humanitarian law as well or better than human war fighters." He believes that in order to allow robots in combat, they would have to be programmed only for specific scenarios and to behave extremely strictly within them.

Experts agreed that while the U.N. works out the details that will govern the use of potential robot soldiers, such weapons should be banned.