“Killer Robots” is a dreadful name, don’t you think? It reminds you of the killing machines in the “Terminator” series and the “Battle Droids” of “Star Wars.” “Lethal Autonomous Weapons Systems” is a much classier name, and the acronym is even better: LAWS. So the international conference that opened at the United Nations Geneva office on Monday is about LAWS.

Don’t think “drones” here. Drones loiter almost silently, high in the air above your picnic, until the operator back in Las Vegas decides that you are plotting a terrorist attack and orders the drone to kill you and your family. But at least there is an operator, a human being in the decision-making loop.

With LAWS, there isn’t. The machine sorts through its algorithms, and decides on its own whether to kill you or not. So you’ll probably be glad to know that there are no operational machines of that sort — yet. But military researchers in various countries are working hard on them, and they probably will exist in ten or twenty years.

Unless we ban them. That’s what the conference in Geneva is about. It’s a meeting of diplomats, arms control experts, and ethics and human rights specialists who, if they agree that this is a real threat, will put it on the agenda of next November’s annual meeting of the countries that have signed the Convention on Certain Conventional Weapons. So it’s early days yet, and there’s still a chance to nip this in the bud.

That’s an awkward name, but not nearly as clumsy as the full name: the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects. But it actually has done some good already, and it may do some more.

Protocol I bans “the use of weapons the primary effect of which is to injure by fragments which are not detectable by X-rays in the human body.” Protocol II requires countries that use land mines to make them deactivate automatically after a certain period. Protocol IV, added in 1995, prohibits the use of blinding laser weapons.

The world would be a worse place if these protocols did not exist. They do exist, and by and large they are obeyed. But none of these weapons would make a decisive difference in actual battle, whereas they cause or would cause great human misery, so it was easy to ban them.

The problem with killer robots is that they could make a decisive difference in battle. They don’t get tired, they don’t get paralysed with fear, and if you lose them, so what? It’s just a machine. There’s no person in there. But that’s precisely the problem: there’s no person in there. Do you trust the machine to make decisions about killing people – who’s a soldier and a legitimate target, who’s an innocent civilian – all by itself?

Killer robots are a very bad idea, but let’s not get romantic about this. Wars involve killing people, and whether you’re doing it with live soldiers or Lethal Autonomous Weapons Systems, it’s never going to be morally tidy. The real worry is how much easier it would be for a technologically advanced country to decide on war if it didn’t have to see lots of its own soldiers get killed.

So by all means let’s ban purpose-built killer robots if we can: this is an initiative that deserves our support. But bear in mind that there will almost certainly be autonomous machines eventually, and some of them will certainly be capable of killing. So it is also time to start working on international rules governing their behaviour. Isaac Asimov’s Three Laws of Robotics (written in 1942) would be a good point of departure.

One: A robot may not injure a human being or, through inaction, allow a human being to come to harm.

Two: A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.

Three: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Gwynne Dyer is a London-based independent journalist whose commentary is published in 45 countries.
