All technology comes with risks that must be weighed against its benefits. Chemical weapons are gruesome and have no conceivable benefits, so we ban them. Cars are dangerous but have practical benefits, so we allow them.
The newest category of concern is the AKM – the Autonomous Killing Machine.
There is nothing wrong with autonomous machines per se; it is the killing part that makes me nervous. Driverless cars are autonomous. You give them a mission profile, such as “take me to the office”, and they work out how to do it.
AKMs are equivalent, except for the crucial fact that their mission profile has lethal intent, such as “eliminate every human in a region”.
AKMs are a big step beyond the drones used to attack terrorists and other targets in Pakistan and other troubled regions. Drones are operated by pilots sitting in control rooms far away, which means a human is still weighing each decision to drop a bomb.
As it happens, American drone pilots have been resigning in large numbers, partly due to workload but also because they are traumatised by the video imagery of victims incinerated by their missiles.
That humane concern could act as a deterrent to decision makers contemplating war. In contrast, AKMs will not quit, so it will be emotionally easier for generals to commit to a mission.
And it will be financially easier. It is cheaper to send a drone to Pakistan than a piloted plane. And if you make the pilot in the control room redundant, the cost goes down even further.
It could be politically easier too. America lost heart in its wars in Vietnam and Afghanistan largely because voters were tired of seeing fellow citizens come home in body bags, maimed for life or mentally scarred. With AKMs waging war, that restraining influence is lost.
By lowering the emotional, financial and political costs, AKMs will increase the willingness of politicians and generals to pursue battlefield solutions.
The only countervailing factor will be ethics. Maybe, just maybe, leaders in advanced countries will resist the temptation to wage machine wars, but I don’t have much confidence in the ethical values of most leaders.
If we agree that AKM warfare is immoral, can we ban it? Chemical weapons, biological weapons and cluster bombs were banned, and nuclear weapons restricted by treaty, only after they had been used with dreadful consequences. More constructively, in at least one case a class of weapon was banned before it was ever deployed: in 1998, a United Nations protocol outlawing the use of lasers to blind enemy soldiers came into force. The technology is readily available, but so far the ban has held.
Likewise, the UN should ban AKMs, despite the difficulty of monitoring such a ban. One complication is that there is no technological difference between an AKM that follows a mission profile and shoots to kill and an autonomous machine that follows the same profile but waits for a human’s permission before shooting. That last step of waiting for permission, known as meaningful human control, could make the use of robots acceptable – and could also be bypassed in a heartbeat. One solution would be to require that the decision-making process be recorded so it can be reviewed for potential war crimes. A more extreme approach would be to ban autonomous target identification altogether, regardless of whether a human makes the decision to shoot.
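To make that point concrete, here is a minimal, purely illustrative sketch in Python. Every name in it (such as Detection and decide_engagement) is hypothetical, invented for this example and not drawn from any real system. It shows only the abstract decision logic: “meaningful human control” can amount to a single authorisation gate, switching that gate off is a one-line change, and each decision can be logged for later review.

```python
# Illustrative sketch only: abstract decision logic, no real targeting or
# weapon capability. All names here are hypothetical.

from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class Detection:
    """A candidate target produced by an autonomous identification stage."""
    target_id: str
    confidence: float


def request_human_authorisation(detection: Detection) -> bool:
    """Stand-in for 'meaningful human control': a remote operator reviews the
    detection and explicitly approves or rejects the engagement."""
    answer = input(f"Engage {detection.target_id} "
                   f"(confidence {detection.confidence:.2f})? [y/N] ")
    return answer.strip().lower() == "y"


def decide_engagement(detection: Detection,
                      require_human: bool,
                      decision_log: list) -> bool:
    """Return True if the system would engage.

    The require_human flag is the entire difference between the two
    architectures described in the text; bypassing it is a one-line change.
    """
    approved = request_human_authorisation(detection) if require_human else True
    # Record the decision process so it could be reviewed later, e.g. in a
    # war-crimes investigation (the safeguard suggested above).
    decision_log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "target_id": detection.target_id,
        "confidence": detection.confidence,
        "human_in_loop": require_human,
        "engaged": approved,
    })
    return approved


if __name__ == "__main__":
    log: list = []
    d = Detection(target_id="example-01", confidence=0.87)
    decide_engagement(d, require_human=True, decision_log=log)
    print(log)
```

The point of the sketch is not engineering detail but the argument above: the safeguard lives in software and policy, not in the hardware, which is why any ban would be so hard to verify from the outside.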
Of course, all governments participating in conflicts will be afraid that their enemies are developing such weapons. Such fear drove the development of nuclear weapons. Hungarian physicist Leó Szilárd won support from other scientists and politicians because he was convinced the Nazis must be working on a nuclear bomb. After Germany’s surrender it became clear that the German effort had come nowhere near producing a bomb, but by then the Americans had developed theirs. Despite knowing that nobody else had such weapons, they used them against Japan.
This time round scientists and engineers should take a more principled stance. And they are. In July 2015, leading international researchers in artificial intelligence and robotics called on the UN to ban autonomous weapons.
For the sake of us all, the UN must do so.