As military top brass warn of an obsession with “exquisite” equipment and the RAF opens up UK drone HQ, Channel 4 News asks if the next generation of battles will be fought not by man, but by machine.
You might think this picture looks unreal, like something out of a sci-fi future.
Actually, it’s nearly four years old. It’s from 2010, when BAE Systems launched Taranis, a prototype unmanned combat aircraft system.
Named after the Celtic god of thunder, Taranis is one of a host of machines being developed around the world which could be more controversial than their predecessors.
And that’s saying something when their predecessors are drones – machines which have come to be synonymous with CIA strikes and civilian deaths in non-warzones such as Pakistan and Yemen.
The UK military uses drones too, in a manner the UN has approved as “responsible”: five Reaper drones for surveillance and some strikes over Afghanistan. But, perhaps understandably, it would rather you didn’t call them drones, preferring – as Channel 4 News was told on a trip to UK drone HQ RAF Waddington this week – either unmanned aerial vehicles (UAVs) or remotely piloted aircraft systems (RPAS).
That’s because the word drone suggests that the aircraft aren’t piloted, when they are. The machines are controlled by humans thousands of miles away, but in almost exactly the same manner, and under the same rules of engagement, as if the pilot were in the cockpit of a fighter jet.
This is where Taranis and its brethren come in: they have the potential to be totally autonomous.
At the moment, it’s only potential. The Ministry of Defence is adamant that all of its drones are and always will be controlled by human pilots and BAE Systems points out on its website – in brackets – that Taranis will have a “human operator in the loop”.
But campaigners say these brackets – as well as a clearer definition of what exactly having a human “in the loop” means – are crucial.
They point to a November meeting of UN member states, at which the UK was the only country not to back an all-out ban on developing weapons of this kind. More talks on a ban on developing what the UN calls “lethal autonomous weapon systems” – LAWS – will take place next year.
Professor Noel Sharkey, University of Sheffield robotics expert and anti-robotic weapons campaigner, prefers the term “killer robots”.
He told Channel 4 News: “These killer robots are evil in themselves. You can’t predict their actions and you can’t control them.”
In parliament earlier this year, Alistair Burt accepted MPs’ concerns about the “terrifying” potential of these kinds of weapons. But he stressed: “As a matter of policy, Her Majesty’s government are clear that the operation of our weapons will always be under human control as an absolute guarantee of human oversight and authority and of accountability for weapons usage.”
But what level of control that means – from a human operator pressing the “go” button, to a lawyer selecting what targets are legitimate – has not been explained and must be clarified, says Professor Sharkey.
While the idea of a robot deciding over life and death is a frightening one – even before taking into account the potential for hacking and technological failure – an argument can also be made that bringing robots into battle can save human life.
It’s the same argument which has been made for drones.
As Peter Felstead, editor of military trade bible IHS Jane’s Defence Weekly put it to Channel 4 News: “It’s all about exposing the robotic system as opposed to a real person to the danger.”
Other robotic technology was born with this aim as well, including robotic exoskeletons which are currently being developed to be either controlled by, or actually worn by, soldiers on the battlefield – think The Matrix.
There are also Explosive Ordnance Disposal (EOD) robots, designed to be sent in ahead of the human technician. Or grenades which can be thrown over a wall and which can then feed back pictures of what is happening there.
It is technology which is saving lives, the military argues, and it also happens to belong to the intellectually appealing James Bond Q-branch school of gadgetry.
Elizabeth Quintana of the Royal United Services Institute told Channel 4 News: “Yes, there will be robots for dirty or dangerous jobs. In Sweden at the moment, they have a system where six or seven lorries in a row will follow the leader. That’s commercial, but it’s being looked at by the military, because in Iraq and Afghanistan most of the deaths were logisticians, being blown up on the roads or shot at.”
It’s not an argument everybody buys: Lindsey German, convenor of the Stop the War Coalition, said the problem is that it isn’t just high-tech machines attacking other high-tech machines.
“If you thought the war was justified, you would want it carried out with as little risk to soldiers as possible. If not, you see more and more sophisticated weaponry being used against some of the poorest people in the world, and the consequences are terrible,” she said.
Moreover, all of the technologies above remain linked to their human operators. They are not the stereotypical “killer robots”, on a destruction spree utterly separate from any human control or aim.
Maintaining that link is something which even military leaders are concerned about, mainly because they fear that a focus on developing ever more impressive technology at a time of constrained budgets is coming at the cost of funding and training actual human soldiers.
Last night, General Sir Nicholas Houghton gave his annual lecture, warning: “Unattended, our current course leads to a strategically incoherent force structure: exquisite equipment, but insufficient resources to man that equipment or train on it. This is what the Americans call the spectre of the hollow-force.”
Other experts agree that modern warfare means that the human face of conflict is more important than ever.
Ms Quintana thinks the link between technology and humanity is unbreakable, particularly in the “hearts and minds” campaigns of recent years (not a mission, it must be said, which the human/tech combination of drones has achieved particularly successfully).
“If I’d been kidnapped and a robot came and rescued me, I’d still be happy – but at the same time it’s not the same as having someone you can interact with,” she said.
“If you have a whole bunch of machines fighting in a field – you have to ask what the point is. Unless it’s a question of machines coming over the hill to kill you, and you have your own machines to defend you. War is inherently a human endeavour, and it’s about the face and the contact.”
Watch below: Paul Mason’s report from inside the ‘cockpit’ of a British drone