Why I Have Joined the International Committee for Robot Arms Control
It was the autonomy of landmines that disturbed me. I had just finished university and was volunteering in northeastern Bosnia for an international NGO doing agricultural and environmental development. The area I lived in, the Brcko District, was one of the most heavily impacted by the hundreds of mines and unexploded munitions left strewn across the landscape after the war’s end. In Bosnia, and later as I travelled to Iraq, Sudan, Uganda and Afghanistan as a humanitarian aid worker, I found in mines a symbol of the worst of humanity’s militarist, xenophobic and uncritically technophile proclivities. Mines seemed inherently unjust, machines capable of maiming and killing without any deliberation by human beings.
When we delegate violent force to a mine, we outsource the decision to end human life to a device that has no ethics, no wisdom, no mercy, no conscience. It has no capacity to negotiate, to de-escalate, to hold fire, to recognize the humanity of ‘The Other.’ It is literally anti-personnel — anti-human. It follows only The Program inherent in its design:
IF victim trips trigger, trip-wire, pressure plate or sensor,
THEN explode.
In pressing for the ban on anti-personnel landmines, the International Campaign to Ban Landmines re-asserted humanity’s responsibility to itself. The treaty, now ratified by 159 countries and largely respected even by non-signatories, represented a consensus that humans have the right not to be killed by autonomous machines.
Mines, however, are becoming outdated weapons. They are analog, unnetworked, static. They are to the emerging class of high-tech weapons as the slide rule is to the smartphone. The rapid development of autonomy in robotics, the ability of robots to make decisions independently of human involvement, means we are on the cusp of a technological revolution in the way weapons select and attack targets. Weapons developers have already designed or produced aerial, underwater and land-based military robots capable of killing and maiming without direct human command and control. The fully autonomous armed robot is a digital, artificially intelligent, GPS-enabled and sometimes mobile landmine.
In a recent report on “the challenges of contemporary armed conflicts”, the International Committee of the Red Cross, which was a major leader in the mine ban campaign, called on the global community to consider the “fundamental legal, ethical and societal issues” raised by autonomous armed robots “before such systems are developed or deployed.”
Without human control, there is no way for weapons to effectively discriminate between combatants and civilians, making fully autonomous weapons a violation of international humanitarian law. To me, autonomous armed robots, even highly sophisticated ones, are no less morally problematic than the landmines I found so disturbing as a 21-year-old volunteer in Bosnia. As I wrote (with Thomas Nash and Richard Moyes) in a statement for the advocacy group Article 36 calling for a ban on autonomous armed robots:
Weapons that are triggered automatically by the presence or proximity of their victim can rarely be used in a way that ensures distinction between military and civilian. … Whilst an expanded role for robots in conflict looks unstoppable, we need to draw a red line at fully autonomous targeting. A first step in this may be to recognize that such a red line needs to be drawn effectively across the board – from the simple technologies of anti-vehicle landmines (still not prohibited) across to the most complex systems under development.
Last week I joined the International Committee for Robot Arms Control (ICRAC), a group of academics, activists and campaigners who are “concerned about the pressing dangers that military robots pose to peace and international security and to civilians in war.” Like ICRAC, I believe we need to “urgently commence discussions about an arms control regime to reduce the threat posed to humanity by these systems.” We cannot leave deliberations about the humanitarian and human rights implications of these weapons to those who develop and use them. We need the dedicated engagement of civilians, of civil society: the faith community, journalists, social workers, aid workers, campaigners, poets, musicians, academics and policy wonks. The decision to kill is too serious a matter to leave to a machine.