{"id":44584,"date":"2018-12-18T06:00:15","date_gmt":"2018-12-17T19:00:15","guid":{"rendered":"https:\/\/www.aspistrategist.ru\/?p=44584"},"modified":"2018-12-20T15:50:11","modified_gmt":"2018-12-20T04:50:11","slug":"red-cross-is-seeking-rules-for-the-use-of-killer-robots","status":"publish","type":"post","link":"https:\/\/www.aspistrategist.ru\/red-cross-is-seeking-rules-for-the-use-of-killer-robots\/","title":{"rendered":"Red Cross is seeking rules for the use of \u2018killer robots\u2019"},"content":{"rendered":"

As autonomous weapons rapidly become more lethal, the International Committee of the Red Cross is in a race to develop a legal framework for the use of 'killer robots'.

Netta Goussac, a legal adviser with the ICRC's Geneva-based arms unit, tells The Strategist that nations need to consider how much control people have over autonomous weapons, which can select and attack a target without human intervention.

'They need to do it urgently because technological developments are moving very, very quickly', Goussac says. 'We think states should not consider this to be an inevitable development but rather make conscious choices now about what is and isn't acceptable.'

Once a capability has been acquired, it's extremely difficult to convince states not to use it, she says. 'It's easier to reach agreement on what is and isn't acceptable before it's a reality.'

An Australian, Goussac previously worked as the principal legal adviser in the Attorney-General's Department's Office of International Law.

She says the international discussion has to focus on the role of the humans who deploy autonomous weapons. Those sending them onto the battlefield must take all feasible measures to prevent violations of international humanitarian law.

These responsibilities cannot be delegated to the device, because only humans are responsible for complying with the law, she says.

As the world's armed forces rely increasingly on technology, artificial intelligence, algorithms and machine learning for military decision-making, judgements must be made about the level of control a human deploying an autonomous weapon must have in order to meet their legal and ethical responsibilities.

That involves examining the person's ability to stop the weapon, to supervise it, to communicate with it and to predict reliably what it will do in the environment in which it's being used.

Guns and explosives still do the greatest humanitarian harm, and the Red Cross applies the same approach to new technologies as it does to them. 'We ask, what are the real and foreseeable humanitarian consequences of these weapons, and what does the law say about their use?

'We've applied that logic to chemical weapons, to landmines, and now we're applying it to cyber warfare and to autonomous weapon systems. Do they pose any challenges to complying with the rules of international humanitarian law that require parties to a conflict to distinguish between civilians and combatants, to use force proportionally and to exercise caution?'

Technology developed to benefit society generally is also driving advances in arms, and militaries have shown that they favour greater autonomy in weapons systems. They want more precision, faster decision-making and longer range.

An autonomous weapon is distinct from a remote-controlled system, such as a drone, in which a human selects and attacks the target using a communication link that gives them constant control and supervision over the deployed system.

'With autonomous weapon systems, the human designs and programs the system, the human launches the system, but it's the system that selects and attacks the target', Goussac says.

'Yes, the system is running a program that's created by the human, but the human who launches the system doesn't necessarily know where and when the attack will take place.'

The more autonomous weapon systems are deployed, the greater the humanitarian risks they pose, she says.

With autonomous systems, the human's decision to use force can be distant in both geography and time.

'It's that distance between the human and the effects of their decisions that we're concerned about, because we think that if you stretch that too far you make it difficult for the human to comply with the rules that they're required to comply with, to make the legal judgements that they have to make at the time they decide to use force.'

A key question, says Goussac, is whether an autonomous weapon system hinders the human's ability to stop an attack if the circumstances change. What if, for instance, civilians arrive in a killing zone?

In some cases, autonomous systems are used in a very predictable and controlled environment, generally in the air or on the sea, where there's no likelihood of civilians or 'non-targetable objects' being hit.

'But the more complex the environment, the more mixed it is, the more dynamic it is, the less predictable it is, and the more important it is to have that supervision and ability to control it once the system has been launched', Goussac says.

'It's not just the technical characteristics of the weapon that are important, it's the circumstances of use. What an appropriate level of control over a system might mean in one context is totally different in another context.'

A range of defensive systems are designed to autonomously select and attack targets in a space where there are no civilian aircraft and where the target is flying at high velocity (the Iron Dome system is one example). 'There's been a certain pre-determinacy here', Goussac says, 'but it's an acceptable level of pre-determinacy'.

She says it's difficult to set rules based on technical characteristics. 'We're really more interested in talking about the role of the human because that's what we think is universal in all of this.'

'At what point do we start having ethical concerns about the delegation of decisions to kill or injure, or to destroy property, to machines?'
