Thrust Group on International Governance of Robots in National Security, CETMONS, February 5, 2010


Page 1

Thrust Group on International Governance of Robots in National Security

CETMONS
February 5, 2010

Page 2

Robotics in the Military: Technology and Applications

Ron Arkin

Page 3

Page 4

Robots for the Battlefield

• A South Korean robot platform is intended to detect and identify targets in daylight within a 4 km radius, or at night using infrared sensors within a 2 km range, providing either a lethal or a non-lethal response. The system does have an automatic mode in which it is capable of making the engagement decision on its own.

• iRobot, the maker of Roomba, is now providing versions of its PackBots capable of tasering enemy combatants; some versions are also equipped with the highly lethal Metal Storm weapon system.

• The SWORDS platform developed by Foster-Miller is already at work in Iraq and Afghanistan and is capable of carrying lethal weaponry (M240 or M249 machine guns, or a Barrett .50-caliber rifle). A new MAARS version is under development.

• Israel is considering deploying stationary robotic gun-sensor platforms along its borders with Gaza in automated kill zones, equipped with fifty-caliber machine guns and armored folding shields.

• The U.S. Air Force has created its first hunter-killer UAV, the MQ-9 Reaper, successor to the Predator and widely used in Afghanistan.

• China is developing the “Invisible Sword”, a deep-strike armed stealth UAV.

• Many other examples exist, both domestically and internationally.

Page 5

Page 6

Current Motivators for Military Robotics

• Force Multiplication: reduce the number of soldiers needed

• Expand the Battlespace: conduct combat over larger areas

• Extend the Warfighter’s Reach: allow individual soldiers to strike farther

• Reduce Friendly Casualties

The use of robotics for reducing ethical infractions in the military does not yet appear anywhere on this list.

Page 7

Samsung Techwin Korean DMZ Surveillance and Guard Robot

Page 8

War Robots: Concerns & Risks

Patrick Lin, Cal Poly, San Luis Obispo

Ed Barrett, US Naval Academy

Jason Borenstein, Georgia Tech

Page 9

Overview

Legal challenges

Just-war challenges

Technical challenges

Robot-human challenges

Societal challenges

Other and future challenges

Page 10

1. Legal Challenges

Unclear responsibility
• To whom would we assign blame (and punishment) for improper conduct and unauthorized harms caused by an autonomous robot (whether by error or intent)?
• The designers, the robot manufacturer, the procurement officer, the robot controller/supervisor, the field commander, a nation’s president or prime minister... or the robot itself?

Refusing an order
• If robots have better situational awareness, could they refuse legitimate orders (e.g., to attack a house in which they detect children)?

Page 11

1. Legal Challenges (cont’d)

Consent by soldiers to risk
• In 2007, a semi-autonomous robotic cannon malfunctioned, killing 9 friendly soldiers and injuring 14 others in the South African army.

Unclear designation of combatants
• Legal status of UAV operators in the U.S.: e.g., can they be attacked on their days off work?
• Legal status of civilians who work on robotic systems: e.g., are they combatants on the battlefront?

Page 12

2. Just-War Challenges

Attack decisions
• The increasing tempo of warfare may require split-second decisions that only computing machines can make.
• Having no “eyes on target” or “human in the loop” poses a risk of wrongful attack.

Lower barriers to war
• Fewer US deaths = lower political cost = more likely to go to war?
• But this could be said of any new offensive/defensive technology.
• Do robots enable us to do morally/legally questionable things that we otherwise wouldn’t do, e.g., the Pakistan strikes?

Page 13

2. Just-War Challenges (cont’d)

Imprecision of the Laws of War & Rules of Engagement
• Using LOW/ROE in programming is incomplete; e.g., the requirement to minimize civilian casualties doesn’t specify hard numbers.
• Similar to the unintended results in Asimov’s Laws of Robotics?

Halting conflict
• Given the nature of modern warfare, which individuals or groups of combatants have the authority to end hostilities?
• Will/can combatants surrender to robots (or to their operators or maintenance crew)? If so, what is the appropriate process for handling the situation?

Page 14

3. Technical Challenges

Discrimination among targets
• Too difficult? Requires contextual understanding.

Robots gone wild
• Malfunction, hacking, capture.

Unauthorized overrides
• How to prevent a rogue officer from improperly taking control of a robot?

Page 15

4. Human-Robot Challenges

Effect on squad cohesion
• The unblinking eye may erode the “band of brothers.”

Self-defense
• If robots have no such instinct, very expensive equipment may be captured or lost.

Winning hearts and minds
• Lasting/true peace may be hindered by using robots to control a population and to fight wars (shows a lack of respect?).

Page 16

5. Societal Challenges

Counter-tactics in asymmetrical warfare
• More desperate enemies = increased terrorism and other unconventional tactics?

Proliferation
• Other nations will eventually have war robots, just as with other weapons.

Space race
• Militarization of space increases space pollution, etc.

Civil security and privacy
• Military robots may turn into police/civilian security robots.

Page 17

6. Other/Future Challenges

The precautionary principle
• Slowing or halting work in order to address serious risks seems to make sense...
• ...but this is in tension with the pressure to use robots in the military.

Co-opting of ethics work by the military
• The military can justify work in robotics by saying that ethics is already being attended to.

Robot rights
• In the distant future, if robots have animal- or human-level intelligence.

Page 18

Current Governance Architecture

George R. Lucas, Jr.
Richard M. O’Meara

Page 19

Conventions in International Law for specific technologies

• The 1899 Hague Declaration concerning Expanding Bullets
• Convention on the Prohibition of the Development, Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on their Destruction (1972)
• Convention on the Prohibition of Military or Any Hostile Use of Environmental Modification Techniques (1976)
• Resolution on Small-Calibre Weapon Systems (1979)
• Protocol on Non-Detectable Fragments (Protocol I) (1980)
• Protocol on Prohibitions or Restrictions on the Use of Mines, Booby-Traps and Other Devices (Protocol II) (1980)
• Protocol on Prohibitions or Restrictions on the Use of Incendiary Weapons (Protocol III) (1980)

Page 20

Conventions in International Law for specific technologies, II

• Convention on the Prohibition of the Development, Production, Stockpiling and Use of Chemical Weapons and on their Destruction (1993)
• Protocol on Blinding Laser Weapons (Protocol IV to the 1980 Convention) (1995)
• Protocol on Prohibitions or Restrictions on the Use of Mines, Booby-Traps and Other Devices, as amended on 3 May 1996 (Protocol II to the 1980 Convention)
• Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on their Destruction (1997)
• Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, Amendment to Article 1 (2001)
• Protocol I Additional to the 1949 Geneva Conventions; Convention on Cluster Munitions (2008)

Source: ICRC, “International Humanitarian Law: Treaties and Documents”

Page 21

Five Core Principles: Int’l Humanitarian Law & LOAC

• Weapons prohibitions: unnecessary suffering or superfluous injury; otherwise:
• Military necessity
• Proportionality
• Discrimination
• Command responsibility

Page 22

Weapons Prohibitions

• Some weapons are patently inhumane

• Others are design-dependent (effects are reasonably foreseen)

• Thus, ICRC/SIrUS criteria would ban weapons when:

Page 23

Use of the weapon would foreseeably cause:
• A specific disease, a specific abnormal physiological state, a specific and permanent disability, or specific disfigurement; or
• Field mortality of more than 25% or hospital mortality of more than 5%; or
• Grade 3 wounds as measured by the Red Cross wound classification scale; or
• Effects for which there is no well-recognized and proven treatment.

Page 24

Military Necessity

• Promotes speedy end to hostilities
• Requires definition of victory
• Requires assessment of intent or capacity of enemy

Page 25

Proportionality

• Of considerable concern to the innovator or user of new technologies

• Is foreseeable destructive capacity “disproportionate” to military objective?

• (Old Saw regarding new technologies: “necessity always trumps proportionality”)

Page 26

Discrimination

An attack is indiscriminate if it:
• Is not directed against a specifically military objective
• Employs a means or method which cannot be limited to a military objective
• Is likely to strike civilian and military targets without distinction

• (Ron Arkin’s argument: autonomous robots are likely superior to humans in this respect)

Page 27

Command Responsibility

• Liability for illegal actions (Trial of Gen. Yamashita)
• Constrains both the actions of soldiers and the orders and jurisdiction of their commanding officers
• (Rob Sparrow’s objection to autonomous robots: no meaningful accountability is possible)

Page 28

Other General Governance Provisions or Principles

• Article 36 of the 1977 Additional Protocol I to the Geneva Conventions of 1949
• “Universal jurisdiction”
• “Lawfare”

Page 29

Provisions for Good Governance (O’Meara)

• Attempts are clearly defined
• Proposals or solutions are realistic
• Holistic, involving all stakeholders in crafting legislation
• Subject to assessment of effectiveness, and amendment

Page 30

Goal of Technology Governance

• Respect long-term effects
• Consider ramifications of actions
• Promote consumer/user awareness of these ramifications

Page 31

Professional Codes

• Alternative to conventional international law that satisfies these criteria

• Promote (and require) sound professional judgment

• Promote best practices
• Define boundaries of acceptable professional practice

Page 32

Berkeley Engineers’ Code

I promise to work for a BETTER WORLD where science and technology are used in socially responsible ways. I will not use my EDUCATION for any purpose intended to harm human beings or the environment. Throughout my career, I will consider the ETHICAL implications of my work before I take ACTION. While the demands placed upon me may be great, I sign this declaration because I recognize that INDIVIDUAL RESPONSIBILITY is the first step on the path to PEACE.

Page 33

Legally Binding International Agreements and Other Instruments that Provide Relevant Lessons or Precedent

Orde Kittrie

Page 34

ICRAC

• The International Committee for Robot Arms Control (ICRAC), founded in September 2009
– Goal: campaign for limiting lethal autonomous robots through an international agreement modeled on existing arms control agreements
– e.g., those restricting nuclear and biological weapons

• ICRAC has called for military robots to be banned from space and said no robotic systems should carry nuclear weapons.

Page 35

Arms Control Agreements: Types of Restrictions

• Existing legally binding arms control agreements and other instruments include a wide variety of restrictions on targeted weapons, including prohibitions and limitations (restrictions that fall short of prohibition) on:
– acquisition
– research and development
– testing
– deployment
– transfer/proliferation
– use

Page 36

Arms Control Agreements: Form/Scope

• Legally binding multilateral agreements (most common)

• Legally binding bilateral agreements

• Legally binding resolutions of the United Nations Security Council

Page 37

Relevant Precedents?

• Nuclear Nonproliferation Treaty
• Comprehensive Nuclear Test Ban Treaty
• Limited Test Ban Treaty
• United Nations Security Council Resolution 1540
• Chemical Weapons Convention
• Biological Weapons Convention
• Mine Ban Treaty
• Inter-American Convention on Transparency in Conventional Weapons Acquisitions
• The Strategic Offensive Reductions Treaty
• The Conventional Armed Forces in Europe Treaty
• Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons which may be Deemed to be Excessively Injurious or to have Indiscriminate Effects (the CCW)

Page 38

Soft Law Approaches

Gary Marchant
Lyn Gulley

Page 39

Transitions in International Oversight of Technology

• Regulation → Governance
– From top-down, government-imposed regulation to public-private partnerships, collaborations, etc.

• Hard Law → Soft Law
– From enforceable legal agreements to guidelines, codes of conduct, principles

Page 40

Advantages of Soft Law

• Voluntary; cooperative

• Reflexive

• Can be adopted or revised relatively quickly

• Many different approaches can be tried simultaneously

• Can be gradually “hardened” into more formal regulatory oversight

Page 41

Codes of Conduct

• Synthetic Biology
– U.S. Government
– International Association Synthetic Biology (IASB)
– International Gene Synthesis Consortium (IGSC)

• Nanotechnology
– Foresight Institute Guidelines
– Responsible Nanocode
– EU Code of Conduct for Nanoresearchers

• Biotechnology
– Asilomar Guidelines
– 2006 Review Conference for the Biological and Toxin Weapons Convention

Page 42

Transgovernmental Dialogue

• Pharmaceuticals
– International Conference on Harmonization (ICH)

• Nanotechnology
– OECD Working Group

• Arms Control
– Australia Group

Page 43

Framework Convention

• International agreement negotiated by states
• Establishes institutions, processes and procedures
• Minimal (if any) substantive content at first
• Encourages broad participation by as many states as possible
• Builds trust
• Gradually adds substance in the form of protocols

Page 44

Benefits of Framework Conventions

• “In sum, the FC-protocol approach allows states to put in place activities and procedures designed to reduce scientific and technical uncertainty about a problem, and then to act incrementally to address that problem or particular aspects of it, as their knowledge and understanding grow. Politically, the substantive weakness of the original FC helps to attract the broadest possible participation, even if the commitment of some participants is weak or even insincere; as the process unfolds, the aim typically is to enmesh the participants in a process of social learning that will lead them to accept stronger commitments commensurate with the evolving understanding of the problem.”

Abbott, Marchant, et al., 2006

Page 45

Information Sharing