NEWS
Human rights experts, activists push for ban on 'killer robots'

“Campaign to Stop Killer Robots.” That may sound like a clique of conspiracy theorists or the title of a summer B movie, but it's actually an alliance of human rights groups raising legal and ethical concerns about people's willingness to cede life-and-death decisions to computers.

Who is responsible if an armed robot fails to distinguish between civilians and combatants when unleashing lethal force against a target that meets its programmed criteria?

And how, skeptics wonder, can a “fully autonomous weapon” be taught to recognize soldiers attempting to surrender or those already wounded and no longer a threat?

If national military forces can rely on machines to take on the front-line hazards of armed combat, will that reduced risk of human casualties remove an important deterrent to waging war?

The Campaign to Stop Killer Robots was joined Thursday by a diverse array of peace advocates and diplomats at a session of the U.N. Human Rights Council in calling for reflection on the wisdom of creating lethal technology that operates without human oversight -- and for agreed rules governing its use.

“Their deployment may be unacceptable because no adequate system of legal accountability can be devised and because robots should not have the power of life and death over human beings,” the United Nations’ watchdog on extrajudicial killings, Christof Heyns, told the council.

In calling for U.N. member nations to freeze development of robotic weapons “while the genie is still in the bottle,” Heyns warned of the risk of rapidly advancing technology outpacing political and moral consideration of unintended consequences.

In a 22-page report submitted to the U.N. rights forum, Heyns detailed the precursors to “fully autonomous weapons” already in operation:

-- Soldier-robots patrol the demilitarized zone between North and South Korea, and though remotely commanded by humans now, the programmed sentinels from Samsung Techwin are equipped with an automatic option.

-- The U.S. Navy launched an unmanned jet this month, the X-47B stealth drone developed by Northrop Grumman. Like generations of aerial drones that came before it, the X-47B is being billed as a surveillance tool. But it also has the capacity to carry more than 4,000 pounds of munitions.

-- Israel’s Harpy combat drone is designed to detect, attack and destroy radar emitters and suppress enemy air defenses.

-- Britain’s BAE Systems has developed its Taranis superdrone, which can autonomously search, locate and identify enemy targets. The device requires human authorization to fire, but it has the technological capability of determining on its own when to attack or respond.

Existing drone technology has stirred plenty of controversy and frustrated relations between the United States, its foremost developer and user, and countries like Pakistan, Afghanistan and Yemen, where airstrikes and targeted killings have inflicted “collateral damage,” the military euphemism for civilian casualties.

Getting the international community united on ground rules for fully autonomous weapons is likely to pose at least as great a challenge as balancing the pros and cons of using drones, but it is one that legal experts contend isn't beyond the realm of possibility.

There is already significant recognition among the technologically advanced countries that there should be limits to the degree to which computerized systems can take action without human involvement, said Bonnie Docherty, a Harvard Law School lecturer and senior instructor at its International Human Rights Clinic. The clinic co-wrote a report with Human Rights Watch late last year, “Losing Humanity: The Case Against Killer Robots,” on the hazards of leaving battlefield decisions to machines.

Docherty pointed to the Pentagon’s November directive that fully autonomous weapons would be banned for the foreseeable future except to apply non-lethal or non-physical force, such as some forms of electronic attack.

Steve Goose, arms division director at Human Rights Watch, told journalists covering the U.N. meeting in Geneva this week that several governments have expressed willingness to take the lead in getting a global moratorium on lethal robotics in place.

The burgeoning alliance against “killer robots” is hopeful that world leaders can be brought together on the need for keeping humans in control.

“There is a good chance of success because we are trying to act preemptively, to prevent states from investing so much in this technology that they don’t want to give it up,” said Docherty.

M. Ryan Calo, a University of Washington law professor with expertise in robotics and data security, notes that there are upsides to robotic warfare, like the speed at which computers can make decisions and their ability to approach problem-solving in ways that are beyond humans.


(2013-06-01/latimes)

 
  2009 2010 2011 2012 2013
 
05/29: Burma: Revoke ‘Two-Child Policy’ for Rohingya (hrw)
05/29: US: Take Lead Against Lethal Robotic Weapons (hrw)
05/30: UN Human Rights Council talks killer robots (abc)
05/30: Greece: Strengthen Response to Racist Violence (hrw)
05/31: Angola: Police Disrupt New ‘Disappearances’ Protest (hrw)
05/31: Human rights experts, activists push for ban on ‘killer robots’ (latimes)
Human Rights Learning Studio
Location: Kaohsiung MRT O5/R10 Formosa Boulevard Station, dome concourse, toward Exit 9
Address: No. 436, Daye North Rd., Siaogang Dist., Kaohsiung City 81249, Taiwan
Tel: 886-7-2357559 | Fax: 886-7-2351129
Email: hr-learning@ouk.edu.tw