Campaign to Stop Killer Robots!

Started by jimmy olsen, April 29, 2013, 07:35:49 PM


jimmy olsen

These "activists" are just a front for Sarah Connor's luddite terrorist organization!  :mad:

http://www.nbcnews.com/technology/futureoftech/activists-un-put-killer-robots-crosshairs-6C9633925
Quote: Activists, UN put 'killer robots' in the crosshairs

Nearly every fighting ship in the U.S. Navy carries a Phalanx defense system, a computerized Gatling gun set on a six-ton mount that uses radar to spot targets flying out of the sky, or cruising across the ocean's surface. Once it "evaluates, tracks, engages and performs a kill assessment," a human gives the order to rattle off 4,500 rounds per minute.

This sort of "supervised" automation is not out of the ordinary. When Israel's "Iron Dome" radar spots incoming missiles, it can automatically fire counter-missiles to intercept them. The German Air Force's Skyshield system can now also shoot down its targets with very little human interaction.

For years, "sniper detectors" have pointed telltale lasers at shooters who are firing on troops; DARPA is even working on a version that operates "night and day" from a moving military vehicle that's under fire. Meanwhile, sniper rifles themselves are getting smarter: In the case of the TrackingPoint precision guided firearm, the operator pulls the trigger, but the gun's built-in computer decides when the bullet flies.

"We are not in the 'Terminator' world and we may never reach there," says Peter Singer, author of "Wired for War" and director of the Center for 21st Century Security and Intelligence at the Brookings Institution. "But to say there isn't an ever increasing amount of autonomy to our systems — that's fiction."

Preparing for a future in which robots may be given a tad more independence, an international coalition of human rights organizations, including Human Rights Watch, is banding together to propose a treaty ban on "killer robots."

The Campaign to Stop Killer Robots publicly launched April 23 with the goal of bringing the discussion about autonomous weapons systems to regular people, not just politicians and scientists. Also this month, the United Nations Special Rapporteur recommended a suspension of autonomous weapons — or "lethal autonomous robotics" — until their control and use are discussed in detail. But critics of those reports argue that it's too early to call for a ban because the technology in question does not yet exist. Others say this is the reason to start talking now.

"Our feeling is that [it is] morally and ethically wrong that these machines make killing decisions rather than humans [making] killing decisions," Stephen Goose, director of the arms division at Human Rights Watch, told NBC News.

The group clarifies that it isn't anti-robot, or anti-autonomy — or even anti-drone. It's just that when a decision to kill is made in a combat situation, they want to ensure that decision will always be made by a human being.

Goose says the title of the new campaign is deliberately provocative and designed to catalyze conversation. He said, "If you have a campaign to stop 'fully autonomous weapons,' you will fall asleep."

"The problem with modern robotics is there's no way a robot can discriminate between a civilian and a soldier," said Noel Sharkey, a professor of artificial intelligence and robotics at the University of Sheffield in the U.K. and an outspoken advocate for "robot arms control." "They can just about tell the difference between a human and a car."

But a treaty prohibition at this time is unnecessary and "might even be counterproductive," cautions Matthew Waxman, a national security and law expert at Columbia Law School. Waxman told NBC News that he anticipates a day when robots may be better than human beings at making important decisions, especially in delicate procedures like surgeries.

"In some of these contexts, we are going to decide not only is it appropriate for machines to operate autonomously, we may demand it, because we are trying to reduce human error," said Waxman.

Michael Schmitt, professor and chairman of the international law department at the U.S. Naval War College, told NBC News that a ban now, as a matter of law, is a "bad idea." When Human Rights Watch wrote a 50-page report on the future of robotic warfare, Schmitt wrote a rebuttal in Harvard's National Security Journal. His main argument: "International humanitarian law's restrictions on the use of weapons ... are sufficiently robust to safeguard humanitarian values during the use of autonomous weapon systems."

Singer, whose work has made him an ombudsman in the growing debate over robotic warfare, says that now is the time to talk — now, when Google cars are guiding themselves through San Francisco's streets and algorithm-powered stock trading accounts crash markets based on keywords.

Singer thinks the debate needs to gain traction before governments and big companies become invested in the technology — and begin to influence the direction of policy. "People aren't pushing for more autonomy in these systems because it is cool. They're pushing for it because companies think they can make money out of it," he said.

Autonomous weapon systems that can operate independently are "not centuries away," Singer told NBC News. "We're more in the years and decades mode."
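The TrackingPoint description above ("the operator pulls the trigger, but the gun's built-in computer decides when the bullet flies") boils down to a simple gating rule: the trigger pull expresses human intent, and the computer only releases the shot when alignment is good. A minimal sketch, with every function name, unit, and threshold invented purely for illustration:

```python
# Purely illustrative sketch of a TrackingPoint-style "precision guided
# firearm" gate: the human's trigger pull expresses the intent to fire,
# but the computer releases the shot only when the barrel is aligned with
# the computed firing solution. All names and values here are hypothetical.

def should_release(trigger_pulled: bool,
                   barrel_angle: float,
                   solution_angle: float,
                   tolerance: float = 0.05) -> bool:
    """Return True only when the operator holds the trigger AND the barrel
    is within tolerance of the firing solution."""
    if not trigger_pulled:
        return False  # the human's intent is always required
    return abs(barrel_angle - solution_angle) <= tolerance

# The human decides *whether* to fire; the computer decides *when*.
print(should_release(True, 10.02, 10.00))   # aligned: shot released
print(should_release(True, 11.00, 10.00))   # off target: shot is held
print(should_release(False, 10.00, 10.00))  # no trigger pull: never fires
```

Note that in this arrangement the human remains "in the loop" in the campaign's sense: the machine can delay a shot but can never initiate one.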
It is far better for the truth to tear my flesh to pieces, than for my soul to wander through darkness in eternal damnation.

Jet: So what kind of woman is she? What's Julia like?
Faye: Ordinary. The kind of beautiful, dangerous ordinary that you just can't leave alone.
Jet: I see.
Faye: Like an angel from the underworld. Or a devil from Paradise.
--------------------------------------------
1 Karma Chameleon point

Darth Wagtaros

PDH!

jimmy olsen

Can only be watched in the United States! :weep:

11B4V

Quote: German Air Force's Skyshield system

:o, Oh, Skyshield. That was a close one.
"there's a long tradition of insulting people we disagree with here, and I'll be damned if I listen to your entreaties otherwise."-OVB

"Obviously not a Berkut-commanded armored column.  They're not all brewing."- CdM

"We've reached one of our phase lines after the firefight and it smells bad—meaning it's a little bit suspicious... Could be an amb—".

jimmy olsen

Quote from: 11B4V on April 30, 2013, 02:07:18 AM
Quote: German Air Force's Skyshield system

:o, Oh, Skyshield. That was a close one.
:o
http://en.wikipedia.org/wiki/Skynet_%28satellite%29
Quote: Skynet is a family of military satellites, now operated by Astrium Services on behalf of the UK Ministry of Defence, which provide strategic communication services to the three branches of the British Armed Forces and to NATO forces engaged on coalition tasks.

Warspite

Professor Noel Sharkey knows balls-all about how militaries conduct their affairs, and indeed from analysis I've read of his, knows less about robotics than should be the case. It's infuriating how he keeps getting wheeled out for comment on anything to do with autonomous weapon systems in the press.

There are very good arguments against fully autonomous weapon systems. But the killer robots campaign is not making them.
" SIR – I must commend you on some of your recent obituaries. I was delighted to read of the deaths of Foday Sankoh (August 9th), and Uday and Qusay Hussein (July 26th). Do you take requests? "

OVO JE SRBIJA
BUDALO, OVO JE POSTA

mongers

Quote from: Warspite on April 30, 2013, 05:22:09 AM
Professor Noel Sharkey knows balls-all about how militaries conduct their affairs, and indeed from analysis I've read of his, knows less about robotics than should be the case. It's infuriating how he keeps getting wheeled out for comment on anything to do with autonomous weapon systems in the press.

There are very good arguments against fully autonomous weapon systems. But the killer robots campaign is not making them.

So, please could you outline some of those, in the interests of this debate?
"We have it in our power to begin the world over again"

Iormlund

Quote from: nbc
But a treaty prohibition at this time is unnecessary and "might even be counterproductive," cautions Matthew Waxman, a national security and law expert at Columbia Law School. Waxman told NBC News that he anticipates a day when robots may be better than human beings at making important decisions, especially in delicate procedures like surgeries.

"In some of these contexts, we are going to decide not only is it appropriate for machines to operate autonomously, we may demand it, because we are trying to reduce human error," said Waxman.

I said quite a while ago that this will happen with cars. Once they become statistically better drivers than humans, either your insurance or the law will prevent you from driving manually.

Warspite

Quote from: mongers on April 30, 2013, 07:12:01 AM
Quote from: Warspite on April 30, 2013, 05:22:09 AM
Professor Noel Sharkey knows balls-all about how militaries conduct their affairs, and indeed from analysis I've read of his, knows less about robotics than should be the case. It's infuriating how he keeps getting wheeled out for comment on anything to do with autonomous weapon systems in the press.

There are very good arguments against fully autonomous weapon systems. But the killer robots campaign is not making them.

So, please could you outline some of those, in the interests of this debate?

The procurement chain: given that few states actually manufacture high-tech armaments, most would buy autonomous systems off-the-shelf. If accountability for the action of autonomous systems lies with the commander and the software programmer, there are obvious complications in multi-national procurement chains (or, indeed, the reality these days of many separate contractors contributing to even small component systems).

Related to the point above, I don't think there's yet a definitive answer on where moral accountability lies for a mistake by an autonomous weapon system. Is it the commander who orders the strike, or the team who developed the decision-making coding? Or both, depending on the circumstances?

One could probably qualify these by saying they're not insurmountable hurdles.

On the other hand, the criticism that "killer robots", normally illustrated with a picture of the Terminator, can't show compassion or make moral decisions like humans do is not the strongest point to make:



And there are plenty of human-operated weapon systems where, due to the nature of the mission, the objective is pre-programmed, so moral compliance rests with a party remote from the payload delivery team - e.g. the intelligence that generates an aerial sortie or artillery fire mission.

In other words, the "killer robots" campaign does not actually show how autonomous weapon systems necessarily cannot fulfil the criteria of proportionality and discrimination that are the key tests for whether a weapon system is acceptable under International Humanitarian Law.

The funny thing is, if you actually talk to people in the military who have the job of integrating these sorts of potential systems into doctrine and concepts, they're emphatic that no one is thinking of getting machines that don't have a human in the kill-decision. So as one guy put it on Twitter, this campaign is like trying to ban fusion-powered armoured unicorns.

mongers

Quote from: Warspite on April 30, 2013, 08:24:11 AM

Cheers.

Viking

Don't forget this is a red herring (over and above the fact that independent weapons do not exist, nor is there any plan to create them). We have mass murder on a massive scale going on right now as we speak, and the only robots involved are the ones making sure that the most murderous of the psychopaths on the ground don't get their hands on chemical weapons.

There are real issues involved here when setting rules of engagement for robots, and they need to be discussed and thought through. The problem is that there is a whole wing of opinionators who are not interested in the real issues but seek a ban based on an inability to understand or comprehend how these things work.

Plus, I have a hard time seeing the moral difference between a robotic AI shooting up a free-fire zone and a minefield in the same area.
First Maxim - "There are only two amounts, too few and enough."
First Corollary - "You cannot have too many soldiers, only too few supplies."
Second Maxim - "Be willing to exchange a bad idea for a good one."
Second Corollary - "You can only be wrong or agree with me."

A terrorist which starts a slaughter quoting Locke, Burke and Mill has completely missed the point.
The fact remains that the only person or group to applaud the Norway massacre are random Islamists.

11B4V

Quote from: jimmy olsen on April 30, 2013, 03:28:03 AM
Quote from: 11B4V on April 30, 2013, 02:07:18 AM
Quote: German Air Force's Skyshield system

:o, Oh, Skyshield. That was a close one.
:o
http://en.wikipedia.org/wiki/Skynet_%28satellite%29
Quote: Skynet is a family of military satellites, now operated by Astrium Services on behalf of the UK Ministry of Defence, which provide strategic communication services to the three branches of the British Armed Forces and to NATO forces engaged on coalition tasks.

Damned English will doom us.  :mad:

The Minsky Moment

Fusion powered armoured unicorns should be banned.  Armoring unicorns might be acceptable if only virgins are involved in the installation.  But forcing unicorns to cross the nuclear threshold is an unacceptable violation of fantastic animal rights.
The purpose of studying economics is not to acquire a set of ready-made answers to economic questions, but to learn how to avoid being deceived by economists.
--Joan Robinson

jimmy olsen

The campaign to ban fusion powered armored unicorns picks up momentum!

http://www.nbcnews.com/technology/terminator-hold-debate-stop-killer-robots-takes-global-stage-8C11433704
Quote: 'Terminator' on hold? Debate to stop killer robots takes global stage

Nidhi Subbaraman, NBC News


A proposal to pause the development of "killer robot" technology is seeing a surge of interest from robotics researchers as well as the representatives of key nations at the United Nations this month.

At a UN General Assembly First Committee on Disarmament and International Security side event Monday, mission delegates from Egypt, France, and Switzerland voiced an interest in regulating "killer robots" — completely autonomous weapon systems — in warfare. They are some of the first international voices backing ideas that the Human Rights Watch and Campaign to Stop Killer Robots have been championing for about a year.

But before deliberations about regulating killer robots can take place, experts say they want more transparency from governments already using semi-autonomous systems, like the Phalanx naval weapon system, that to a degree can fire on their own, without a human "pulling the trigger."

"We are not luddites, we are not trying to stop the advance of robotics," Jody Williams, 1997 Nobel Peace Prize laureate, and one of the panelists at Monday's UN event, said. But, "I don't want to see robots operating on their own, armed with lethal weapons."

The Campaign to Stop Killer Robots launched in April this year, and calls for a ban on weapon systems that can make target and kill decisions without a human "in the loop." The launch followed a detailed report published by the HRW on the dangers of future "killer robot" technologies. Christof Heyns, UN Special Rapporteur, presented a report on lethal autonomous weapons at the UN Human Rights Council in June this year. In it, he called for a ban on "certain aspects" of killer robots, and encouraged policy discussion about how to regulate them, at a national and international level.

And one future forum for discussion has been proposed. Anais Laigle, First Secretary and representative from the France Permanent Mission to the UN, said Monday that killer bots will be "included in the agenda" at the Convention on Certain Conventional Weapons in November this year, a meeting chaired by France.

At Monday's event, delegates from Egypt and Switzerland also indicated an interest in discussing the development and regulation of lethal autonomous weapons technologies.

In a statement released earlier this month, Pakistan UN representatives said that the development of drones and killer robots "need to be checked and brought under international regulation," and Egypt agreed that regulations are needed before killer robots "are to be developed and/or deployed." Austria and France have mentioned an interest in regulating killer robots, and Algeria, Brazil, Germany, Morocco and the United States have raised their hands as well.

It's not just policy makers. As of last week, more than 270 researchers signed a statement backing a ban on developing or using weapons that fire without human decision.

Some scholars, like Matthew Waxman at Columbia Law School and Kenneth Anderson at American University, are opposed to that statement, arguing that a treaty ban is "unnecessary and dangerous." Autonomous systems are in our future, and if governments don't use them — perhaps in a regulated way — they'll fall to use by rebel groups and non-government actors. Also, sophisticated weapons systems could one day be better than humans at locating targets, they say.

But as a starting point for discussion, more information would help, Richard Moyes, a managing partner at the non-profit organization Article 36, said at Monday's panel. He wants governments to share information about how existing semi-autonomous weapons and operations work. So, rather than considering hypothetical "Terminator" scenarios, the data "will help us have a more concrete debate going forward," he said.

But secrecy has been a hallmark of the drones program in the US, where the capabilities and operations, and laws and policies governing the use of those systems are kept under wraps. "One of the biggest concerns" for Sarah Knuckey, professor of human rights law at NYU, who advised the Special Rapporteur Heyns, is that the secrecy will continue.

"If the US carries on this very non-transparent track," she told NBC News, "We're not going to know what laws are going to be programmed [into the robots], and where they're going to be used."

More transparency is exactly what Williams and the team from the Campaign to Stop Killer Robots want, too. "I don't like my tax dollars being used on weapons that are not even discussed in the public domain," she said on Monday. "We have every right and every responsibility to have a public discussion as to where war is going."
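The "human in the loop" distinction that the proposed ban turns on can be pictured as a gate in an otherwise automated engagement loop: detection and classification run unattended, but the engage decision requires an explicit human approval. The sketch below is entirely hypothetical (invented names, no real weapon system or API is being modeled); it exists only to make the distinction concrete:

```python
# Hypothetical illustration of the "human in the loop" distinction:
# detection, tracking and classification may run autonomously, but the
# engage decision passes through an explicit human approval gate.

from dataclasses import dataclass

@dataclass
class Track:
    track_id: int
    classified_hostile: bool

def propose_engagements(tracks):
    """Autonomous part: sensing and classification run unattended."""
    return [t for t in tracks if t.classified_hostile]

def engage(tracks, human_approval):
    """Human-in-the-loop part: nothing is engaged without an explicit
    approval for that specific track. A fully autonomous system would be
    this same loop with the approval check removed, which is precisely
    what the campaign wants to prohibit."""
    fired = []
    for t in propose_engagements(tracks):
        if human_approval(t):  # the kill decision stays with a person
            fired.append(t.track_id)
    return fired

tracks = [Track(1, True), Track(2, False), Track(3, True)]
# An operator who approves only track 1:
print(engage(tracks, lambda t: t.track_id == 1))  # -> [1]
```

The campaign's position, in these terms, is not that `propose_engagements` should be banned, but that `engage` must never be allowed to run without its approval callback.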

Ideologue

And when they grab you with those metal claws, you can't break free.  Because they're made of metal, and robots are strong.
Kinemalogue
Current reviews: The 'Burbs (9/10); Gremlins 2: The New Batch (9/10); John Wick: Chapter 2 (9/10); A Cure For Wellness (4/10)