Should we build a horde of rampaging killbots?

Started by jimmy olsen, April 15, 2015, 06:49:48 PM


Should we build armies of autonomous killbots?

Yes, it worked out fine for the Twelve Colonies of Kobol
12 (52.2%)
No, look how bad Skynet turned out
5 (21.7%)
We just need to program the killbots with a kill limit
6 (26.1%)

Total Members Voted: 22

jimmy olsen

http://www.nbcnews.com/tech/tech-news/jody-williams-helped-ban-landmines-can-she-stop-killer-robots-n340661

Quote
Jody Williams Helped Ban Landmines. Can She Stop Killer Robots?

By Keith Wagstaff

Jody Williams is on a mission to stop killer robots. The Nobel Peace Prize winner wants an international treaty forbidding machines that can target and kill human beings without requiring a person to pull the trigger.

This week in Geneva, she is part of a group meeting with United Nations delegates who are trying to answer the question, "Do nations regulate killer robots when they arrive or ban them before they can do any damage?"

Sound like science fiction? Drones like those used by the United States already have the capability to shoot Hellfire missiles from the sky. But they are still controlled by a soldier in front of a screen. It's very possible that humans could be removed from that equation, a prospect that worries Williams, who jointly won the Nobel Peace Prize in 1997 for leading the International Campaign to Ban Landmines.

"I grew up during the Cold War," Williams told NBC News from Geneva. "The only thing I wanted my family to have was a bomb shelter, just in case the Soviet Union attacked us."

"I find the very idea of killer robots more terrifying than nukes. Where is humanity going if some people think it's OK to cede the power of life and death of humans over to a machine?"

Williams hopes that she and the Campaign to Stop Killer Robots, which she helped found in 2013, can persuade members of the U.N. to sign an international treaty banning the use of lethal autonomous weapons.

Right now, the U.N. is simply laying the groundwork for more official proceedings, which could eventually lead to a ban. It's not clear, however, if technology will wait for lawmakers to catch up.

Beyond science fiction

There are more than 80 countries with military robotics programs, according to P. W. Singer, senior fellow at the New America Foundation and author of "Wired for War: The Robotics Revolution and Conflict in the 21st Century." The White House recently announced that the U.S. would sell drones to allied nations.

So while "Terminator"-style robots aren't wandering around battlefields yet, that doesn't mean countries aren't trying to develop them.

Samsung Techwin's SGR-A1 robots in South Korea have the ability to autonomously fire on people walking across the DMZ. The Navy's X-47B stealth drones have not been programmed to target humans, but they're smart enough to autonomously take off and land on the deck of an aircraft carrier. Israel Aerospace Industries' Harpy drones can hunt down and crash into missiles without human intervention.

There are even cars that have taken cross-country road trips without someone behind the wheel. The momentum, in the private and military sectors, seems to be toward autonomy.

Experts interviewed for this story were hesitant to say exactly when we might see killer robots on the battlefield. Many believe, however, that it could be very soon. On Monday in Geneva, computer scientist Stuart Russell made his own prediction.

Others agree that this is an issue that we could be dealing with in the near future.

"It's hard for me to imagine an Air Force officer being driven to Creech Air Force Base in Nevada in his Google self-driving car," Singer said, "and then getting out to use a remotely operated drone."

From landmines to killer robots

When Williams helped launch the International Campaign to Ban Landmines in 1992, it wasn't hard to find people who had been affected by them. A Human Rights Watch report from the following year estimated that at least one in 236 people in Cambodia had lost a limb to a landmine.

Hundreds of thousands of people around the world had been killed by them, including U.S. soldiers, whom Williams represented while working for the Vietnam Veterans of America Foundation. (The U.S. is not one of the 162 countries that have joined the Ottawa Convention banning anti-personnel landmines.)

Landmine explosions provide a lot of visceral stories to sway hearts and minds. Lethal autonomous weapons, on the other hand, have not killed anybody. Yet, in her opinion, they pose a much greater threat.

"A landmine sits there and if someone steps on it, it blows up," she said. "The landmine isn't out targeting and killing people."

So how do you convince people that something that doesn't exist yet is worth banning? She points to the civilian casualties caused by drone strikes, including a 2013 incident in Yemen in which witnesses say a drone killed 12 people attending a wedding party.

The Campaign to Stop Killer Robots — which includes members from 54 non-governmental organizations, such as Pax Christi International and Amnesty International — argues that no technological safeguard can guarantee a robot won't kill civilians, whether through technical errors or indiscriminate algorithms. And that assumes the robots aren't used by terror groups, or by nation-states against their own citizens.

Then there is the question of who is legally liable when a drone kills somebody. Is it the programmer? The military commander in charge of its upkeep? The country that purchased it?

Not everybody thinks that banning lethal autonomous weapons is the answer.

"There are very serious dangers to the proliferation of this technology," Matthew Waxman, a professor at Columbia Law School, told NBC News. "I'm just not persuaded that a blanket prohibition is the right approach."

He believes that robots could potentially make warfare safer for civilians. Facial recognition software, advanced targeting systems, non-lethal projectiles and other technological advances could lead to fewer deaths, he said.

There are many programs, like missile defense systems, that already have a large degree of autonomy when it comes to targeting and firing. Defining what counts as a "lethal autonomous weapon" would be difficult, he said, and a blanket ban could end up stifling technologies that could save lives.

Williams, however, thinks the issue is much more black and white.

"We have nothing against robotics," she said. "We are against machines that — on their own — can target and kill human beings. That's a pretty clear line."

The road ahead

As in the early days of the International Campaign to Ban Landmines, the Campaign to Stop Killer Robots is a hodgepodge of non-profit organizations with no common budget.

Williams said the group needs to do "some serious fundraising" to put more pressure on countries to ban lethal autonomous weapons.

In Geneva, Elizabeth Quintana of the U.K. military think tank RUSI has already called a ban premature, echoing the opinion of the British government. Many countries are taking a wait-and-see approach.

Preemptive bans are relatively rare, but not unprecedented. Laser weapons designed to permanently blind people were added to the Convention on Certain Conventional Weapons (CCW) in 1995 despite the fact that they were never seen in combat.

Williams hopes that this week's meeting will lead to formal talks next year and eventually the addition of lethal autonomous weapons to the CCW.

"If we come out of this and don't see forward momentum, then we are going to have to rethink our strategy," she said. The landmine ban was created after the Canadian government, frustrated by the lack of progress by the U.N., invited countries to Ottawa to hammer out the text of an international treaty. Something similar, Williams said, could happen with killer robots.

For now, she bristles at the idea that lethal autonomous weapons will sooner or later show up in combat, even with the slow pace of politics and the much faster speed of technology.

"People keep saying that it's inevitable," she said. "Nothing is inevitable. It's only inevitable if you sit on your butt and don't take action to stop things you think are morally and ethically wrong."

First published April 15th 2015, 11:24 pm
It is far better for the truth to tear my flesh to pieces, than for my soul to wander through darkness in eternal damnation.

Jet: So what kind of woman is she? What's Julia like?
Faye: Ordinary. The kind of beautiful, dangerous ordinary that you just can't leave alone.
Jet: I see.
Faye: Like an angel from the underworld. Or a devil from Paradise.
--------------------------------------------
1 Karma Chameleon point

Darth Wagtaros

No. In fact we should encourage their manufacture.
PDH!

jimmy olsen

Quote from: Darth Wagtaros on April 15, 2015, 07:20:10 PM
No. In fact we should encourage their manufacture.
Then vote yes to my poll!

Monoriu

So it is not ok for a robot to pull the trigger but ok for a human to do so?  I don't get the reasoning.  Humans can lose their minds, go crazy, betray their own sides, kill people just for fun, panic, etc.  Why are humans inherently better than machines?

Valmy

Quote from: Monoriu on April 15, 2015, 07:36:09 PM
So it is not ok for a robot to pull the trigger but ok for a human to do so?  I don't get the reasoning.  Humans can lose their minds, go crazy, betray their own sides, kill people just for fun, panic, etc.  Why are humans inherently better than machines?

Because humans will have at least some loyalty to their home populations and have to be catered to a bit. A tyrant can rule with an army of killer robots with no checks on his or her power. This would mean any reason to fear the general population and be concerned about the loyalty of the army would be nil. That can't be good.
Quote
"This is a Russian warship. I propose you lay down arms and surrender to avoid bloodshed & unnecessary victims. Otherwise, you'll be bombed."

Zmiinyi defenders: "Russian warship, go fuck yourself."

MadImmortalMan

Everything will have negative externalities. Even killbots. Just part of the process.
"Stability is destabilizing." --Hyman Minsky

"Complacency can be a self-denying prophecy."
"We have nothing to fear but lack of fear itself." --Larry Summers

Monoriu

Quote from: Valmy on April 15, 2015, 07:41:01 PM
Quote from: Monoriu on April 15, 2015, 07:36:09 PM
So it is not ok for a robot to pull the trigger but ok for a human to do so?  I don't get the reasoning.  Humans can lose their minds, go crazy, betray their own sides, kill people just for fun, panic, etc.  Why are humans inherently better than machines?

Because humans will have at least some loyalty to their home populations and have to be catered to a bit. A tyrant can rule with an army of killer robots with no checks on his or her power. This would mean any reason to fear the general population and be concerned about the loyalty of the army would be nil. That can't be good.

A human army can also be loyal to a tyrant. 

I think it is not rational to require that killer robots be perfect before we adopt them.  They only need to be better than humans.   

Eddie Teach

Quote from: jimmy olsen on April 15, 2015, 07:20:55 PM
Quote from: Darth Wagtaros on April 15, 2015, 07:20:10 PM
No. In fact we should encourage their manufacture.
Then vote yes to my poll!

You should know better than to make thread title and poll ask opposite questions.
To sleep, perchance to dream. But in that sleep of death, what dreams may come?

jimmy olsen

Quote from: Peter Wiggin on April 15, 2015, 07:58:11 PM
Quote from: jimmy olsen on April 15, 2015, 07:20:55 PM
Quote from: Darth Wagtaros on April 15, 2015, 07:20:10 PM
No. In fact we should encourage their manufacture.
Then vote yes to my poll!

You should know better than to make thread title and poll ask opposite questions.
Given the poll answers, I don't think it matters in this case!  :D

Admiral Yi

Quote from: Monoriu on April 15, 2015, 07:36:09 PM
So it is not ok for a robot to pull the trigger but ok for a human to do so?  I don't get the reasoning.  Humans can lose their minds, go crazy, betray their own sides, kill people just for fun, panic, etc.  Why are humans inherently better than machines?

Because they don't frighten Jody Williams as much.

Valmy

Quote from: Monoriu on April 15, 2015, 07:53:22 PM
A human army can also be loyal to a tyrant. 

Of course it can. But why would it? Because the tyrant has to placate it and protect its interests. A human army checks and limits a tyrant's power because the tyrant has to work to keep it loyal. This would not be a problem with a robot army.

Quote
I think it is not rational to require that killer robots be perfect before we adopt them.  They only need to be better than humans.

If they are better than humans, then whoever controls the robot army would be our absolute master. Is that a rational idea?


MadImmortalMan

Then perhaps it should be Languish that builds the killbot army. He who controls the killbots can never be a slave.

Valmy

Quote from: MadImmortalMan on April 15, 2015, 08:22:16 PM
Then perhaps it should be Languish that builds the killbot army. He who controls the killbots can never be a slave.

True!

Hey, I am not saying it will work out that way, but man, history shows bad shit happens once the rulers are no longer dependent on the people, like in those petro-states.

Monoriu

Quote from: Valmy on April 15, 2015, 08:14:24 PM
Quote from: Monoriu on April 15, 2015, 07:53:22 PM
A human army can also be loyal to a tyrant. 

Of course it can. But why would it? Because the tyrant has to placate it and protect its interests. A human army checks and limits a tyrant's power because the tyrant has to work to keep it loyal. This would not be a problem with a robot army.

Quote
I think it is not rational to require that killer robots be perfect before we adopt them.  They only need to be better than humans.

If they are better than humans, then whoever controls the robot army would be our absolute master. Is that a rational idea?

A robot army doesn't exist in a vacuum.  I imagine it also requires power, ammunition, maintenance, upgrades, replacements, parts, etc.  Humans are still in the picture.  We are not talking about Skynet.  Yet  :menace: