Poll
Question:
Should we build armies of autonomous killbots?
Option 1: Yes, it worked out fine for the Twelve Colonies of Kobol
votes: 12
Option 2: No, look how bad Skynet turned out
votes: 5
Option 3: We just need to program the killbots with a kill limit
votes: 6
http://www.nbcnews.com/tech/tech-news/jody-williams-helped-ban-landmines-can-she-stop-killer-robots-n340661
QuoteJody Williams Helped Ban Landmines. Can She Stop Killer Robots?
By Keith Wagstaff
Jody Williams is on a mission to stop killer robots. The Nobel Peace Prize winner wants an international treaty forbidding machines that can target and kill human beings without requiring a person to pull the trigger.
This week in Geneva, she is part of a group meeting with United Nations delegates who are trying to answer the question, "Do nations regulate killer robots when they arrive or ban them before they can do any damage?"
Sound like science fiction? Drones like those used by the United States already have the capability to shoot Hellfire missiles from the sky. But they are still controlled by a soldier in front of a screen. It's very possible that humans could be removed from that equation, a prospect that worries Williams, who jointly won the Nobel Peace Prize in 1997 for leading the International Campaign to Ban Landmines.
"I grew up during the Cold War," Williams told NBC News from Geneva. "The only thing I wanted my family to have was a bomb shelter, just in case the Soviet Union attacked us."
"I find the very idea of killer robots more terrifying than nukes. Where is humanity going if some people think it's OK to cede the power of life and death of humans over to a machine?"
Williams hopes that she and the Campaign to Stop Killer Robots, which she helped found in 2013, can persuade members of the U.N. to sign an international treaty banning the use of lethal autonomous weapons.
Right now, the U.N. is simply laying the groundwork for more official proceedings, which could eventually lead to a ban. It's not clear, however, if technology will wait for lawmakers to catch up.
Beyond science fiction
There are more than 80 countries with military robotics programs, according to P. W. Singer, senior fellow at the New America Foundation and author of "Wired for War: The Robotics Revolution and Conflict in the 21st Century." The White House recently announced that the U.S. would sell drones to allied nations.
So while "Terminator"-style robots aren't wandering around battlefields yet, that doesn't mean countries aren't trying to develop them.
Samsung Techwin's SGR-A1 robots in South Korea have the ability to autonomously fire on people walking across the DMZ. The Navy's X-47B stealth drones have not been programmed to target humans, but they're smart enough to autonomously take off and land on the deck of an aircraft carrier. Israel Aerospace Industries' Harpy drones can hunt down and crash into radar installations without human intervention.
There are even cars that have taken cross-country road trips without someone behind the wheel. The momentum, in the private and military sectors, seems to be toward autonomy.
Experts interviewed for this story were hesitant to put an exact date on when we might see killer robots on the battlefield. Many people believe, however, that it could be very soon. On Monday in Geneva, computer scientist Stuart Russell made his own prediction.
Others agree that this is an issue that we could be dealing with in the near future.
"It's hard for me to imagine an Air Force officer being driven to Creech Air Force Base in Nevada in his Google self-driving car," Singer said, "and then getting out to use a remotely operated drone."
From landmines to killer robots
When Williams helped launch the International Campaign to Ban Landmines in 1992, it wasn't hard to find people who had been affected by them. A Human Rights Watch report from the following year estimated that at least one in 236 people in Cambodia had lost a limb to a landmine.
Hundreds of thousands of people around the world had been killed by them, including U.S. soldiers, whom Williams represented while working for the Vietnam Veterans of America Foundation. (The U.S. is not one of the 162 countries that have joined the Ottawa Convention banning anti-personnel landmines.)
Landmine explosions provide a lot of visceral stories to sway hearts and minds. Lethal autonomous weapons, on the other hand, have not killed anybody. Yet, in her opinion, they pose a much greater threat.
"A landmine sits there and if someone steps on it, it blows up," she said. "The landmine isn't out targeting and killing people."
So how do you convince people that something that doesn't exist yet is worth banning? She points to the civilian casualties incurred by drone strikes, including the 2013 incident in Yemen where witnesses say a drone killed 12 people who were attending a wedding party.
The Campaign to Stop Killer Robots — which includes members from 54 non-governmental organizations such as Pax Christi International and Amnesty International — claims that no technological safeguard will be enough to guarantee a robot won't kill civilians, whether because of technical errors or indiscriminate algorithms. And that assumes the robots aren't used by terror groups or nation-states against their own citizens.
Then there is the question of who is legally liable when a drone kills somebody. Is it the programmer? The military commander in charge of its upkeep? The country that purchased it?
Not everybody thinks that banning lethal autonomous weapons is the answer.
"There are very serious dangers to the proliferation of this technology," Matthew Waxman, a professor at Columbia Law School, told NBC News. "I'm just not persuaded that a blanket prohibition is the right approach."
He believes that robots could potentially make warfare safer for civilians. Facial recognition software, advanced targeting systems, non-lethal projectiles and other technological advances could lead to fewer deaths, he said.
There are many programs, like missile defense systems, which already have a large degree of autonomy when it comes to targeting and firing. Defining what counts as a "lethal autonomous weapon" would be difficult, he said, and a blanket ban could end up stifling technologies that could save lives.
Williams, however, thinks the issue is much more black and white.
"We have nothing against robotics," she said. "We are against machines that — on their own — can target and kill human beings. That's a pretty clear line."
The road ahead
As in the early days of the International Campaign to Ban Landmines, the Campaign to Stop Killer Robots is a hodgepodge of non-profit organizations with no shared budget.
Williams said the group needs to do "some serious fundraising" to put more pressure on countries to ban lethal autonomous weapons.
In Geneva, Elizabeth Quintana of the U.K. military think tank RUSI has already called a ban premature, echoing the opinion of the British government. Many countries are taking a wait-and-see approach.
Preemptive bans are relatively rare, but not unprecedented. Laser weapons designed to permanently blind people were added to the Convention on Certain Conventional Weapons (CCW) in 1995 despite the fact that they were never seen in combat.
Williams hopes that this week's meeting will lead to formal talks next year and eventually the addition of lethal autonomous weapons to the CCW.
"If we come out of this and don't see forward momentum, then we are going to have to rethink our strategy," she said. The landmine ban was created after the Canadian government, frustrated by the lack of progress by the U.N., invited countries to Ottawa to hammer out the text of an international treaty. Something similar, Williams said, could happen with killer robots.
For now, she bristles at the idea that lethal autonomous weapons will sooner or later show up in combat, even with the slow pace of politics and the much faster speed of technology.
"People keep saying that it's inevitable," she said. "Nothing is inevitable. It's only inevitable if you sit on your butt and don't take action to stop things you think are morally and ethically wrong."
First published April 15th 2015, 11:24 pm
No. In fact we should encourage their manufacture.
Quote from: Darth Wagtaros on April 15, 2015, 07:20:10 PM
No. In fact we should encourage their manufacture.
Then vote yes to my poll!
So it is not ok for a robot to pull the trigger but ok for a human to do so? I don't get the reasoning. Humans can lose their minds, go crazy, betray their own sides, kill people just for fun, panic, etc. Why are humans inherently better than machines?
Quote from: Monoriu on April 15, 2015, 07:36:09 PM
So it is not ok for a robot to pull the trigger but ok for a human to do so? I don't get the reasoning. Humans can lose their minds, go crazy, betray their own sides, kill people just for fun, panic, etc. Why are humans inherently better than machines?
Because humans will have at least some loyalty to their home populations and have to be catered to a bit. A tyrant could rule with an army of killer robots with no checks on his or her power, since any need to fear the general population or worry about the loyalty of the army would be nil. That can't be good.
Everything will have negative externalities. Even killbots. Just part of the process.
Quote from: Valmy on April 15, 2015, 07:41:01 PM
Quote from: Monoriu on April 15, 2015, 07:36:09 PM
So it is not ok for a robot to pull the trigger but ok for a human to do so? I don't get the reasoning. Humans can lose their minds, go crazy, betray their own sides, kill people just for fun, panic, etc. Why are humans inherently better than machines?
Because humans will have at least some loyalty to their home populations and have to be catered to a bit. A tyrant could rule with an army of killer robots with no checks on his or her power, since any need to fear the general population or worry about the loyalty of the army would be nil. That can't be good.
A human army can also be loyal to a tyrant.
I think it is not rational to require that killer robots be perfect before we adopt them. They only need to be better than humans.
Quote from: jimmy olsen on April 15, 2015, 07:20:55 PM
Quote from: Darth Wagtaros on April 15, 2015, 07:20:10 PM
No. In fact we should encourage their manufacture.
Then vote yes to my poll!
You should know better than to make the thread title and the poll ask opposite questions.
Quote from: Peter Wiggin on April 15, 2015, 07:58:11 PM
Quote from: jimmy olsen on April 15, 2015, 07:20:55 PM
Quote from: Darth Wagtaros on April 15, 2015, 07:20:10 PM
No. In fact we should encourage their manufacture.
Then vote yes to my poll!
You should know better than to make the thread title and the poll ask opposite questions.
Given the poll answers, I don't think it matters in this case! :D
Quote from: Monoriu on April 15, 2015, 07:36:09 PM
So it is not ok for a robot to pull the trigger but ok for a human to do so? I don't get the reasoning. Humans can lose their minds, go crazy, betray their own sides, kill people just for fun, panic, etc. Why are humans inherently better than machines?
Because they don't frighten Jody Williams as much.
Quote from: Monoriu on April 15, 2015, 07:53:22 PM
A human army can also be loyal to a tyrant.
Of course it can. But why would it? Because the tyrant has to placate it and protect its interests. A human army is a check on and limiter of a tyrant's power because the tyrant has to work to keep it loyal. This would not be a problem with a robot army.
QuoteI think it is not rational to require that killer robots be perfect before we adopt them. They only need to be better than humans.
If they are better than humans, then whoever controls the robot army would be our absolute master. Is that a rational idea?
Title changed to placate Wiggins. -_-
Then perhaps it should be Languish that builds the killbot army. He who controls the killbots can never be a slave.
Quote from: MadImmortalMan on April 15, 2015, 08:22:16 PM
Then perhaps it should be Languish that builds the killbot army. He who controls the killbots can never be a slave.
True!
Hey, I am not saying it will work out that way, but man, history shows bad shit happens once rulers are no longer dependent on the people, like in those petro-states.
Quote from: Valmy on April 15, 2015, 08:14:24 PM
Quote from: Monoriu on April 15, 2015, 07:53:22 PM
A human army can also be loyal to a tyrant.
Of course it can. But why would it? Because the tyrant has to placate it and protect its interests. A human army is a check on and limiter of a tyrant's power because the tyrant has to work to keep it loyal. This would not be a problem with a robot army.
QuoteI think it is not rational to require that killer robots be perfect before we adopt them. They only need to be better than humans.
If they are better than humans, then whoever controls the robot army would be our absolute master. Is that a rational idea?
A robot army doesn't exist in a vacuum. I imagine it also requires power, ammunition, maintenance, upgrades, replacements, parts, etc. Humans are still in the picture. We are not talking about Skynet. Yet :menace:
Quote from: Monoriu on April 15, 2015, 08:24:46 PM
Quote from: Valmy on April 15, 2015, 08:14:24 PM
Quote from: Monoriu on April 15, 2015, 07:53:22 PM
A human army can also be loyal to a tyrant.
Of course it can. But why would it? Because the tyrant has to placate it and protect its interests. A human army is a check on and limiter of a tyrant's power because the tyrant has to work to keep it loyal. This would not be a problem with a robot army.
QuoteI think it is not rational to require that killer robots be perfect before we adopt them. They only need to be better than humans.
If they are better than humans, then whoever controls the robot army would be our absolute master. Is that a rational idea?
A robot army doesn't exist in a vacuum. I imagine it also requires power, ammunition, maintenance, upgrades, replacements, parts, etc. Humans are still in the picture. We are not talking about Skynet. Yet :menace:
Which people would be forced to provide at robot gunpoint. At least until they could be replaced by repair bots.
Quote from: Admiral Yi on April 15, 2015, 08:11:59 PM
Quote from: Monoriu on April 15, 2015, 07:36:09 PM
So it is not ok for a robot to pull the trigger but ok for a human to do so? I don't get the reasoning. Humans can lose their minds, go crazy, betray their own sides, kill people just for fun, panic, etc. Why are humans inherently better than machines?
Because they don't frighten Jody Williams as much.
I'm still angry that we can no longer use landmines. They're fantastic defensive and offensive weapons. If you don't want to deal with landmines, don't wage war.
It's the same goofy, emotional reasoning that led to the banning of chemical weapons, whilst burning people to death was a-ok.
Quote from: Ideologue on April 15, 2015, 08:40:54 PM
Quote from: Admiral Yi on April 15, 2015, 08:11:59 PM
Quote from: Monoriu on April 15, 2015, 07:36:09 PM
So it is not ok for a robot to pull the trigger but ok for a human to do so? I don't get the reasoning. Humans can lose their minds, go crazy, betray their own sides, kill people just for fun, panic, etc. Why are humans inherently better than machines?
Because they don't frighten Jody Williams as much.
I'm still angry that we can no longer use landmines. They're fantastic defensive and offensive weapons. If you don't want to deal with landmines, don't wage war.
I thought the US never signed that treaty?
Quote from: jimmy olsen on April 15, 2015, 08:41:47 PM
Quote from: Ideologue on April 15, 2015, 08:40:54 PM
Quote from: Admiral Yi on April 15, 2015, 08:11:59 PM
Quote from: Monoriu on April 15, 2015, 07:36:09 PM
So it is not ok for a robot to pull the trigger but ok for a human to do so? I don't get the reasoning. Humans can lose their minds, go crazy, betray their own sides, kill people just for fun, panic, etc. Why are humans inherently better than machines?
Because they don't frighten Jody Williams as much.
I'm still angry that we can no longer use landmines. They're fantastic defensive and offensive weapons. If you don't want to deal with landmines, don't wage war.
I thought the US never signed that treaty?
Iirc, that's true, but we also still don't use avowed landmines as area denial weapons. Am I way off base?
Quote from: Ideologue on April 15, 2015, 08:42:28 PM
Iirc, that's true, but we also still don't use avowed landmines as area denial weapons. Am I way off base?
If this means what I think it means, no. The Korean DMZ is heavily mined.
Quote from: Admiral Yi on April 15, 2015, 08:44:16 PM
Quote from: Ideologue on April 15, 2015, 08:42:28 PM
Iirc, that's true, but we also still don't use avowed landmines as area denial weapons. Am I way off base?
If this means what I think it means, no. The Korean DMZ is heavily mined.
I don't believe that US soldiers man a stretch of the line anymore. They're in bases further back from which they can reinforce or counterattack as appropriate. So those mines, even if planted by the US years ago, are Korean mines now, not American.
Quote from: jimmy olsen on April 15, 2015, 08:21:18 PM
Title changed to placate Wiggins. -_-
What of the other Baker Street Irregulars though? :(
Quote from: Caliga on April 15, 2015, 09:00:51 PM
Quote from: jimmy olsen on April 15, 2015, 08:21:18 PM
Title changed to placate Wiggins. -_-
What of the other Baker Street Irregulars though? :(
Is this a Sherlock Holmes reference? :unsure:
I read a crappy near-future sci-fi novel series on this subject when I was a youth, called "Warbots". In its history, they used killbots (though it'd be one human controlling several), but they sucked, so sophisticated nations started augmenting those units with human soldiers again. Though as I recall, it was more of a move toward automated bots with human companions.
They also used submarine aircraft carriers. :nerd:
This was it:
http://jayfort.hubpages.com/hub/The-Warbots-Series-by-G-Harry-Stine
Neil's worst nightmare.
If you criminalize mass-murdering killbots, only criminals will have mass-murdering killbots.
Yes, the movies have shown us the way; it would be glorious, it would bring peace. Ultron said it best in his upcoming movie: "There's only one path to peace... their extinction."
This is the stupidest thing I have ever heard.
Every weapon system ever banned has been built sooner or later. Robots and droids are very safe, with built-in safety measures to disable them in the unlikely case of malfunction or hack. This guy has no idea how the military works. Droids are not going to be released to kill "everything in sight/sensor." That's stupid. The military does not order or carry out a mission without an operational order, mission-specific Rules of Engagement, MLECOA and MDECOA analysis (most likely and most deadly enemy courses of action), etc., etc., etc.
We don't go out to any given place just because we want to. There is a mission-generation process based on collected intel, both SIGINT and HUMINT, and then the mission packet is submitted by Operations to the Commander, who makes the final decision on whether it is a go or not, based on all his staff's recommendations.
When a unit goes out on a mission, everything has been planned and all supports (drones, fixed- and rotary-wing aircraft, artillery, mortars) and supplies have been coordinated, including the QRF (quick reaction force), which is another unit that will come out to help if the shit hits the fan.
What I am trying to convey to you is that ground combat drones will be integrated into modern warfare within this frame and concept of combat operations. The decision-making process will remain the same, and as today, commanders will be ultimately responsible for whatever happens in their missions.
This idea that nobody is responsible if a smart drone is part of a friendly-fire or war-crime incident is completely bogus, a misunderstanding of how the military operates and how combat operations are executed, perpetuated by "activists" without a clue.
The other argument that drives me crazy is the activists' claim that it is immoral to use droids in combat because droids don't have emotions. They don't realize this is precisely why droids are so much better than humans in combat. Droids don't have feelings; they don't have a bad day, or debts, or girlfriends back home while they deploy. Droids are not political, or suicidal. They are the perfect technology to avoid risking human lives in combat unnecessarily. Counterterrorist operations will be changed forever.
And regardless, humans will still be going into combat alongside the droids for the foreseeable future, because the decisions ultimately belong to the commander on the ground.
Not building our droid armies means wasting blood and treasure unnecessarily.
Quote from: MadImmortalMan on April 15, 2015, 08:22:16 PM
Then perhaps it should be Languish that builds the killbot army. He who controls the killbots can never be a slave.
And the use of killer robots would require a unanimous decision in a Languish poll. Which means they will never be used.
We would be like ancient guardians of Shangri La.
Quote from: Martinus on April 16, 2015, 02:46:50 PM
Quote from: MadImmortalMan on April 15, 2015, 08:22:16 PM
Then perhaps it should be Languish that builds the killbot army. He who controls the killbots can never be a slave.
And the use of killer robots would require a unanimous decision in a Languish poll. Which means they will never be used.
Nonsense. Include a Jaron option (Jaron would use them) and you'd get unanimity.
I'm fine with killbots as long as they're also sexbots.
Quote from: The Brain on April 18, 2015, 01:26:52 AM
I'm fine with killbots as long as they're also sexbots.
What if they're programmed with AI based on Battlefield players, so that they kill you first and then rape you?
Quote from: grumbler on April 17, 2015, 07:58:58 PM
Quote from: Martinus on April 16, 2015, 02:46:50 PM
Quote from: MadImmortalMan on April 15, 2015, 08:22:16 PM
Then perhaps it should be Languish that builds the killbot army. He who controls the killbots can never be a slave.
And the use of killer robots would require a unanimous decision in a Languish poll. Which means they will never be used.
Nonsense. Include a Jaron option (Jaron would use them) and you'd get unanimity.
Hear, hear!
Quote from: Malthus on April 16, 2015, 07:40:32 AM
If you criminalize mass-murdering killbots, only criminals will have mass-murdering killbots.
You're absolutely right, thankfully the 2nd amendment protects the right of private citizens to build their own killbots. :smarty:
http://www.huffingtonpost.com/entry/drone-gun-connecticut_55af1016e4b0a9b94852f95b?ncid=fcbklnkushpmg00000052
QuoteConnecticut Teen Made A Drone That Fires A Semi-Automatic Handgun -- And Police Say It's Legal
"At this point, we can't find anything that's been violated."
A Connecticut teen has added a handgun to a drone, creating a weapon that can be fired remotely -- and police say it appears to be perfectly legal.
Austin Haughwout, 18, posted a clip online showing the drone firing four shots, which triggered a police investigation.
But police in Haughwout's hometown of Clinton say they can't find anything to charge him with.
"It would seem to the average person, there should be something prohibiting a person from attaching a weapon to a drone," Clinton Police Chief Todd Lawrie said in a statement cited by WTNH, the ABC affiliate in New Haven. "At this point, we can't find anything that's been violated."
At least one law enforcement expert believes reckless conduct charges could apply in this case.
"What if the drone gets beyond the distance of the radio control?" Tom Fuentes, a former assistant director of the FBI, told CNN Wire. "Do we want drones out of control that could land who knows where? We could have a child pick up the drone, pick up the gun, and accidentally kill themselves."
Brett Haughwout, who is the teen's father, told WFSB, the CBS station in Hartford, that his son made the drone-weapon with help from his professor at Central Connecticut State University.
However, a professor at the university disputed that, telling the Hartford Courant that it was a "terrible idea."
"I discouraged him," Edward Moore, an assistant professor who teaches a class called Manufacturing Engineering Processes, told the paper. "I tried to give him the same advice I would give my kids."
Haughwout's father told NBC Connecticut that the gun belongs to him and said his son did "extensive research" to ensure it didn't break any laws.
"Homemade multirotor with a semi-automatic handgun mounted on it," the description on YouTube reads. "Note: The length from the muzzle to the rear of the frame is over 26"."
The reference to the length may be an attempt to comply with federal law governing the overall length of rifles, should the gun as modified be considered one.
While local police say there may not be a law against the drone-gun, Haughwout may not be in the clear just yet. CNET reports that the FAA is also investigating.
"The FAA will investigate the operation of an unmanned aircraft system in a Connecticut park to determine if any Federal Aviation Regulations were violated," the agency told the website. "The FAA will also work with its law enforcement partners to determine if there were any violations of criminal statutes."
That comes as welcome news to at least one drone advocate.
"Drones should be used for good, not for evil," Peter Sachs, who is an attorney, told ABC News. "There are countless ways that drones can be useful. Using one as a remote-controlled weapon is not one of them, and I question the judgment of anyone who would attempt to do so."
Haughwout made headlines last year when he was allegedly assaulted by a woman who claimed he was using a drone to photograph her.
However, video showed that Haughwout was just taking aerial shots of the beach.
Andrea Mears, then 23, was charged with third-degree assault and breach of peace. However, the charges will be dropped if she completes two years of probation, according to the New York Post.
That thing's about as much a threat as the average person driving down the street.
Quote from: Maximus on July 22, 2015, 09:38:50 PM
That thing's about as much a threat as the average person driving down the street.
Pretty dangerous then! :lol: