Quote
U.S. assembling secret drone bases in Africa, Arabian Peninsula, officials say
By Craig Whitlock and Greg Miller, Published: September 20, Washpost.com
The Obama administration is assembling a constellation of secret drone bases for counterterrorism operations in the Horn of Africa and the Arabian Peninsula as part of a newly aggressive campaign to attack al-Qaeda affiliates in Somalia and Yemen, U.S. officials said.
One of the installations is being established in Ethiopia, a U.S. ally in the fight against al-Shabab, the Somali militant group that controls much of Somalia. Another base is in the Seychelles, an archipelago in the Indian Ocean, where a small fleet of "hunter-killer" drones resumed operations this month after an experimental mission demonstrated that the unmanned aircraft could effectively patrol Somalia from there.
The U.S. military also has flown drones over Somalia and Yemen from bases in Djibouti, a tiny African nation at the junction of the Red Sea and the Gulf of Aden. In addition, the CIA is building a secret airstrip in the Arabian Peninsula so it can deploy armed drones over Yemen.
The rapid expansion of the undeclared drone wars is a reflection of the growing alarm with which U.S. officials view the activities of al-Qaeda affiliates in Yemen and Somalia, even as al-Qaeda's core leadership in Pakistan has been weakened by U.S. counterterrorism operations.
The U.S. government is known to have used drones to carry out lethal attacks in at least six countries: Afghanistan, Iraq, Libya, Pakistan, Somalia and Yemen. The negotiations that preceded the establishment of the base in the Republic of Seychelles illustrate the efforts the United States is making to broaden the range of its drone weapons.
The island nation of 85,000 people has hosted a small fleet of MQ-9 Reaper drones operated by the U.S. Navy and Air Force since September 2009. U.S. and Seychellois officials have previously acknowledged the drones' presence but have said that their primary mission was to track pirates in regional waters. But classified U.S. diplomatic cables show that the unmanned aircraft have also conducted counterterrorism missions over Somalia, about 800 miles to the northwest.
The cables, obtained by the anti-secrecy group WikiLeaks, reveal that U.S. officials asked leaders in the Seychelles to keep the counterterrorism missions secret. The Reapers are described by the military as "hunter-killer" drones because they can be equipped with Hellfire missiles and satellite-guided bombs.
To allay concerns among islanders, U.S. officials said they had no plans to arm the Reapers when the mission was announced two years ago. The cables show, however, that U.S. officials were thinking about weaponizing the drones.
During a meeting with Seychelles President James Michel on Sept. 18, 2009, American diplomats said the U.S. government "would seek discrete [sic], specific discussions . . . to gain approval" to arm the Reapers "should the desire to do so ever arise," according to a cable summarizing the meeting. Michel concurred, but asked U.S. officials to approach him exclusively for permission "and not anyone else" in his government, the cable reported.
Michel's chief deputy told a U.S. diplomat on a separate occasion that the Seychelles president "was not philosophically against" arming the drones, according to another cable. But the deputy urged the Americans "to be extremely careful in raising the issue with anyone in the Government outside of the President. Such a request would be 'politically extremely sensitive' and would have to be handled with 'the utmost discreet care.' "
A U.S. military spokesman declined to say whether the Reapers in the Seychelles have ever been armed.
"Because of operational security concerns, I can't get into specifics," said Lt. Cmdr. James D. Stockman, a public affairs officer for the U.S. Africa Command, which oversees the base in the Seychelles. He noted, however, that the MQ-9 Reapers "can be configured for both surveillance and strike."
A spokeswoman for Michel said the president was unavailable for comment.
Jean-Paul Adam, who was Michel's chief deputy in 2009 and now serves as minister of foreign affairs, said U.S. officials had not asked for permission to equip the drones with missiles or bombs.
"The operation of the drones in Seychelles for the purposes of counter-piracy surveillance and other related activities has always been unarmed, and the U.S. government has never asked us for them to be armed," Adam said in an e-mail. "This was agreed between the two governments at the first deployment and the situation has not changed."
The State Department cables show that U.S. officials were sensitive to perceptions that the drones might be armed, noting that they "do have equipment that could appear to the public as being weapons."
To dispel potential concerns, they held a "media day" for about 30 journalists and Seychellois officials at the small, one-runway airport in Victoria, the capital, in November 2009. One of the Reapers was parked on the tarmac.
"The government of Seychelles invited us here to fight against piracy, and that is its mission," Craig White, a U.S. diplomat, said during the event. "However, these aircraft have a great deal of capabilities and could be used for other missions."
In fact, U.S. officials had already outlined other purposes for the drones in a classified mission review with Michel and Adam. Saying that the U.S. government "desires to be completely transparent," the American diplomats informed the Seychellois leaders that the Reapers would also fly over Somalia "to support ongoing counter-terrorism efforts," though not "direct attacks," according to a cable summarizing the meeting.
U.S. officials "stressed the sensitive nature of this counter-terrorism mission and that this not be released outside of the highest . . . channels," the cable stated. "The President wholeheartedly concurred with that request, noting that such issues could be politically sensitive for him as well."
The Seychelles drone operation has a relatively small footprint. Based in a hangar located about a quarter-mile from the main passenger terminal at the airport, it includes between three and four Reapers and about 100 U.S. military personnel and contractors, according to the cables.
The military operated the flights on a continuous basis until April, when it paused the operations. They resumed this month, said Stockman, the Africa Command spokesman.
The aim in assembling a constellation of bases in the Horn of Africa and the Arabian Peninsula is to create overlapping circles of surveillance in a region where al-Qaeda offshoots could emerge for years to come, U.S. officials said.
The locations "are based on potential target sets," said a senior U.S. military official. "If you look at it geographically, it makes sense — you get out a ruler and draw the distances [drones] can fly and where they take off from."
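The official's ruler-and-distances point can be made concrete with a little great-circle arithmetic. A minimal sketch follows; the coordinates and the ~1,150-mile combat radius are rough public figures chosen for illustration, not operational data:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in statute miles."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Nominal one-way MQ-9 combat radius; an assumed round number, not a spec sheet.
REAPER_RADIUS_MILES = 1150

bases = {
    "Victoria, Seychelles": (-4.67, 55.52),
    "Djibouti": (11.55, 43.15),
}
target_name, target_pos = "Mogadishu, Somalia", (2.04, 45.32)

for name, (blat, blon) in bases.items():
    d = haversine_miles(blat, blon, *target_pos)
    print(f"{name} -> {target_name}: {d:.0f} mi, in range: {d <= REAPER_RADIUS_MILES}")
```

The Seychelles-to-Mogadishu figure comes out near the roughly 800 miles the article cites, which is the "overlapping circles" logic in miniature.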
One U.S. official said that there had been discussions about putting a drone base in Ethiopia for as long as four years, but that plan was delayed because "the Ethiopians were not all that jazzed." Other officials said Ethiopia has become a valued counterterrorism partner because of threats posed by al-Shabab.
"We have a lot of interesting cooperation and arrangements with the Ethiopians when it comes to intelligence collection and linguistic capabilities," said a former senior U.S. military official familiar with special operations missions in the region.
An Ethiopian Embassy spokesman in Washington could not be reached for comment Tuesday night.
The former official said the United States relies on Ethiopian linguists to translate signals intercepts gathered by U.S. agencies monitoring calls and e-mails of al-Shabab members. The CIA and other agencies also employ Ethiopian informants who gather information from across the border.
Overall, officials said, the cluster of bases reflects an effort to have wider geographic coverage, greater leverage with countries in the region and backup facilities if individual airstrips are forced to close.
"It's a conscious recognition that those are the hot spots developing right now," said the former senior U.S. military official.
Companion article.
Quote
A future for drones: Automated killing
By Peter Finn, Published: September 19
One afternoon last fall at Fort Benning, Ga., two model-size planes took off, climbed to 800 and 1,000 feet, and began criss-crossing the military base in search of an orange, green and blue tarp.
The automated, unpiloted planes worked on their own, with no human guidance, no hand on any control.
After 20 minutes, one of the aircraft, carrying a computer that processed images from an onboard camera, zeroed in on the tarp and contacted the second plane, which flew nearby and used its own sensors to examine the colorful object. Then one of the aircraft signaled to an unmanned car on the ground so it could take a final, close-up look.
Target confirmed.
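The cross-cueing pattern in the demonstration — one aircraft detects, a second independently re-checks, a ground robot takes the final close-up look — can be sketched as a toy consensus rule. Everything here (platform names, scores, threshold) is invented for illustration; this is not the GTRI software:

```python
def confirm_target(detections, threshold=0.8):
    """Declare a target confirmed only if three independent platforms agree.

    detections maps platform name -> confidence score in [0, 1].
    """
    if len(detections) < 3:  # air asset 1, air asset 2, ground robot
        return False
    return all(score >= threshold for score in detections.values())

sightings = {
    "uav_1_camera": 0.93,   # initial detection of the tarp
    "uav_2_sensor": 0.88,   # second aircraft re-examines the object
    "ground_robot": 0.85,   # final close-up look
}
print("Target confirmed:", confirm_target(sightings))  # Target confirmed: True
```

The point of requiring agreement from multiple independent sensors is to reduce single-sensor false positives — the core of the exercise described above.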
This successful exercise in autonomous robotics could presage the future of the American way of war: a day when drones hunt, identify and kill the enemy based on calculations made by software, not decisions made by humans. Imagine aerial "Terminators," minus beefcake and time travel.
The Fort Benning tarp "is a rather simple target, but think of it as a surrogate," said Charles E. Pippin, a scientist at the Georgia Tech Research Institute, which developed the software to run the demonstration. "You can imagine real-time scenarios where you have 10 of these things up in the air and something is happening on the ground and you don't have time for a human to say, 'I need you to do these tasks.' It needs to happen faster than that."
The demonstration laid the groundwork for scientific advances that would allow drones to search for a human target and then make an identification based on facial-recognition or other software. Once a match was made, a drone could launch a missile to kill the target.
Military systems with some degree of autonomy — such as robotic, weaponized sentries — have been deployed in the demilitarized zone between South and North Korea and other potential battle areas. Researchers are uncertain how soon machines capable of collaborating and adapting intelligently in battlefield conditions will come online. It could take one or two decades, or longer. The U.S. military is funding numerous research projects on autonomy to develop machines that will perform some dull or dangerous tasks and to maintain its advantage over potential adversaries who are also working on such systems.
The killing of terrorism suspects and insurgents by armed drones, controlled by pilots sitting in bases thousands of miles away in the western United States, has prompted criticism that the technology makes war too antiseptic. Questions also have been raised about the legality of drone strikes when employed in places such as Pakistan, Yemen and Somalia, which are not at war with the United States. This debate will only intensify as technological advances enable what experts call lethal autonomy.
The prospect of machines able to perceive, reason and act in unscripted environments presents a challenge to the current understanding of international humanitarian law. The Geneva Conventions require belligerents to use discrimination and proportionality, standards that would demand that machines distinguish among enemy combatants, surrendering troops and civilians.
"The deployment of such systems would reflect a paradigm shift and a major qualitative change in the conduct of hostilities," Jakob Kellenberger, president of the International Committee of the Red Cross, said at a conference in Italy this month. "It would also raise a range of fundamental legal, ethical and societal issues, which need to be considered before such systems are developed or deployed."
Drones flying over Afghanistan, Pakistan and Yemen can already move automatically from point to point, and it is unclear what surveillance or other tasks, if any, they perform while in autonomous mode. Even when directly linked to human operators, these machines are producing so much data that processors are sifting the material to suggest targets, or at least objects of interest. That trend toward greater autonomy will only increase as the U.S. military shifts from one pilot remotely flying a drone to one pilot remotely managing several drones at once.
But humans still make the decision to fire, and in the case of CIA strikes in Pakistan, that call rests with the director of the agency. In future operations, if drones are deployed against a sophisticated enemy, there may be much less time for deliberation and a greater need for machines that can function on their own.
The U.S. military has begun to grapple with the implications of emerging technologies.
"Authorizing a machine to make lethal combat decisions is contingent upon political and military leaders resolving legal and ethical questions," according to an Air Force treatise called Unmanned Aircraft Systems Flight Plan 2009-2047. "These include the appropriateness of machines having this ability, under what circumstances it should be employed, where responsibility for mistakes lies and what limitations should be placed upon the autonomy of such systems."
In the future, micro-drones will reconnoiter tunnels and buildings, robotic mules will haul equipment and mobile systems will retrieve the wounded while under fire. Technology will save lives. But the trajectory of military research has led to calls for an arms-control regime to forestall any possibility that autonomous systems could target humans.
In Berlin last year, a group of robotic engineers, philosophers and human rights activists formed the International Committee for Robot Arms Control (ICRAC) and said such technologies might tempt policymakers to think war can be less bloody.
Some experts also worry that hostile states or terrorist organizations could hack robotic systems and redirect them. Malfunctions also are a problem: In South Africa in 2007, a semiautonomous cannon fatally shot nine friendly soldiers.
The ICRAC would like to see an international treaty, such as the one banning antipersonnel mines, that would outlaw some autonomous lethal machines. Such an agreement could still allow automated antimissile systems.
"The question is whether systems are capable of discrimination," said Peter Asaro, a founder of the ICRAC and a professor at the New School in New York who teaches a course on digital war. "The good technology is far off, but technology that doesn't work well is already out there. The worry is that these systems are going to be pushed out too soon, and they make a lot of mistakes, and those mistakes are going to be atrocities."
Research into autonomy, some of it classified, is racing ahead at universities and research centers in the United States, and that effort is beginning to be replicated in other countries, particularly China.
"Lethal autonomy is inevitable," said Ronald C. Arkin, the author of "Governing Lethal Behavior in Autonomous Robots," a study that was funded by the Army Research Office.
Arkin believes it is possible to build ethical military drones and robots, capable of using deadly force while programmed to adhere to international humanitarian law and the rules of engagement. He said software can be created that would lead machines to return fire with proportionality, minimize collateral damage, recognize surrender, and, in the case of uncertainty, maneuver to reassess or wait for a human assessment.
In other words, rules as understood by humans can be converted into algorithms followed by machines for all kinds of actions on the battlefield.
"How a war-fighting unit may think — we are trying to make our systems behave like that," said Lora G. Weiss, chief scientist at the Georgia Tech Research Institute.
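Arkin's idea — encoding discrimination, proportionality, surrender recognition, and deferral under uncertainty as machine-checkable rules — can be sketched as a veto gate. The rule names, fields, and threshold below are hypothetical illustrations of the concept, not Arkin's actual design:

```python
from dataclasses import dataclass

@dataclass
class Situation:
    target_is_combatant: bool       # discrimination check
    surrender_signal: bool          # recognize surrender
    expected_civilian_harm: float   # proportionality input
    military_value: float           # proportionality input
    confidence: float               # certainty of the overall assessment

def governor_decision(s: Situation, min_confidence=0.9):
    """Permit engagement only if every encoded constraint passes."""
    if s.confidence < min_confidence:
        return "defer_to_human"     # in case of uncertainty, wait for a person
    if not s.target_is_combatant or s.surrender_signal:
        return "hold_fire"          # discrimination or surrender rule fails
    if s.expected_civilian_harm > s.military_value:
        return "hold_fire"          # proportionality rule fails
    return "engage_permitted"
```

Note that the hard part is not this gate but producing trustworthy inputs to it — which is exactly the capability skeptics in the article doubt.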
Others, however, remain skeptical that humans can be taken out of the loop.
"Autonomy is really the Achilles' heel of robotics," said Johann Borenstein, head of the Mobile Robotics Lab at the University of Michigan. "There is a lot of work being done, and still we haven't gotten to a point where the smallest amount of autonomy is being used in the military field. All robots in the military are remote-controlled. How does that sit with the fact that autonomy has been worked on at universities and companies for well over 20 years?"
Borenstein said human skills will remain critical in battle far into the future.
"The foremost of all skills is common sense," he said. "Robots don't have common sense and won't have common sense in the next 50 years, or however long one might want to guess."
I've just written an article about the ethics of armed drones, including the increased use of AI. I could share but then I'd have to kill you. Suffice it to say the battlefield is too complex an environment to use AI effectively, and insurgents will find a way to be non-identifiable or to surround themselves with human shields.
"You teach it that a small human carrying a ball is an invalid target and the insurgents will use small agents with ball-shaped bombs."
I also got slapped down by a Wing Commander yesterday for using the term "drones". The correct term is "remotely piloted aircraft".
Quote from: Brazen on September 21, 2011, 06:24:10 AM
I also got slapped down by a Wing Commander yesterday for using the term "drones". The correct term is "remotely piloted aircraft".
:hmm:
Quote from: The Brain on September 21, 2011, 06:26:10 AM
:hmm:
Some surveillance drones can be fully automated, but armed drones need a "man-in-the-loop" with ultimate responsibility for launching a lethal attack.
Quote from: Brazen on September 21, 2011, 06:28:36 AM
Quote from: The Brain on September 21, 2011, 06:26:10 AM
:hmm:
Some surveillance drones can be fully automated, but armed drones need a "man-in-the-loop" with ultimate responsibility for launching a lethal attack.
Today.
Quote from: The Brain on September 21, 2011, 06:31:51 AM
Quote from: Brazen on September 21, 2011, 06:28:36 AM
Quote from: The Brain on September 21, 2011, 06:26:10 AM
:hmm:
Some surveillance drones can be fully automated, but armed drones need a "man-in-the-loop" with ultimate responsibility for launching a lethal attack.
Today.
Forever.
So, if Obama is King of Drones.. who is the Bee Queen?
V
Quote from: The Brain on September 21, 2011, 07:02:30 AM
Quote from: CountDeMoney on September 21, 2011, 06:49:53 AM
Quote from: The Brain on September 21, 2011, 06:31:51 AM
Quote from: Brazen on September 21, 2011, 06:28:36 AM
Quote from: The Brain on September 21, 2011, 06:26:10 AM
:hmm:
Some surveillance drones can be fully automated, but armed drones need a "man-in-the-loop" with ultimate responsibility for launching a lethal attack.
Today.
Forever.
Luddite.
Geneva Convention, so unlikely to change any time soon. The responsibility will always end up with a human, even if it's the commander who ordered "go to autokill mode".
Quote from: The Brain on September 21, 2011, 06:31:51 AM
Quote from: Brazen on September 21, 2011, 06:28:36 AM
Quote from: The Brain on September 21, 2011, 06:26:10 AM
:hmm:
Some surveillance drones can be fully automated, but armed drones need a "man-in-the-loop" with ultimate responsibility for launching a lethal attack.
Today.
If we don't have humans controlling the drones, how will we know whom to drag to the execution chamber when they're shot down? Should we just pick someone at random? :huh:
In a perfect world, drones would roam European airspace and every time a Euro begins posting a lecture on Americans or American laws on a message board, a hellfire flies through their window.
Sigh.
Quote from: Ed Anger on September 21, 2011, 09:46:32 AM
In a perfect world, drones would roam European airspace and every time a Euro begins posting a lecture on Americans or American laws on a message board, a hellfire flies through their window.
Sigh.
But us Canucks can still criticize you at will due to our more valuable dollar, right? :)
Quote from: Barrister on September 21, 2011, 09:48:10 AM
Quote from: Ed Anger on September 21, 2011, 09:46:32 AM
In a perfect world, drones would roam European airspace and every time a Euro begins posting a lecture on Americans or American laws on a message board, a hellfire flies through their window.
Sigh.
But us Canucks can still criticize you at will due to our valuable oil tar sands, right? :)
Fixed. Nobody cares about the Loonie.
Drones aren't cleared for most of European airspace, there are only dedicated corridors. Fortunately there was one straight out of Sicily and down to Libya.
Quote from: The Brain on September 21, 2011, 06:31:51 AM
Quote from: Brazen on September 21, 2011, 06:28:36 AM
Quote from: The Brain on September 21, 2011, 06:26:10 AM
:hmm:
Some surveillance drones can be fully automated, but armed drones need a "man-in-the-loop" with ultimate responsibility for launching a lethal attack.
Today.
Skynet sympathizer! :mad:
So when Bush sets up secret bases to fight terrorism in Africa, he is the greatest evil since Hitler, but when Obama does it, then that is OSSUM!
Figures!
Quote from: Brazen on September 21, 2011, 09:51:10 AM
Drones aren't cleared for most of European airspace, there are only dedicated corridors. Fortunately there was one straight out of Sicily and down to Libya.
:frusty:
Quote from: Ed Anger on September 21, 2011, 09:49:21 AM
Quote from: Barrister on September 21, 2011, 09:48:10 AM
Quote from: Ed Anger on September 21, 2011, 09:46:32 AM
In a perfect world, drones would roam European airspace and every time a Euro begins posting a lecture on Americans or American laws on a message board, a hellfire flies through their window.
Sigh.
But us Canucks can still criticize you at will due to our valuable oil tar sands, right? :)
Fixed. Nobody cares about the Loonie.
Amendment cheerfully accepted. :cool:
Quote from: Brazen on September 21, 2011, 06:24:10 AM
I've just written an article about the ethics of armed drones, including the increased use of AI. I could share but then I'd have to kill you. Suffice it to say the battlefield is too complex an environment to use AI effectively, and insurgents will find a way to be non-identifiable or to surround themselves with human shields.
"You teach it that a small human carrying a ball is an invalid target and the insurgents will use small agents with ball-shaped bombs."
I also got slapped down by a Wing Commander yesterday for using the term "drones". The correct term is "remotely piloted aircraft".
:lol: Yes, military people have their own jargon and stick to it - unless there's a faddish new term to exploit.
Quote from: Berkut on September 21, 2011, 09:56:21 AM
So when Bush sets up secret bases to fight terrorism in Africa, he is the greatest evil since Hitler, but when Obama does it, then that is OSSUM!
Sez who?
Not the article I mentioned, but a spare interview that arrived too late, with Robot Wars guru and fanatical opponent of AI in weapons, Dr Noel Sharkey.
Quote
COMMENT – Dr Noel Sharkey on the ethics of armed drones
Noel Sharkey is a Professor of Artificial Intelligence and Robotics at Sheffield University in the UK. He is a founding member of the International Committee for Robot Arms Control and speaks on the need for international discussions to limit the potential threat to humanity posed by increasingly autonomous armed robotic military systems.
Brazen: Tell us about your work with the International Committee for Robot Arms Control (ICRAC).
Noel Sharkey: ICRAC began at my house in Sheffield in September 2009 with a small international group meeting for a three-day discussion on the problem with armed drones, and with a particular concern for the move towards autonomous armed drones. One of our biggest concerns was that there was massive proliferation – I have traced 51 countries using the technology – and yet there was absolutely no international discussion about the future use of the weapons, how they will interact, and so on.
We had our first international workshop in Berlin on September 22, 2010. The committee joined in discussion with government officials, representatives of international human rights organisations, arms control experts, philosophers, scientists and engineers from a number of countries including the USA, UK, France, Germany, Austria, the Netherlands and Australia.
We are now planning our next meeting, which will probably be in New York.
B: What are your main concerns about the increasing use of armed drones?
NS: My biggest concern is the drive towards autonomous armed robots on the ground, in the air and at sea. In particular, I am worried about the use of lethal targeting. It has almost become a mantra now in places to say that there will always be a human somewhere in the loop. But where in the loop is very important. There should always be a person responsible for making lethal targeting decisions; a robot cannot be held accountable.
This concerns me for a number of reasons. The main one is the protection of civilians, innocents and children. There are no artificial intelligence (AI) systems that can discriminate between combatants and non-combatants or others with immunity. It is not just because current visual and sensing systems are not up to the job, though I cannot see them being so in the next 20 or 30 years. Even if the systems were up to the job, deciding who to kill and when it is appropriate to kill require reasoning and battlefield awareness that is well outside of what AI systems can do. And there is no evidence to suggest when, or if, they will ever be able to do this.
In other words, we are heading towards mobile indiscriminate weapons that can get anywhere on earth quickly. When all the major nations have them, and they will, there is no telling what the consequences will be of the different complex algorithms interacting. They will all be kept secret.
Other major concerns are that having such weapons may lower the bar for going to war; the lack of body-bag count lowers the risks, so wars may be triggered more easily. We can see the beginning of this with top Obama lawyer, Harold Koh. *
B: I understand from articles you have written that you believe using an armed drone is less ethical than a pilot shooting a target. Why is that?
NS: The question is put too simplistically.
There are a number of different issues here. First, the conventional forces use remotely piloted drones for air strikes in countries where the international forces are in conflict: Iraq and Afghanistan. This is not greatly different from using cruise missiles, except that their perceived accuracy allows expansion of the battlespace into urban areas. Despite the claims of great accuracy, many civilians are being killed.
Second, there are the CIA-targeted killings in Pakistan, Somalia and the Yemen. In these cases alleged insurgent leaders are killed along with anyone near them. Philip Alston, who was UN special rapporteur for extrajudicial killings, has said that the strikes are illegal under International Humanitarian Law (IHL) because there is no due process and no chance of a hearing or a trial. Other methods of capture are not attempted. There is no accountability and no rationale is given for the target selection.
Then there are the new autonomous drones. These should not be allowed to make decisions about lethal targeting as they are indiscriminate and could be disproportionate. They would be OK to use in a military-rich environment such as the WWII battlefields, but in the current type of urban conflict many civilians could die.
B: The use of armed drones is governed by international military convention. What do you think is missing from current guidelines that could be done better?
NS: I am not an international lawyer but there are no specific rules governing the use of drones. Current IHL appears to have the instruments necessary to control drone use, but there is some difficulty in mapping how drones are used against IHL dictates.
One of the problems is that there has been no international discussion between nation states about the massive proliferation, and I have tracked 51 countries using them. It seems OK now with Israel, the US and their allies using them in permissive air space. But China is now discussing selling armed drones, and when more sophisticated countries have their own, we might see some of the precedents being set up by the US come back to bite us.
B: You make a distinction between human-in-the-loop and human-on-the-loop. What's the difference?
NS: The idea of a human-on-the-loop means that the human will be in executive control overall to call in or call off the robots, rather than being in control of each individually. In other words, the robots will be essentially autonomous.
The USAF Flight Plan 2009–2047 says on page 39: "SWARM technology will allow multiple MQ-Mb aircraft to cooperatively operate in a variety of lethal and non-lethal missions at the command of a single pilot." Such a move would require decisions being made by the swarm – human decision-making would be too slow to manage several aircraft at once. With the increasing pace of the action, and with several aircraft potentially choosing targets at the same time, it will not be possible to have the human make every decision to kill.
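Sharkey's in-the-loop versus on-the-loop distinction can be made concrete with a toy control loop: in one mode a human must approve each engagement before it happens; in the other the swarm proceeds on its own and the human holds only a veto. All names here are invented for illustration:

```python
def run_swarm(engagement_requests, mode, human_approve, human_veto):
    """Toy contrast between per-engagement approval and executive veto."""
    executed = []
    for req in engagement_requests:
        if mode == "in_the_loop":
            if human_approve(req):      # blocking approval for each target
                executed.append(req)
        elif mode == "on_the_loop":
            if not human_veto(req):     # proceeds unless called off
                executed.append(req)
    return executed

requests = ["target_a", "target_b", "target_c"]
# A slow human only manages to approve the first request in time...
approvals = {"target_a"}
in_loop = run_swarm(requests, "in_the_loop", lambda r: r in approvals, lambda r: False)
# ...but on-the-loop, everything not explicitly vetoed goes through.
on_loop = run_swarm(requests, "on_the_loop", lambda r: False, lambda r: False)
print(in_loop)  # ['target_a']
print(on_loop)  # ['target_a', 'target_b', 'target_c']
```

The asymmetry in the outputs is Sharkey's point: once the tempo outruns the approver, "on the loop" means the machine is, in practice, deciding.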
B: Aside from the issues of lethality and the risk of hitting the wrong targets, do you think moral issues such as the pilots not putting themselves in danger or the fact the enemy seldom has access to equivalent technology are valid?
NS: This is a very difficult question. There have always been moral issues about distance targeting using artillery and carpet bombing; remote controlling a weapon at a distance is not much different. The moral questions arise in how the weapon is used. If being at a distance creates a moral buffer that makes the operator less careful about targeting, then there is a problem.
The real issue about being able to hover an aircraft for up to 26 hours, a duration which is increasing, without a person on board is how it changes the nature of warfare both in terms of the use of the drones and how the enemy is forced to fight back. Such extreme asymmetry will surely simply increase terrorist attacks and make the world a less secure place to live in.
* Koh argued strongly in a March 2010 speech for the legality of targeted killing by aerial drone strikes in Pakistan, Yemen and other countries included by the US government as being within the scope of the war on terror. (Wikipedia)
I can't really argue with much of his concerns, although I think he is committing the error of making assumptions about how things will be, then arguing against those things.
For example:
QuoteSuch extreme asymmetry will surely simply increase terrorist attacks and make the world a less secure place to live in.
Uhh, how can you possibly know that? I mean, I suppose that is a possible result, but it is one possibility among many, and there is no reason to presume that it is the likely outcome.
We should be talking about these issues though. Remotely controlled weapons are here to stay, and he is right that there are very few agreed-upon protocols for how to use them.
Quote from: Brazen on September 22, 2011, 11:07:17 AM
... do you think moral issues such as the pilots not putting themselves in danger or the fact the enemy seldom has access to equivalent technology are valid?
Short answer: No.
Long answer: Depends on your morality. ;)
The college I teach at is going to offer a UAV course.
So sleep well at night. The drones are gonna be flown by Community College grads.
Quote from: Ed Anger on September 22, 2011, 12:57:24 PM
The college I teach at is going to offer a UAV course.
So sleep well at night. The drones are gonna be flown by Community College grads.
Twitchy bastards have grown up on Call of Duty. They'll be fine.
Officer: You hit a school.
"pilot": AWESOME DUDE!
Quote from: citizen k on September 22, 2011, 12:54:24 PM
Quote from: Brazen on September 22, 2011, 11:07:17 AM... do you think moral issues such as the pilots not putting themselves in danger or the fact the enemy seldom has access to equivalent technology are valid?
Short answer: No.
Long answer: Depends on your morality. ;)
The guy brings up WWII as a military-rich environment. I think he should be reminded that during the war whole cities were blasted to rubble to hit just one target.
Whole cities were blasted to rubble in order to blast cities into rubble.
Quote from: AnchorClanker on September 22, 2011, 04:10:17 PM
Whole cities were blasted to rubble in order to blast cities into rubble.
Well that as well. Toward the end whole cities were blasted into rubble to give the air corps something to do.
Quote from: Ed Anger on September 22, 2011, 12:57:24 PM
The college I teach at is going to offer a UAV course.
So sleep well at night. The drones are gonna be flown by Community College grads.
Considering they're being flown by high school grads now, having an Intro to Western Civ under their belt should provide more well-roundedness to the CAP circles.
Unarmed models, but not the armed ones. You need an officer to have trigger authority.
Maybe I can parlay my AAS degree into some sweet, sweet trigger time.
Quote from: Ed Anger on September 22, 2011, 07:42:12 PM
Maybe I can parlay my AAS degree into some sweet, sweet trigger time.
:lol: Maybe. I've heard Army warrants are flying some as well... for the same reasons that they are flying the choppers.
Have commissioned officer status, cheaper than LTs. :lol:
I think the USAF allows NCOs to fly the unarmed versions, but if it's packin' a Hellfire, it's an officer's flight.
Euro: I just watched a documentary on channel 3 and you Americans and your....what is that noise? *boom*
Ed the headless Thompson gunner: DIE DIE DIE DIE DIE! Where are the powerups?
Quote from: CountDeMoney on September 22, 2011, 07:09:08 PM
Considering they're being flown by high school grads now, having an Intro to Western Civ under their belt should provide more well-roundedness to the CAP circles.
Western Civ is only for the officers.
Quote from: AnchorClanker on September 22, 2011, 07:45:56 PM
Quote from: Ed Anger on September 22, 2011, 07:42:12 PM
Maybe I can parlay my AAS degree into some sweet, sweet trigger time.
:lol: Maybe. I've heard Army warrants are flying some as well... for the same reasons that they are flying the choppers.
Have commissioned officer status, cheaper than LTs. :lol:
I think the USAF allows NCOs to fly the unarmed versions, but if it's packin' a Hellfire, it's an officer's flight.
As I understand it, the usual configuration is an officer pilot and enlisted sensor operator (thought about retraining into that job).
Quote from: AnchorClanker on September 22, 2011, 07:45:56 PM
:lol: Maybe. I've heard Army warrants are flying some as well... for the same reasons that they are flying the choppers.
Have commissioned officer status, cheaper than LTs. :lol:
I think the USAF allows NCOs to fly the unarmed versions, but if it's packin' a Hellfire, it's an officer's flight.
I thought the driver behind the Army putting warrants in the pilot's seat is that warrants don't have the pressure to move up the ranks, and thus out of the pilot's seat. The Army was losing quite a few chopper pilots who joined Army Aviation primarily because they wanted to fly, and as soon as they were forced to rotate into a ground job they quit to go fly civvie choppers.
Quote from: Tonitrus on September 23, 2011, 06:37:41 PM
Quote from: AnchorClanker on September 22, 2011, 07:45:56 PM
Quote from: Ed Anger on September 22, 2011, 07:42:12 PM
Maybe I can parlay my AAS degree into some sweet, sweet trigger time.
:lol: Maybe. I've heard Army warrants are flying some as well... for the same reasons that they are flying the choppers.
Have commissioned officer status, cheaper than LTs. :lol:
I think the USAF allows NCOs to fly the unarmed versions, but if it's packin' a Hellfire, it's an officer's flight.
As I understand it, the usual configuration is an officer pilot and enlisted sensor operator (thought about retraining into that job).
Yes, for the USAF. Not sure about the Army.
Quote from: Baron von Schtinkenbutt on September 24, 2011, 01:29:49 PM
Quote from: AnchorClanker on September 22, 2011, 07:45:56 PM
:lol: Maybe. I've heard Army warrants are flying some as well... for the same reasons that they are flying the choppers.
Have commissioned officer status, cheaper than LTs. :lol:
I think the USAF allows NCOs to fly the unarmed versions, but if it's packin' a Hellfire, it's an officer's flight.
I thought the driver behind the Army putting warrants in the pilot's seat is that warrants don't have the pressure to move up the ranks, and thus out of the pilot's seat. The Army was losing quite a few chopper pilots who joined Army Aviation primarily because they wanted to fly, and as soon as they were forced to rotate into a ground job they quit to go fly civvie choppers.
That's part of it, and it's also that WOs are cheaper than JOs.
Killing civilians is the future of warfare.
I can't wait for robotic infantry.
I can be a 70-year-old dude in front of a cozy computer, leading a robotic squad into a building by remote, killing moonslimbs by the score.
I can also have a cheap kamikaze robot walking into an enemy safehouse.
Which brings up the point that suicide bombers, and Muslims in general, are nothing more than biological drones.
Mechanical drones need a computer to receive orders; Muslim drones only need a guy with a speaker at the local mosque.
Not only is that never going to happen (technology won't get there, it'll be too expensive, and you'll never be an officer), but also the Muslims will have won long before you get that old.
Quote from: Neil on September 25, 2011, 03:59:41 PM
Not only is that never going to happen (technology won't get there, it'll be too expensive, and you'll never be an officer), but also the Muslims will have won long before you get that old.
You failed to address my "biological drones" argument.
Disappointing.
You're a biological drone. With no semen.
Quote from: CountDeMoney on September 25, 2011, 09:32:51 PM
You're a biological drone. With no semen.
And I'm gonna eat your stupid cat.
Alive. A leg at a time.
Too bad he ain't kosher.
You may be able to truck that shit with pederast fags in cafes trying to be noticed with their Apples, but my cat will wrap you up in her anti-Zionist attack stance--the deadly lian quan joo--flay your nutsack off and teabag you with it.
Raz status: Amused by this exchange.
Quote from: Siege on September 25, 2011, 09:30:53 PM
Quote from: Neil on September 25, 2011, 03:59:41 PM
Not only is that never going to happen (technology won't get there, it'll be too expensive, and you'll never be an officer), but also the Muslims will have won long before you get that old.
You failed to address my "biological drones" argument.
Disappointing.
The closest thing to a biological drone is an infantryman. Can't think, just follows commands.
Quote from: Neil on September 25, 2011, 09:54:24 PM
Quote from: Siege on September 25, 2011, 09:30:53 PM
Quote from: Neil on September 25, 2011, 03:59:41 PM
Not only is that never going to happen (technology won't get there, it'll be too expensive, and you'll never be an officer), but also the Muslims will have won long before you get that old.
You failed to address my "biological drones" argument.
Disappointing.
The closest thing to a biological drone is an infantryman. Can't think, just follows commands.
You're clueless, no offense though.
Drone on and on.
I was talking about the thread, your post just happened to be the one before mine.
Quote from: 11B4V on September 26, 2011, 09:24:04 AM
Quote from: Neil on September 25, 2011, 09:54:24 PM
Quote from: Siege on September 25, 2011, 09:30:53 PM
Quote from: Neil on September 25, 2011, 03:59:41 PM
Not only is that never going to happen (technology won't get there, it'll be too expensive, and you'll never be an officer), but also the Muslims will have won long before you get that old.
You failed to address my "biological drones" argument.
Disappointing.
The closest thing to a biological drone is an infantryman. Can't think, just follows commands.
You're clueless, no offense though.
None taken. There's nothing wrong with holding the infantry in contempt, especially in volunteer armies.