Global military buildup

Started by Threviel, April 15, 2022, 04:53:11 AM


DGuller

Quote from: Jacob on April 19, 2022, 11:57:11 AM
Quote from: Josquius on April 19, 2022, 11:53:31 AM
Actively choosing to blow up a convoy of refugees.

... not to mention actively choosing to blow up your own people, say because you misread the armbands, because vehicle markings got obscured, etc.
Friendly fire is a fact of life.  If you deploy artillery or bombers, you're going to kill your own people sometimes, and yet no one is considering not using them for those reasons.

I don't think the military shares the risk aversion expressed here.  The cold reality is that if you have a weapons system that prevents 1000 kills of your own by the enemy at the cost of increasing friendly fire kills by 100, you're going to use it as much as you can.  Unreasonable risk aversion is not a luxury you can afford in an endeavor where people die a lot.

crazy canuck

The notion that friendly fire deaths are an accepted risk if more of the enemy are generally killed is not accurate.  When Canadian soldiers were killed by friendly American fire in Afghanistan in 2002, two separate boards of inquiry were held, which led to changes in the way these things were planned and coordinated.

The response was definitely not to simply accept that more of the enemy is killed and so all was fine.

The Brain

Quote from: crazy canuck on April 19, 2022, 12:51:35 PM
The notion that friendly fire deaths are an accepted risk if more of the enemy are generally killed is not accurate.  When Canadian soldiers were killed by friendly American fire in Afghanistan in 2002, two separate boards of inquiry were held, which led to changes in the way these things were planned and coordinated.

The response was definitely not to simply accept that more of the enemy is killed and so all was fine.

FWIW I think the notion DG mentioned is a more relevant one.
Women want me. Men want to be with me.

Barrister

Quote from: DGuller on April 19, 2022, 12:04:52 PM
Friendly fire is a fact of life.  If you deploy artillery or bombers, you're going to kill your own people sometimes, and yet no one is considering not using them for those reasons.

I don't think the military shares the risk aversion expressed here.  The cold reality is that if you have a weapons system that prevents 1000 kills of your own by the enemy at the cost of increasing friendly fire kills by 100, you're going to use it as much as you can.  Unreasonable risk aversion is not a luxury you can afford in an endeavor where people die a lot.

While I understand your point, I very much doubt that western militaries would find a 10:1 ratio of extra enemy casualties to friendly fire casualties acceptable.

And are there any autonomous military drones that are completely autonomous in terms of firing?  Do they still not all require a human to "pull the trigger" first?
Posts here are my own private opinions.  I do not speak for my employer.

Zanza

As we are currently seeing in Ukraine, the Russians, at least, do not care about collateral damage among civilians. Indeed, they seem to inflict it purposefully. In such a scenario, a drone misfiring occasionally seems completely irrelevant.

The Brain

Quote from: Barrister on April 19, 2022, 01:08:05 PM
Quote from: DGuller on April 19, 2022, 12:04:52 PM
Friendly fire is a fact of life.  If you deploy artillery or bombers, you're going to kill your own people sometimes, and yet no one is considering not using them for those reasons.

I don't think the military shares the risk aversion expressed here.  The cold reality is that if you have a weapons system that prevents 1000 kills of your own by the enemy at the cost of increasing friendly fire kills by 100, you're going to use it as much as you can.  Unreasonable risk aversion is not a luxury you can afford in an endeavor where people die a lot.

While I understand your point, I very much doubt that western militaries would find a 10:1 ratio of extra enemy casualties to friendly fire casualties acceptable.

And are there any autonomous military drones that are completely autonomous in terms of firing?  Do they still not all require a human to "pull the trigger" first?

Don't people read anymore?
Women want me. Men want to be with me.

Jacob

Quote from: DGuller on April 19, 2022, 12:04:52 PM
Friendly fire is a fact of life.  If you deploy artillery or bombers, you're going to kill your own people sometimes, and yet no one is considering not using them for those reasons.

I don't think the military shares the risk aversion expressed here.  The cold reality is that if you have a weapons system that prevents 1000 kills of your own by the enemy at the cost of increasing friendly fire kills by 100, you're going to use it as much as you can.  Unreasonable risk aversion is not a luxury you can afford in an endeavor where people die a lot.

The argument is not that the military is particularly averse or not averse to friendly fire incidents in pursuit of whatever goal. The argument is that the "ground complexity" of combat operations is orders of magnitude greater than that of landing commercial aircraft.

Jacob

Quote from: Zanza on April 19, 2022, 01:11:39 PM
As we are currently seeing in Ukraine, the Russians, at least, do not care about collateral damage among civilians. Indeed, they seem to inflict it purposefully. In such a scenario, a drone misfiring occasionally seems completely irrelevant.

Indeed.

However, they're not - as I understand it - on the cutting edge of AI or drone technology.

Zanza

There are existing IFF (identification friend or foe) mechanisms that could be used to decrease the likelihood of friendly fire losses from autonomous drones.

In general, it's just a question of the threshold set by the human programmer of the drone. Let the sensors and AI calculate a probability that whatever it detects needs to be engaged, then set the threshold according to the magnitude of error you are willing to accept. Democratic militaries would likely set this higher, but also have better sensors/AI; authoritarian militaries might set it lower, as they might be less averse to collateral damage.
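A minimal sketch of what that kind of threshold rule could look like (hypothetical names and values; it assumes the drone's sensor/AI stack already produces a hostile-classification probability and an IFF response, which is a big simplification):

```python
from dataclasses import dataclass

@dataclass
class Contact:
    """Hypothetical output of the drone's sensor/AI stack for one detected object."""
    track_id: str
    p_hostile: float    # model's estimated probability that the contact is hostile
    iff_friendly: bool  # True if the contact answered an IFF interrogation as friendly

# Threshold chosen by the human programmer/operator. It encodes how much
# classification error you are willing to accept: a more risk-averse force
# sets it higher, a force less averse to collateral damage sets it lower.
ENGAGEMENT_THRESHOLD = 0.99

def may_engage(contact: Contact, threshold: float = ENGAGEMENT_THRESHOLD) -> bool:
    """Engage only if the contact is not IFF-friendly and the hostile
    probability clears the operator-set threshold."""
    if contact.iff_friendly:
        return False  # a positive IFF response overrides the model entirely
    return contact.p_hostile >= threshold

# A contact the model is 97% sure about is still held back at a 0.99 threshold.
print(may_engage(Contact("track-7", p_hostile=0.97, iff_friendly=False)))  # False
```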

Zanza

Quote from: Jacob on April 19, 2022, 01:17:14 PM
Quote from: Zanza on April 19, 2022, 01:11:39 PM
As we are currently seeing in Ukraine, the Russians, at least, do not care about collateral damage among civilians. Indeed, they seem to inflict it purposefully. In such a scenario, a drone misfiring occasionally seems completely irrelevant.

Indeed.

However, they're not - as I understand it - on the cutting edge of AI and drone technology.
No, but China is.

Jacob

A couple of additional thoughts:

1) I don't think the challenge posed by the complexity of the battle space - including the difficulty of distinguishing between friendlies, hostiles, and civilians - is just about finding an acceptable ratio of enemy casualties to friendly fire ones. It is also about the vulnerability to being actively spoofed or baffled.

So continuing from that thought, I suppose autonomous AI drones may be easier to implement in, say, a naval context where there are fewer actors and elements compared to fighting on land.

2) I believe that one of the current lessons from the Ukrainian-Russian war is that the Western way of war is superior to the Russian one due to a highly professional core of NCOs who are able to make autonomous decisions, to understand the larger tactical and strategic objectives (to make decisions two levels above their rank, I believe people are saying), and to act independently on their own initiative. This type of decision-making is even harder to program competently than "fly around and kill all the enemies you see, according to this list of priority targets, while avoiding killing civilians or our own people."

Berkut

Quote from: Barrister on April 19, 2022, 01:08:05 PM
Quote from: DGuller on April 19, 2022, 12:04:52 PM
Friendly fire is a fact of life.  If you deploy artillery or bombers, you're going to kill your own people sometimes, and yet no one is considering not using them for those reasons.

I don't think the military shares the risk aversion expressed here.  The cold reality is that if you have a weapons system that prevents 1000 kills of your own by the enemy at the cost of increasing friendly fire kills by 100, you're going to use it as much as you can.  Unreasonable risk aversion is not a luxury you can afford in an endeavor where people die a lot.

While I understand your point, I very much doubt that western militaries would find a 10:1 ratio of extra enemy casualties to friendly fire casualties acceptable.

" The cold reality is that if you have a weapons system that prevents 1000 kills of your own by the enemy at the cost of increasing friendly fire kills by 100"

He is saying that if you can PREVENT 1000 friends getting kills at the cost of 100 friendly fire casualties, you should do that.
"If you think this has a happy ending, then you haven't been paying attention."

select * from users where clue > 0
0 rows returned

Jacob

Similar to the potentially lower bar for autonomous AI in surface (or sub-surface) naval warfare compared to ground warfare, I suppose the bar of entry may also be lower for autonomous AI contesting air supremacy.

crazy canuck

Quote from: Berkut on April 19, 2022, 01:33:56 PM
Quote from: Barrister on April 19, 2022, 01:08:05 PM
Quote from: DGuller on April 19, 2022, 12:04:52 PM
Friendly fire is a fact of life.  If you deploy artillery or bombers, you're going to kill your own people sometimes, and yet no one is considering not using them for those reasons.

I don't think the military shares the risk aversion expressed here.  The cold reality is that if you have a weapons system that prevents 1000 kills of your own by the enemy at the cost of increasing friendly fire kills by 100, you're going to use it as much as you can.  Unreasonable risk aversion is not a luxury you can afford in an endeavor where people die a lot.

While I understand your point, I very much doubt that western militaries would find a 10:1 ratio of extra enemy casualties to friendly fire casualties acceptable.

" The cold reality is that if you have a weapons system that prevents 1000 kills of your own by the enemy at the cost of increasing friendly fire kills by 100"

He is saying that if you can PREVENT 1000 friends getting kills at the cost of 100 friendly fire casualties, you should do that.

Yes, but that reasoning is fallacious if it is possible to bring the friendly fire kills down to a lower number, just as the military now attempts to do.  No one plans an operation that accepts the possibility of a 10% friendly fire kill rate.  Quite the contrary: friendly fire events are avoided as much as possible.

Jacob

Quote from: Zanza on April 19, 2022, 01:17:47 PM
No, but China is.

I understand they're purchasing a fair bit of cutting edge stuff from the US, yes.

But fair point: China is trying to go all in on AI, and they may be willing to accept a higher ratio of friendly fire and civilian casualties than the West.

Still, I think the problem of complexity is not so much about kill ratios (though that shouldn't be discounted as an issue) but about vulnerability and efficacy. Highly complex, dynamic environments with many edge cases create the risk of undesired behaviour as well as vulnerabilities to exploits (as players of many online games can attest). This can, of course, be iterated through, but that too is a non-trivial task.