Elon Musk: Always A Douche

Started by garbon, July 15, 2018, 07:01:42 PM

crazy canuck

Quote from: Zanza on January 04, 2024, 08:21:28 AM
Quote from: crazy canuck on January 04, 2024, 07:58:02 AM
Quote from: Zanza on January 03, 2024, 08:06:18 PM
No idea whether we will reach level 5 autonomous driving capability, but level 4 on the highway looks realistic in the next few years. If the regulatory framework on driving times for commercial drivers reflects this, on-highway heavy truck traffic has a business case for even expensive autonomous driving capability. If it means your truck is on the road close to 24 hours a day instead of maybe half of that now, it will pay for itself.

Freightliner and its subsidiary Torc in North America look very promising there. Also a cooperation between Freightliner and Waymo (Google).

It's not really a question of economic feasibility.  If that were the only metric, the tech is close, as you say.  The main question is safety.
Sure, but I predict that will eventually help adoption.

The most basic autonomous driving safety feature, namely autonomous emergency braking for collision prevention, is already mandatory by law in the EU and by consensual agreement with NHTSA in the US. The same goes for other mechanisms that override human error, like ESC.

In the end, machines are better drivers. The problem with autonomous driving is not the other autonomous vehicles, but human drivers.

Outlaw those and it becomes perfectly safe. :P 


You're not playing fair. It's easy to design an autonomous braking system that prevents accidents. It's much more difficult to design a system that drives a vehicle without human assistance. That is where the safety factor comes in, and that is the problem that currently appears insurmountable; as I understand it, it would require technology that does not yet exist.

Zanza

Which case of self-driving is more critical from a safety perspective than emergency braking? That's the last line of defense after all before people are potentially harmed. Any other driving situation is by definition less critical. If it wasn't, you would be back at emergency braking because that's always the best solution in any dangerous situation.

Cars know the way, they can accelerate, decelerate, switch gears (if they have any), change lanes, signal (ok, might be hard for BMW), turn around corners, stop at traffic lights or zebra crossings, etc.

And of course the rate of error of human drivers would be unacceptable for a machine. It has to be much better to be accepted, which is part of the challenge. Building something that drives like the average human (or the lowest percentile of drivers  :o ) is probably feasible with current technology.

What's stopping autonomous driving is legal liability or regulatory constraints as well as reputation considerations. Plus the cost of the technology.

DGuller

Quote from: Zanza on January 04, 2024, 11:18:57 AM
If it wasn't, you would be back at emergency braking because that's always the best solution in any dangerous situation.
Is emergency braking really the best solution in every single situation?  I've always heard that it is, but whenever I challenged it, I never got an answer that wasn't rooted in driver's ed.  I get why driver's ed would over-simplify the instructions people have to apply at a few of the most stressful moments of their lives, but I don't think that reasoning carries over to computers.

I get why emergency braking is the best course of action if a perpendicular wall of infinite width suddenly appears in front of you on the road.  Hard braking would take half as much distance as trying to make a 90 degree turn, assuming your tires have the same grip in the longitudinal and lateral directions.  However, what if a small object appears that is currently in the path of your left-front tire?  Wouldn't it be safer to steer slightly right than to get hard on the brakes?  In driver's ed swerving is discouraged, but I think it's discouraged for human-factors reasons, not because the physics doesn't work.  I don't think those reasons apply to automated systems.
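The comparison can be put in numbers. A minimal sketch, assuming roughly 0.8 g of grip available equally in the longitudinal and lateral directions (illustrative figures, not real vehicle data):

```python
import math

GRIP = 8.0  # assumed peak acceleration from tire grip, m/s^2 (~0.8 g)

def braking_distance(v):
    """Distance (m) to a full stop from speed v (m/s) at maximum braking."""
    return v ** 2 / (2 * GRIP)

def swerve_distance(v, w):
    """Longitudinal distance (m) covered while shifting w metres sideways
    at constant speed v, using the full grip budget laterally (w = a*t^2/2)."""
    t = math.sqrt(2 * w / GRIP)
    return v * t

v = 30.0  # about 108 km/h
print(braking_distance(v))      # ~56 m to stop completely
print(swerve_distance(v, 1.5))  # ~18 m to shift 1.5 m sideways
```

For a small obstacle, moving 1.5 m sideways takes roughly a third of the distance a full stop needs, which is the physics behind the swerving intuition; for a wall spanning the whole road, no lateral displacement helps and braking wins.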

Where am I wrong?

Zanza

Swerving is much less risky if you combine it with braking hard to reduce your speed.

Other than fleeing from a T-Rex or a tsunami/lava, basically every dangerous situation while driving gets less dangerous by reducing your speed.
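That rule falls straight out of the v-squared scaling: both crash energy and stopping distance grow with the square of speed, so halving speed quarters both. A toy calculation with illustrative numbers:

```python
def kinetic_energy(mass_kg, v):
    """Kinetic energy (J) of a vehicle of mass_kg at speed v (m/s)."""
    return 0.5 * mass_kg * v ** 2

def stopping_distance(v, decel=8.0):
    """Distance (m) to stop from v (m/s) at a constant deceleration (m/s^2)."""
    return v ** 2 / (2 * decel)

# Halving speed from 30 m/s to 15 m/s quarters both quantities.
print(kinetic_energy(1500, 30.0) / kinetic_energy(1500, 15.0))  # 4.0
print(stopping_distance(30.0) / stopping_distance(15.0))        # 4.0
```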

Especially when you consider that even now, brake assistance systems take following cars into account when applying brake power to try to avoid a collision.

You should of course not stop at random, especially on a highway. But that again is true for both human and AI drivers, and it works with current adaptive cruise control.

DGuller

Quote from: Zanza on January 04, 2024, 12:00:19 PM
Swerving is much less risky if you combine it with braking hard to reduce your speed.
Is it always, though?  I haven't done the math, but I imagine that if the object you need to avoid is close enough, braking might cross you over from being able to steer around the obstacle to not being able to steer around it.  The grip you spend on braking is the grip you can't apply to steering.
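This is the friction-circle trade-off: the tires have one grip budget, and longitudinal and lateral demands share it. A small sketch, with the grip figure assumed for illustration:

```python
import math

GRIP = 8.0  # assumed total grip budget, m/s^2 (~0.8 g); illustrative

def lateral_capacity(a_brake):
    """Lateral acceleration (m/s^2) still available while braking at a_brake,
    under the friction-circle constraint a_brake^2 + a_lat^2 <= GRIP^2."""
    if a_brake >= GRIP:
        return 0.0
    return math.sqrt(GRIP ** 2 - a_brake ** 2)

print(lateral_capacity(0.0))  # 8.0 -> the full budget is left for steering
print(lateral_capacity(6.0))  # ~5.3 -> hard braking eats most of it
print(lateral_capacity(8.0))  # 0.0 -> maximum braking leaves nothing to steer with
```

In this model, past some closing distance, committing the whole budget to braking does remove the option of steering around the obstacle.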

Jacob

@Zanza - you made a joke about other drivers - but what about pedestrians (children and adults), animals, non-motorized vehicles, micro-mobility devices etc? How big a challenge do they pose?

Zanza

Quote from: DGuller on January 04, 2024, 12:10:02 PM
Quote from: Zanza on January 04, 2024, 12:00:19 PM
Swerving is much less risky if you combine it with braking hard to reduce your speed.
Is it always, though?  I haven't done the math, but I imagine that if the object you need to avoid is close enough, braking might cross you over from being able to steer around the obstacle to not being able to steer around it.  The grip you spend on braking is the grip you can't apply to steering.
I am sure there is some edge case where swerving is better. But as a general rule reduced speed = reduced danger is true.

crazy canuck

Quote from: Zanza on January 04, 2024, 11:18:57 AM
Which case of self-driving is more critical from a safety perspective than emergency braking? That's the last line of defense after all before people are potentially harmed. Any other driving situation is by definition less critical. If it wasn't, you would be back at emergency braking because that's always the best solution in any dangerous situation.

Cars know the way, they can accelerate, decelerate, switch gears (if they have any), change lanes, signal (ok, might be hard for BMW), turn around corners, stop at traffic lights or zebra crossings, etc.

And of course the rate of error of human drivers would be unacceptable for a machine. It has to be much better to be accepted, which is part of the challenge. Building something that drives like the average human (or the lowest percentile of drivers  :o ) is probably feasible with current technology.

What's stopping autonomous driving is legal liability or regulatory constraints as well as reputation considerations. Plus the cost of the technology.

Even in simulations, autonomous driving AI cannot achieve anything close to an acceptable level of safety.  Why do you say that it will be safer than human drivers one day?

Barrister

Quote from: Zanza on January 04, 2024, 11:18:57 AM
Which case of self-driving is more critical from a safety perspective than emergency braking? That's the last line of defense after all before people are potentially harmed. Any other driving situation is by definition less critical. If it wasn't, you would be back at emergency braking because that's always the best solution in any dangerous situation.

Cars know the way, they can accelerate, decelerate, switch gears (if they have any), change lanes, signal (ok, might be hard for BMW), turn around corners, stop at traffic lights or zebra crossings, etc.

And of course the rate of error of human drivers would be unacceptable for a machine. It has to be much better to be accepted, which is part of the challenge. Building something that drives like the average human (or the lowest percentile of drivers  :o ) is probably feasible with current technology.

What's stopping autonomous driving is legal liability or regulatory constraints as well as reputation considerations. Plus the cost of the technology.

I feel like you're confusing "most critical" with "most difficult".

I remember listening to a Malcolm Gladwell podcast on autonomous cars.  The takeaway was that he loved them - because they would immediately, in every situation, brake for a pedestrian.  That of course is awesome for pedestrians, but could make the car practically undriveable (in particular if, like Gladwell, you're deliberately trying to goof with the car).

The biggest problem with "defaulting to braking" I can imagine is on a highway, where perhaps the car just can't tell where it's going.  The worst thing you could do in such a situation is stop immediately.
Posts here are my own private opinions.  I do not speak for my employer.

Zanza

@CC: Not sure what you are referring to. Waymo, for example, seems to have acceptable levels of safety. Not in simulation, but in real traffic on American roads, in 2023, over millions of miles driven.

But ok, I am not here to convince anybody. If you prefer to believe that it is impossible to make a safe autonomous car,  I accept that.

Zanza

Quote from: Barrister on January 04, 2024, 02:03:46 PM
I remember listening to a Malcolm Gladwell podcast on autonomous cars.  The takeaway was that he loved them - because they would immediately, in every situation, brake for a pedestrian.  That of course is awesome for pedestrians, but could make the car practically undriveable (in particular if, like Gladwell, you're deliberately trying to goof with the car).
Sure. But cars not driving when you try to goof with them is annoying for the passengers, not a safety concern. If that's our biggest issue with autonomous driving, I feel we can overcome it. It's also not really that different from human drivers; I would not risk running over someone who tries to goof with me either.

Quote
The biggest problem with "defaulting to braking" I can imagine is on a highway, where perhaps the car just can't tell where it's going.  The worst thing you could do in such a situation is stop immediately.
Why would the car not be able to tell where it is going? I don't understand the situation you describe here. The car has sat nav, has exact maps, has great sensors able to "see" further and more than a human.

Zanza

Quote from: Jacob on January 04, 2024, 12:11:21 PM
@Zanza - you made a joke about other drivers - but what about pedestrians (children and adults), animals, non-motorized vehicles, micro-mobility devices etc? How big a challenge do they pose?
Isn't that all the same challenge? Avoid hitting an object that is in, or moves into, the current driving path.

Tamas

I am assuming self-driving cars would be eminently doable already if all cars were switched over at the same time. Predictable behaviour and linked communications, and voilà.

Jacob

Quote from: Zanza on January 04, 2024, 02:29:09 PM
Isn't that all the same challenge? Avoid hitting an object that is in, or moves into, the current driving path.

I suppose so... I imagined that pedestrians, bicyclists, small children, animals, etc. behave in ways that are less predictable than cars, so they would add complexity.

I don't know though - maybe that's all taken care of right now. I vaguely recall reading something about how Tesla's self-driving struggled with handling bicyclists who didn't follow standard car behaviour in traffic, for example, though that was a while ago.

It was a real question, not a rhetorical one. I don't know how developed self-driving AI are for more complex traffic scenarios.

Jacob

Quote from: Tamas on January 04, 2024, 02:31:51 PM
I am assuming self-driving cars would be eminently doable already if all cars were switched over at the same time. Predictable behaviour and linked communications, and voilà.

That assumes the most challenging issue is other cars, rather than things like children and dogs suddenly running into traffic, bicyclists not behaving like cars, some idiot deciding to ride their e-scooter down a busy street, or things like poorly marked construction zones.