gowlerk

Self driving Uber fatality


DJL


Quote

There's serious talk, they may have even done it, about banning the cars from any public roads until they can be proven safe. The idea was to build simulations but I have to wonder if that's even possible. Can the levels of complexity in driving be created?
It still seems like the best idea is to test them with a driver ready to take over. I know, it's not foolproof, obviously, and the problem is that people will demand so much more than they're already putting up with.
I just wish they'd hurry up. I want a car that wakes me up when we get there.



I have to imagine they're using the miles and miles of test track we have all over the country. They don't even need to put a person in them, just add extra fuel capacity and put about 100 of them out there only taking breaks to refuel.


That's a good way to test for endurance but I don't think that's the problem.

You obviously have not watched an adequate quantity of Myth Busters. I'm picturing an elaborate course of crosswalks, trash bags blowing around, obstacles rising and falling, feral pigs and even a quantity of live drivers in crash-up derby cars.

Feral pigs sounds more like Top Gear.

And if you know what I was referring to, then you're probably dreading the day that a self-driving car actually happens.

:(
lisa
WSCR 594
FB 1023
CBDB 9

oldwomanc6


Quote

You obviously have not watched an adequate quantity of Myth Busters. I'm picturing an elaborate course of crosswalks, trash bags blowing around, obstacles rising and falling, feral pigs and even a quantity of live drivers in crash-up derby cars.

Feral pigs sounds more like Top Gear.

At the airport where I jump, I'm a member of the boosters (we lease it from the county for a dollar a year and take care of it ourselves). There are lots of events; some of them have been going on for years. But the one I don't even like to think about is the weekend the Scouts spend camping there. Lately the wild pigs keep digging up the area beside the runway. Right where they'll be camping.

I am not at all surprised by that. And I don't really blame the 'safety guy' for doing it.

They were on a stretch of road that isn't supposed to have cross traffic. Running straight & steady. The autonomous programs handle that stuff very well. I'm going to guess that the 'human backup' would only really pay attention in places/situations where there's concern that the auto programs might miss something.

And there's virtually no way a person is going to pay close attention all the time, especially when there's no apparent reason to. Even the most vigilant will lose concentration once it's clear that the car drives itself well.
"There are NO situations which do not call for a French Maid outfit." Lucky McSwervy

"~ya don't GET old by being weak & stupid!" - Airtwardo

Something I'm curious about:
Did Uber fully explain to the drivers that the "autonomous" system was incapable of making an emergency stop?
"There are only three things of value: younger women, faster airplanes, and bigger crocodiles" - Arthur Jones.

billvon

>Did Uber fully explain to the drivers that the "autonomous" system was incapable of
>making an emergency stop?

Why do you think it wasn't capable of making an emergency stop?



At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision. According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.

Source: https://arstechnica.com/cars/2018/05/emergency-brakes-were-disabled-by-ubers-self-driving-software-ntsb-says/
"There are only three things of value: younger women, faster airplanes, and bigger crocodiles" - Arthur Jones.

ryoder

'Driver' of autonomous Uber was watching Hulu during fatal Arizona crash

https://www.cnet.com/google-amp/news/self-driving-uber-driver-hulu/



I was curious when or how this would pop up. In my opinion you're still the PIC, so ultimately it's your fault. We'll be sure to see this guy countersuing Uber for misleading him into thinking that the vehicle should have detected the threat and stopped.
"I encourage all awesome dangerous behavior." - Jeffro Fincher

DJL

Quote

We'll be sure to see this guy countersuing Uber for misleading him into thinking that the vehicle should have detected the threat and stopped.

The driver was Rafaela Vasquez, a woman.
"There are only three things of value: younger women, faster airplanes, and bigger crocodiles" - Arthur Jones.

Quote

At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision. According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.


The car's own radar-based emergency braking system was disabled. I think it's called CitySafe or something like that. It's disabled so that two different autonomous driving/ADAS systems don't try to control the car at the same time.

However, nothing I have seen indicated that the Uber system itself was incapable of braking (or maneuvering) to avoid obstacles. Indeed, the car wouldn't have lasted five minutes on a US road without that capability.
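
To put that 1.3-second figure in some perspective, here's a rough back-of-the-envelope sketch in Python. The speed, braking deceleration, and actuation delay below are assumptions for illustration only, not numbers from the NTSB report.

```python
# Rough check: what does braking 1.3 s before impact buy you?
# Assumed figures for illustration only (NOT from the NTSB report):
#   speed ~40 mph, hard braking ~0.7 g, ~0.25 s before the brakes bite.

MPH_TO_MS = 0.44704
G = 9.81

speed = 40 * MPH_TO_MS      # ~17.9 m/s
decel = 0.7 * G             # ~6.9 m/s^2
warning_time = 1.3          # seconds before impact, per the NTSB quote above
delay = 0.25                # assumed actuation delay

distance_available = speed * warning_time                  # ~23 m to the pedestrian
distance_needed = speed * delay + speed**2 / (2 * decel)   # ~28 m for a full stop

# Speed remaining at the impact point if braking starts after the delay:
braking_room = max(distance_available - speed * delay, 0.0)
speed_at_impact = max(speed**2 - 2 * decel * braking_room, 0.0) ** 0.5

print(f"distance available:        {distance_available:.1f} m")
print(f"distance for a full stop:  {distance_needed:.1f} m")
print(f"impact speed with braking: {speed_at_impact / MPH_TO_MS:.0f} mph (vs {speed / MPH_TO_MS:.0f} mph without)")
```

With those assumed numbers the car can't quite stop in time, but it sheds more than half its speed, which is presumably what the report means by braking to "mitigate" rather than avoid the collision.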

ryoder

Quote

We'll be sure to see this guy countersuing Uber for misleading him into thinking that the vehicle should have detected the threat and stopped.

The driver was Rafaela Vasquez, a woman.

And I'd be willing to bet that all of this was laid out in detail in her employment agreement. It would be interesting to know what it says.

Bob_Church


And I'd be willing to bet that all of this was laid out in detail in her employment agreement. It would be interesting to know what it says.



And I'd like to know how she and other drivers were briefed, and how they all actually behaved -- in the real world, not just what was on a long form she signed.

I haven't gone back to check on how long she was driving, but if drivers were driving around a couple hours at night in boring areas, were they all keeping their eyes on the road, or were a large proportion also playing with their phones a bunch of the time?

Interesting that "distracted driving" can apparently apply when someone else fails to observe right-of-way rules.
(Yeah I know one has some duty to not run over peds who step out onto the road but still.)

pchapman

Quote

I haven't gone back to check on how long she was driving, but if drivers were driving around a couple hours at night in boring areas, were they all keeping their eyes on the road, or were a large proportion also playing with their phones a bunch of the time?

I like audiobooks for long drives. They cut way down on the boredom while not being any more distracting than listening to the radio.

ryoder

>Did Uber fully explain to the drivers that the "autonomous" system was incapable of
>making an emergency stop?

Why do you think it wasn't capable of making an emergency stop?



At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision. According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.

Source: https://arstechnica.com/cars/2018/05/emergency-brakes-were-disabled-by-ubers-self-driving-software-ntsb-says/

I interpreted that as meaning the emergency braking that comes with the car from the factory is disabled when the Uber AI is in control, not that the Uber AI itself was incapable of making an emergency stop.
Math tutoring available. Only $6! per hour! First lesson: Factorials!

jcd11235

Quote

I interpreted that as meaning the emergency braking that comes with the car from the factory is disabled when the Uber AI is in control, not that the Uber AI itself was incapable of making an emergency stop.

"The vehicle operator is relied on to intervene and take action."

The quote was directly from the NTSB report.
I don't see any ambiguity in it.

NTSB prelim: https://www.ntsb.gov/news/press-releases/Pages/NR20180524.aspx
"There are only three things of value: younger women, faster airplanes, and bigger crocodiles" - Arthur Jones.

>"The vehicle operator is relied on to intervene and take action."

>The quote was directly from the NTSB report.
>I don't see any ambiguity in it.

Agreed.

But some aircraft are capable of landing in zero-zero conditions. And if an aircraft crashed during a cat IIIc approach due to an autothrottle failure, the NTSB would still say something like "the pilot is relied upon to intervene and take action" - even if the aircraft is normally capable of doing that completely on its own.

We're at the point where automation significantly improves safety but isn't perfect. Overall, fewer people would die in accidents with automotive automation. But some people who wouldn't have died otherwise, will. Just as some folks are hurt by seat belts, and some folks are hurt by vaccines.

Wendy P.
There is nothing more dangerous than breaking a basic safety rule and getting away with it. It removes fear of the consequences and builds false confidence. (tbrown)

The ratio of those hurt by seatbelts and vaccines to those helped by them is slanted immensely in favor of using them (both belts and shots).

There's virtually no intelligent debate about this.

I don't think we are there yet with autonomous cars.

But I'd bet that the autonomous cars are less dangerous than humans, on average.
"There are NO situations which do not call for a French Maid outfit." Lucky McSwervy

"~ya don't GET old by being weak & stupid!" - Airtwardo

Quote

We're at the point where automation significantly improves safety but isn't perfect. Overall, fewer people would die in accidents with automotive automation. But some people who wouldn't have died otherwise, will. Just as some folks are hurt by seat belts, and some folks are hurt by vaccines.


Yep. However, we will hold autonomous vehicles to a much higher standard than humans. In other words, we could get to the point where, if everyone used autonomous vehicles, then yearly road fatality rates would drop by 90% - but we will still consider them less safe, because "did you hear about when XXX car killed its driver? Do you want to be that guy?"

There was another thread talking about the most common biases, and this is the "neglecting probability" bias. People worry more about dying from a terrorist attack than dying from heart disease, even though heart disease is many, many orders of magnitude more likely to kill you. Because terrorist attacks are scary and get a lot of news coverage.
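
To put rough numbers on that hypothetical, here's a quick sketch; the ~37,000 baseline is an approximate figure for annual US road deaths, and the 90% cut is the hypothetical from the paragraph above, not a prediction.

```python
# Quick arithmetic for the "90% safer but still scary" point above.
# The baseline is approximate; the 90% reduction is hypothetical.

baseline_deaths_per_year = 37_000      # rough annual US road fatalities
reduction = 0.90                       # hypothetical drop with full autonomy

remaining = baseline_deaths_per_year * (1 - reduction)
print(f"{remaining:.0f} deaths/year, about {remaining / 365:.0f} per day")
# -> 3700 deaths/year, about 10 per day: still a steady supply of
#    "robot car kills driver" headlines, even with ~33,000 lives saved annually.
```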

wolfriverjoe


But I'd bet that the autonomous cars are less dangerous than humans, on average.



Interesting article about the accident rate of Waymo cars, both at fault and not at fault: https://www.huffingtonpost.com/entry/how-safe-are-self-driving-cars_us_5908ba48e4b03b105b44bc6b

From the article:
Typical experienced human drivers (excluding teens and drivers in their early 20s) are involved in ~200-400 crashes per million miles. Google has 1 at-fault accident in 2 million miles, so MUCH safer. However, Google cars are involved in ~600 crashes per million miles, far higher than the typical average. Most of these are the Google car getting rear-ended.

The theory is that Google cars stop for things unexpectedly or abruptly, and get rear-ended as a result.
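
Restating the quoted figures in the same units makes the comparison easier to see; the sketch below just normalizes the article's numbers (as quoted above) to crashes per million miles and isn't an independent check of them.

```python
# Normalizing the figures quoted from the article to "per million miles".
# These are the post's numbers as quoted above, not independently verified.

google_at_fault_per_million = 1 / 2.0   # 1 at-fault crash in 2 million miles
google_all_per_million = 600            # total crashes, mostly being rear-ended
human_all_per_million = (200, 400)      # typical experienced human drivers

print(f"Google, at fault:    {google_at_fault_per_million} per million miles")
print(f"Google, all crashes: {google_all_per_million} per million miles")
print(f"Humans, all crashes: {human_all_per_million[0]}-{human_all_per_million[1]} per million miles")
# Note the at-fault rate and the all-crash rates measure different things,
# which is exactly the "much safer, yet rear-ended far more often" pattern above.
```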


It's flare not flair, brakes not breaks, bridle not bridal, "could NOT care less" not "could care less".

Not a "self driving car" but a Tesla on "Auto-Pilot".

https://www.bbc.com/news/technology-44439523


But the point is the same either way. Either you are driving or the computer is driving. A backup driver will never be able to maintain enough vigilance to react quickly for any length of time. The only good a backup driver does is what someone else said earlier: someone to take over when the computers need a reboot.

Autonomous vehicles will not be safe as long as they need a human backup. An autonomous vehicle should not have a steering wheel. The current testing phase is risky.

gowlerk

Not a "self driving car" but a Tesla on "Auto-Pilot".
Autonomous vehicles will not be safe as long as they need a human backup. An autonomous vehicle should not have a steering wheel. The current testing phase is risky.



You are calling Tesla Autopilot a "testing phase", when it is an actual product.

The key question behind "should we even allow semi-autonomous cars" is this: even if their drivers can be inattentive, do the cars still perform more safely than human-only drivers?

And I think we need to separate the category of "testing fully autonomous cars with a safety driver" from "selling semi-autonomous cars". There is no good way to get to fully autonomous cars without some form of safety driver, and the track record of that testing has been really good. Waymo appears to be in the home stretch of developing its product, so hopefully in a few more years this debate will be moot.
It's flare not flair, brakes not breaks, bridle not bridal, "could NOT care less" not "could care less".

gowlerk

Not a "self driving car" but a Tesla on "Auto-Pilot".

https://www.bbc.com/news/technology-44439523


But the point is the same either way. Either you are driving or the computer is driving. A backup driver will never be able to maintain enough vigilance to react quickly for any length of time. The only good a backup driver does is what someone else said earlier. Someone to take over when the computers need a re-boot.

Autonomous vehicles will not be safe as long as they need a human backup. An autonomous vehicle should not have a steering wheel. The current testing phase is risky.



This sounds very much like the insurance company laying the groundwork for a lawsuit. There are urban legends of this happening, but I've never heard of a real instance.
"I encourage all awesome dangerous behavior." - Jeffro Fincher
