gowlerk

Self-driving Uber fatality


DJL

***I just saw the released video from Tempe PD. I believe it is highly unlikely that a human driver would have avoided that accident.



I'm with you on that. Not a chance. There would be a dead person any way you cut it.

The LA Times article reminds me of when someone goes in. The guy had no AAD and no handles pulled, no action taken. He just screwed around, lost track and impacted. It happens. "Cause of death: Neither parachute opened."

Bob_Church

The LA Times article reminds me of when someone goes in. The guy had no AAD and no handles pulled, no action taken. He just screwed around, lost track and impacted. It happens. "Cause of death: Neither parachute opened."



I see the analogy a bit differently: in my analogy the parachutist was wearing an AAD and had it turned on. Sure, he didn't pull his handles, and yes, he impacted on a 500 ft hill, so the AAD would not have saved him, but it still should have fired.
It's flare not flair, brakes not breaks, bridle not bridal, "could NOT care less" not "could care less".

SethInMI

***The LA Times article reminds me of when someone goes in. The guy had no AAD and no handles pulled, no action taken. He just screwed around, lost track and impacted. It happens. "Cause of death: Neither parachute opened."



I see the analogy a bit differently: in my analogy the parachutist was wearing an AAD and had it turned on. Sure, he didn't pull his handles, and yes, he impacted on a 500 ft hill, so the AAD would not have saved him, but it still should have fired.

Ok. And you did watch the video? If so, then we're just seeing the same thing but taking away a very different impression.
Which is cool. Personally I don't see any way a machine moving at that speed had time to vary its course or speed enough to save the woman from herself. But that's just my impression.

Bob_Church


Ok. And you did watch the video? If so, then we're just seeing the same thing but taking away a very different impression.
Which is cool. Personally I don't see any way a machine moving at that speed had time to vary its course or speed enough to save the woman from herself. But that's just my impression.



Yeah, I watched it, and I agree that the Uber car and Uber safety driver bear no responsibility in her death. That said, if this happened during the day, and you watched her on video cross into the 1st lane then enter the car's lane and get hit, would you not wonder why the car didn't even slow down when the woman entered the 1st lane?

Because, from what Bill V and I are saying, the car's LIDAR sensor was not affected by the dark, so it should have given the car some warning that something was coming.

In Michigan, we have to worry about deer crossing the road, or in the road. I want my autonomous car (when it is practical and safe) to slow down if deer are heading into the road, not just smash into them like they are not even there. So the way things are at present, I am saying I want a Google Car, not an Uber Car.
It's flare not flair, brakes not breaks, bridle not bridal, "could NOT care less" not "could care less".

SethInMI

***
Ok. And you did watch the video? If so, then we're just seeing the same thing but taking away a very different impression.
Which is cool. Personally I don't see any way a machine moving at that speed had time to vary its course or speed enough to save the woman from herself. But that's just my impression.



Yeah, I watched it, and I agree that the Uber car and Uber safety driver bear no responsibility in her death. That said, if this happened during the day, and you watched her on video cross into the 1st lane then enter the car's lane and get hit, would you not wonder why the car didn't even slow down when the woman entered the 1st lane?

Because, from what Bill V and I are saying, the car's LIDAR sensor was not affected by the dark, so it should have given the car some warning that something was coming.

In Michigan, we have to worry about deer crossing the road, or in the road. I want my autonomous car (when it is practical and safe) to slow down if deer are heading into the road, not just smash into them like they are not even there. So the way things are at present, I am saying I want a Google Car, not an Uber Car.

It would be nice to get a video of what the car was seeing and when. When would she have moved into the field of view, how far out does it work, that sort of thing. And how much time it takes to work things out.
Ever since reading my first AI book many years ago, I can't help trying to analyze things when I'm standing at a corner or something. What would a computer have to scan in, and how could it parse it enough to make sense of the situation? It is, to put it mildly, a lot.

Bob_Church

******The LA Times article reminds me of when someone goes in. The guy had no AAD and no handles pulled, no action taken. He just screwed around, lost track and impacted. It happens. "Cause of death: Neither parachute opened."



I see the analogy a bit differently: in my analogy the parachutist was wearing an AAD and had it turned on. Sure, he didn't pull his handles, and yes, he impacted on a 500 ft hill, so the AAD would not have saved him, but it still should have fired.

Ok. And you did watch the video? If so, then we're just seeing the same thing but taking away a very different impression.
Which is cool. Personally I don't see any way a machine moving at that speed had time to vary its course or speed enough to save the woman from herself. But that's just my impression.

She was walking in the dark patch too; it's extra hard to make out objects when the lighting changes like that.
"I encourage all awesome dangerous behavior." - Jeffro Fincher

Quote

Uber safety driver bear no responsibility in her death.



I'm split on that, because his job was to act as a back-up to developing technology, not to dick around on his phone and not watch the road. Granted, I said he probably couldn't have done anything, but that doesn't change the job he was supposed to be doing.
"I encourage all awesome dangerous behavior." - Jeffro Fincher

>It would be nice to get a video of what the car was seeing and when.

What you really want is the LIDAR constellation around the time of the crash. But even that's not all that useful to humans; often objects don't look like what you expect them to because the constellation is fairly "sparse" at distance. So algorithms have been developed to be able to turn that sparse LIDAR return into a recognizable object (pedestrian vs blowing trash, warning cones vs spray from the car ahead etc.)

The accident investigation will likely look at the effectiveness of that algorithm as well as the fidelity of the LIDAR data. My money would be on an algorithm problem, given the car's lack of problems before the crash.
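To make "turning a sparse LIDAR return into a recognizable object" concrete, here is a minimal sketch of the very first step: grouping scattered returns into candidate objects with single-linkage distance clustering. This is purely illustrative (real pipelines run something like DBSCAN on 3-D point clouds followed by learned classifiers), and the `max_gap` value is an arbitrary assumption:

```python
def cluster_points(points, max_gap=0.5):
    """Group 2-D points into clusters: a point joins a cluster if it lies
    within max_gap metres of any point already in it (single linkage)."""
    clusters = []
    for p in points:
        home = None
        for c in clusters:
            if any((p[0] - q[0])**2 + (p[1] - q[1])**2 <= max_gap**2 for q in c):
                if home is None:
                    home = c
                    c.append(p)
                else:
                    home.extend(c)  # p bridges two clusters: merge them
                    c.clear()
        if home is None:
            clusters.append([p])    # p starts a new cluster
    return [c for c in clusters if c]

# Two sparse groups of returns ~10 m apart resolve into two candidate objects:
scan = [(0.0, 10.0), (0.2, 10.1), (0.1, 10.3),   # pedestrian-sized blob
        (8.0, 20.0), (8.3, 20.2)]                # second, smaller blob
print(len(cluster_points(scan)))   # 2
```

Even this toy version shows why sparse returns are hard for humans to eyeball: a "pedestrian" at distance may be only a handful of points.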

Phillbo

Uber's ability to test these cars in Arizona has been revoked.


Uber cars need intervention every 12 miles on average.
Google cars need human intervention every 5600 miles on average.

They are not doing something right. :S



I am not sure I agree with the revocation; hopefully it is temporary. I say this because the system concept of testing an automatic system with a human backup appeared to work as designed. The automatic system did not perform as expected, but it appears as if the safety driver (although not perfect) would not have been able to change the outcome.

What if this had happened to Google back in 2009 when it was getting started? Would that have stopped safety-driver testing for a month? A year? A decade?

The automatic car in test has to at least perform predictably enough that the human monitor can step in, but it does not have to be perfect. The human monitor does need to pay attention, and this may be the biggest flaw in this style of testing. Who or what is monitoring the monitor?
It's flare not flair, brakes not breaks, bridle not bridal, "could NOT care less" not "could care less".

gowlerk


Uber fatalities so far

Careers of actual taxi drivers
next
People who made money driving for Uber

I wonder how fervent all those arguments from Uber drivers defending scabbing taxi drivers out of a job are now.

from: https://www.theinformation.com/articles/uber-finds-deadly-accident-likely-caused-by-software-set-to-ignore-objects-on-road

"Uber has determined that the likely cause of a fatal collision involving one of its prototype self-driving cars in Arizona in March was a problem with the software that decides how the car should react to objects it detects, according to two people briefed about the matter.

The car’s sensors detected the pedestrian, who was crossing the street with a bicycle, but Uber’s software decided it didn’t need to react right away. That’s a result of how the software was tuned. Like other autonomous vehicle systems, Uber’s software has the ability to ignore “false positives,” or objects in its path that wouldn’t actually be a problem for the vehicle, such as a plastic bag floating over a road. In this case, Uber executives believe the company’s system was tuned so that it reacted less to such objects. But the tuning went too far, and the car didn’t react fast enough, one of these people said."

This makes sense to me. It was not the sensors, but the software.
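The kind of tuning described in that quote can be pictured as a per-class confidence threshold gating the decision to brake: raise a threshold and the planner ignores more "false positives", but it also reacts late to real obstacles. The class names and numbers below are invented for illustration; they are not Uber's actual values:

```python
# Illustrative per-class brake thresholds -- invented, not Uber's values.
BRAKE_THRESHOLDS = {
    "pedestrian": 0.50,
    "cyclist": 0.50,
    "plastic_bag": 0.95,   # rarely worth braking for
    "unknown": 0.80,       # tuned high to suppress false positives
}

def should_brake(detected_class, confidence):
    """Brake only when detection confidence clears the class threshold."""
    return confidence >= BRAKE_THRESHOLDS.get(detected_class, 0.80)

# A real pedestrian mislabelled "unknown" at moderate confidence is
# filtered out -- the failure mode the article describes:
print(should_brake("unknown", 0.6))      # False
print(should_brake("pedestrian", 0.6))   # True
```

The design trade-off is plain: every threshold is a bet about which detections are trash bags and which are people.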
It's flare not flair, brakes not breaks, bridle not bridal, "could NOT care less" not "could care less".

ryoder

***

Quote

Based on preliminary information, the car was going approximately 40 mph in a 35 mph zone, according to Tempe Police Detective Lily Duran.




So, the car that is supposed to be safer because there is no human controlling it is programmed to speed?



I'm more inclined to believe the posted street signs, than what one cop said: https://www.google.com/maps/@33.4350531,-111.941492,3a,49.5y,347.01h,83.57t/data=!3m6!1e1!3m4!1sx-K4_17J8MVthFRapvIa2A!2e0!7i13312!8i6656

Speaking of cops not knowing traffic laws, this cop not knowing school zone speed limits was worth ten million for one of my patients.

It’s actually a pretty fascinating story.


https://www.alexanderlaw.com/blog/10-million-judgment

Preliminary NTSB findings are here (4 pages)

Interesting reading.

The car saw her 6 seconds out. At 1.3 seconds before impact it determined that emergency braking was required, but
Quote

According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior.



The car was travelling at 43 mph. According to this page (using its default deceleration rate), the car could have slowed to 17 mph, which would probably have been survivable.

Rubbish for her but they should be able to develop the software to make this scenario survivable.
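The 17 mph figure is easy to sanity-check with basic kinematics. It works out if one assumes roughly 0.9 g of constant deceleration, a plausible hard-braking figure for a modern car on dry pavement; the 0.9 g value is my assumption, not taken from the linked page. A quick sketch:

```python
MPH_TO_MS = 0.44704  # miles per hour -> metres per second
G = 9.81             # gravitational acceleration, m/s^2

def speed_at_impact(v0_mph, braking_time_s, decel_g):
    """Speed remaining after braking at a constant decel_g for braking_time_s."""
    v0 = v0_mph * MPH_TO_MS
    v = max(0.0, v0 - decel_g * G * braking_time_s)
    return v / MPH_TO_MS

# 43 mph, braking for the 1.3 s between "braking required" and impact:
print(round(speed_at_impact(43, 1.3, 0.9)))  # ~17 mph at 0.9 g
print(round(speed_at_impact(43, 1.3, 0.7)))  # ~23 mph at a gentler 0.7 g
```

Either way, the impact speed roughly halves; pedestrian fatality risk falls steeply below about 25 mph, which is what makes the disabled emergency braking so significant.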

Dazzle

Preliminary NTSB findings are here (4 pages)

Interesting reading.

The car saw her 6 seconds out. At 1.3 seconds before impact it determined that emergency braking was required, but

Quote

According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior.



The car was travelling at 43 mph. According to this page (using its default deceleration rate), the car could have slowed to 17 mph, which would probably have been survivable.

Rubbish for her but they should be able to develop the software to make this scenario survivable.



I think this shows that the car was still able, even with the shortcomings, to do as well as or better than a human driver. I don't think a person would have been able to see the woman and brake fast enough to have changed the outcome, considering the lighting at the scene and a driver's normal assumption that someone would not walk directly in front of a moving car.
"I encourage all awesome dangerous behavior." - Jeffro Fincher

DJL

***Preliminary NTSB findings are here (4 pages)

Interesting reading.

The car saw her 6 seconds out. At 1.3 seconds before impact it determined that emergency braking was required, but

Quote

According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior.



The car was travelling at 43 mph. According to this page (using its default deceleration rate), the car could have slowed to 17 mph, which would probably have been survivable.

Rubbish for her but they should be able to develop the software to make this scenario survivable.



I think this shows that the car was still able, even with the shortcomings, to do as well as or better than a human driver. I don't think a person would have been able to see the woman and brake fast enough to have changed the outcome, considering the lighting at the scene and a driver's normal assumption that someone would not walk directly in front of a moving car.

There's serious talk, they may have even done it, about banning the cars from any public roads until they can be proven safe. The idea was to build simulations, but I have to wonder if that's even possible. Can the levels of complexity in driving be created?
It still seems like the best idea is to test them with a driver ready to take over. I know, it's not foolproof, obviously, and the problem is that people will demand so much more than they're already putting up with.
I just wish they'd hurry up. I want a car that wakes me up when we get there.

Quote

There's serious talk, they may have even done it, about banning the cars from any public roads until they can be proven safe. The idea was to build simulations, but I have to wonder if that's even possible. Can the levels of complexity in driving be created?
It still seems like the best idea is to test them with a driver ready to take over. I know, it's not foolproof, obviously, and the problem is that people will demand so much more than they're already putting up with.
I just wish they'd hurry up. I want a car that wakes me up when we get there.



I have to imagine they're using the miles and miles of test track we have all over the country. They don't even need to put a person in them, just add extra fuel capacity and put about 100 of them out there only taking breaks to refuel.
"I encourage all awesome dangerous behavior." - Jeffro Fincher

DJL

Quote

There's serious talk, they may have even done it, about banning the cars from any public roads until they can be proven safe. The idea was to build simulations, but I have to wonder if that's even possible. Can the levels of complexity in driving be created?
It still seems like the best idea is to test them with a driver ready to take over. I know, it's not foolproof, obviously, and the problem is that people will demand so much more than they're already putting up with.
I just wish they'd hurry up. I want a car that wakes me up when we get there.



I have to imagine they're using the miles and miles of test track we have all over the country. They don't even need to put a person in them, just add extra fuel capacity and put about 100 of them out there only taking breaks to refuel.



That's a good way to test for endurance but I don't think that's the problem.

>There's serious talk, they may have even done it, about banning the cars from any public roads
>until they can be proven safe.

They will never be safe. The question will be - how safe are they compared to a human driver? If they are safer than your average driver then they will reduce fatalities, even if they're not perfect.

>The idea was to build simulations but I have to wonder if that's even possible. Can the levels of
>complexity in driving be created?

Recreating the complexity is easy; there are hundreds of thousands of cars on the road with cameras and radar, and tens of thousands with lidar. That means we are getting terabytes of data about the complexity of the environment. (And real situations are better than simulated for purposes of testing.)

What's hard is accurately simulating the sensor input. For example, you can take the above data and recreate any number of situations that the car has observed (woman darting into traffic with bike, kid running after a ball etc.) But what will the lidar return look like once they switch to the new awesome lidar from Velodyne? That's what will be hard to simulate.

Bob_Church

***

Quote

There's serious talk, they may have even done it, about banning the cars from any public roads until they can be proven safe. The idea was to build simulations, but I have to wonder if that's even possible. Can the levels of complexity in driving be created?
It still seems like the best idea is to test them with a driver ready to take over. I know, it's not foolproof, obviously, and the problem is that people will demand so much more than they're already putting up with.
I just wish they'd hurry up. I want a car that wakes me up when we get there.



I have to imagine they're using the miles and miles of test track we have all over the country. They don't even need to put a person in them, just add extra fuel capacity and put about 100 of them out there only taking breaks to refuel.



That's a good way to test for endurance but I don't think that's the problem.

You obviously have not watched an adequate quantity of MythBusters. I'm picturing an elaborate course of crosswalks, trash bags blowing around, obstacles rising and falling, feral pigs, and even a quantity of live drivers in crash-up derby cars.
"I encourage all awesome dangerous behavior." - Jeffro Fincher

It is an interesting problem. Can one devise a pass/fail test to certify that an autonomous car has met some safety standard, so that it is no more dangerous than some reference?

The problem is you want the test to be repeatable and representative, so that all cars are subject to the same test conditions, but you don't want those conditions to always be the same, or the "teaching to the test" problem occurs.

Of course that would be far easier if the environment could be a simulated one. As Bill pointed out, there is work to do to generate the sensor data. For camera-based systems, images of the virtual environment would need to be generated (looking just like a CGI movie). For lidar systems, point clouds would be created instead. For radar, radar data, etc.
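A toy version of the lidar case: sweep simulated beams across a virtual scene and record the range each beam returns. Real simulators ray-cast full 3-D geometry with sensor noise models; this 2-D sketch against a single flat wall is only meant to show the shape of the idea:

```python
import math

def simulate_lidar_scan(wall_x, n_beams=181, max_range=100.0):
    """Fan n_beams rays across the forward half-plane (-90 to +90 degrees)
    and return the range at which each ray hits an infinite wall at
    x = wall_x, or max_range if the ray never reaches it."""
    scan = []
    for i in range(n_beams):
        theta = math.pi * i / (n_beams - 1) - math.pi / 2
        dx = math.cos(theta)          # forward component of the beam
        if dx > 1e-9:
            scan.append(min(wall_x / dx, max_range))
        else:
            scan.append(max_range)    # beam parallel to the wall: no hit
    return scan

scan = simulate_lidar_scan(wall_x=20.0)
print(min(scan))   # 20.0 -- the straight-ahead beam hits the wall at 20 m
```

Swap in real scene geometry plus a model of the specific sensor's beam pattern and noise, and you have the hard part Bill described: the synthetic point cloud is only useful if it looks like what the new hardware would actually return.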
It's flare not flair, brakes not breaks, bridle not bridal, "could NOT care less" not "could care less".

"You obviously have not watched an adequate quantity of Myth Busters. I'm picturing an elaborate course of crosswalks, trash bags blowing around, obstacles rising and falling, feral pigs and even a quantity of live drivers in crash-up derby cars. "

I admit I don't watch Mythbusters, but I'd pay to watch this.

