airdvr

The Failure Of This Self-Driving Truck Company Tells You All You Need To Know About Self-Driving Vehicles


As the article explains, there are so many scenarios that a computer cannot possibly predict that make a self-driving car too difficult to engineer. Yeah, prototypes in ideal traffic situations on the highway may work. However, an age where we have a million self-driving cars all over the place, capable of driving any road anywhere under any condition? We're decades away from that. There are scenarios that will occur that they haven't even thought of yet. What about teenage kids who decide to try to troll one of these self-driving cars? Like, inch closer and closer to it to see if it pulls off the road to avoid a collision, or get in front of it and slow down rapidly to force it into an emergency brake situation. How about road rage? What happens when a human loses their crap with one of these cars and acts on it? How will the car react? You know crap like that will happen, and at the moment a computer can't properly react to something like that because it doesn't even know that type of scenario can even exist in the first place.

Edited by Westerly

3 hours ago, Westerly said:

As the article explains, there are so many scenarios that a computer cannot possibly predict that make a self-driving car too difficult to engineer. Yeah, prototypes in ideal traffic situations on the highway may work. However, an age where we have a million self-driving cars all over the place, capable of driving any road anywhere under any condition? We're decades away from that.

We already have cars that are safer than human drivers at driving on interstates and primary roads.  So no, it's not decades away.  Perhaps a year before we have a real level-3 system, five years until level-4.

Quote

There are scenarios that will occur that they haven't even thought of yet. What about teenage kids who decide to try to troll one of these self-driving cars? Like, inch closer and closer to it to see if it pulls off the road to avoid a collision, or get in front of it and slow down rapidly to force it into an emergency brake situation.

Yep.  And it will no doubt be possible to make one misbehave.  However, the criterion is not that the car be protected against deliberate malice; no car is and no driver is.  The criterion is merely that it reacts as well as (or better than) a human driver in the same situation.

Many people would emergency brake in that situation.  Many cars will too.

Quote

How about road rage? What happens when a human loses their crap with one of these cars and acts on it? How will the car react?

Pretty much like a human driver will react - but with a better reaction time.

Quote

You know crap like that will happen, and at the moment a computer can't properly react to something like that because it doesn't even know that type of scenario can even exist in the first place.

Of course it does.

Tesla, for example, is taking a big-data approach to the problem.  The car logs data from road events and sends back all the data - from six cameras, accelerometers, driver input, vehicle state, radar and sonar sensors - for unusual events like the scenarios you talk about.  So far Teslas have driven three billion miles with the autopilot on, and one billion miles with autopilot + navigate on.  All that data is going back and is being used both to train the autopilot and to test it on situations like the ones you describe.  And these are real events - not ones someone thought up.

How many miles have you driven in your life?  Chances are, the Tesla system has seen a lot more than you have.
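The data-collection loop described above can be sketched in miniature. This is an illustrative guess at how an event trigger might work - the threshold, field names, and disagreement rule are all invented for the sketch, not Tesla's actual design:

```python
# Illustrative sketch of a fleet-learning event trigger - not Tesla's real code.
# Idea: when the human's steering disagrees sharply with what the autonomy
# model would have done, snapshot the event and queue it for upload/training.
from dataclasses import dataclass, field

@dataclass
class EventLogger:
    threshold: float = 0.3              # steering disagreement, radians (made-up value)
    upload_queue: list = field(default_factory=list)

    def observe(self, sensors, driver_steering, model_steering):
        # Routine agreement is discarded; only unusual events are kept.
        if abs(driver_steering - model_steering) > self.threshold:
            self.upload_queue.append(
                {"sensors": sensors,
                 "driver": driver_steering,
                 "model": model_steering})

logger = EventLogger()
logger.observe({"cam": "..."}, driver_steering=0.02, model_steering=0.00)  # routine: ignored
logger.observe({"cam": "..."}, driver_steering=0.90, model_steering=0.00)  # swerve: logged
```

The point of a trigger like this is bandwidth: the fleet discards the billions of boring miles and uploads only the rare disagreements, which are exactly the cases worth training and testing on.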

 

17 minutes ago, billvon said:

We already have cars that are safer than human drivers at driving on interstates and primary roads.  So no, it's not decades away.  Perhaps a year before we have a real level-3 system, five years until level-4.

Yep.  And it will no doubt be possible to make one misbehave.  However, the criterion is not that the car be protected against deliberate malice; no car is and no driver is.  The criterion is merely that it reacts as well as (or better than) a human driver in the same situation.

Many people would emergency brake in that situation.  Many cars will too.

Pretty much like a human driver will react - but with a better reaction time.

Of course it does.

Tesla, for example, is taking a big-data approach to the problem.  The car logs data from road events and sends back all the data - from six cameras, accelerometers, driver input, vehicle state, radar and sonar sensors - for unusual events like the scenarios you talk about.  So far Teslas have driven three billion miles with the autopilot on, and one billion miles with autopilot + navigate on.  All that data is going back and is being used both to train the autopilot and to test it on situations like the ones you describe.  And these are real events - not ones someone thought up.

How many miles have you driven in your life?  Chances are, the Tesla system has seen a lot more than you have.

 

But one big difference between a person driving a car and a car driving a car is liability. All it will take is for one of these cars to accidentally kill someone, and you know it will eventually happen, and then the lawsuits are going to start flying and the government is going to ban the vehicles from public roads. The other concern is: when one of these cars does eventually kill someone, who is going to be held responsible? At least with a firearm you can provide some immunity to the manufacturer because the weapon requires a person to operate it. With an automated car, the manufacturer is fully responsible for all aspects of operation of the vehicle under all conditions because there is no one else who has any control over the actions of the vehicle other than the people who built it. I could easily see this going through the court system. While it is true that an automated car could eventually be safer than a human, if a human fucks up you can point a finger. If a robot fucks up and kills someone, that's an entirely different issue, one I could see the Supreme Court making a ruling on eventually, and I can't envision anyone successfully arguing that it's not the manufacturer's fault at that point. At the end of the day SOMEONE has to be responsible for the conduct of the machines. You can't build devices that have the ability to possibly kill other people, set them loose into the street and then say no one is responsible if they kill someone. No court is going to buy off on that.

Edited by Westerly

21 minutes ago, billvon said:


How many miles have you driven in your life?  Chances are, the Tesla system has seen a lot more than you have.

 

Yes, but the main difference is (some) humans have intelligence. The ability to determine correct output based on incomplete or insufficient input. A computer still can't do that because AI isn't actually a real thing. It doesn't exist and won't (hopefully ever). If a computer could legitimately think with the capability of a human, it probably wouldn't take more than a few seconds for it to realize it would be in its best interest if humans didn't exist - aka Skynet, for you Terminator fans.

 

The best we get with 'AI' right now is just pattern matching. It looks for patterns and attempts a best guess at what it thinks is the best output within the confines of its programming. That is not real AI and it never will be.


As a rule, those patterns make better decisions than humans' much-vaunted intuition. But intuition is only as valuable as the information the person has and acknowledges. A self-driving car currently doesn't, and probably never will, drive as well as a practiced, excellent driver with good judgment. However, 80% of people are not, in fact, practiced, excellent drivers with good judgment. An algorithm that, with enough testing, does better than the 50% of below-average drivers, and as well as the 20% who are above average but not exceptional, will deliver a better overall safety record.
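The arithmetic behind this argument is easy to check with toy numbers. The crash rates below are invented purely for illustration, not real statistics:

```python
# Toy model of the percentile argument: invented crash rates per driver tier,
# expressed as crashes per million miles (made-up numbers, for illustration only).
tiers = {
    "bottom 50%":    (0.50, 6.0),   # (population share, crash rate)
    "middle 30%":    (0.30, 3.0),
    "excellent 20%": (0.20, 1.5),
}
human_fleet_rate = sum(share * rate for share, rate in tiers.values())

# An algorithm only as good as the middle tier, replacing just the bottom 50%:
algorithm_rate = 3.0
mixed_fleet_rate = (0.50 * algorithm_rate
                    + 0.30 * 3.0
                    + 0.20 * 1.5)

print(human_fleet_rate, mixed_fleet_rate)
```

Even though the algorithm never beats the best drivers, the fleet-wide rate drops, because the worst half of drivers dominate the total.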

Individuals will get lost in this. But there are individuals who are hurt by seat belts and motorcycle helmets, too. The number helped far overshadows those. It's in our future; not sure how long a future, but since we have longer and straighter roads than much of the world, the US is actually a decent place to start testing them.

I just wish that beer distributors realized that truck drivers make up some of their demographic, and weren't quite so anxious to get rid of them. 

Wendy P.

2 hours ago, Westerly said:

But one big difference between a person driving a car and a car driving a car is liability. All it will take is for one of these cars to accidentally kill someone, and you know it will eventually happen, and then the lawsuits are going to start flying and the government is going to ban the vehicles from public roads.

Lawsuits will be settled with money like they always are. The government will be settled with money like it always is. "The Public" is not going to revolt over a few dead people: 38,800 people lost their lives in US car accidents in 2019, and no one is revolting. My feeling is that the tech will be too expensive and too inflexible for widespread adoption for quite a few more years.

2 hours ago, Westerly said:

But one big difference between a person driving a car and a car driving a car is liability. All it will take is for one of these cars to accidentally kill someone, and you know it will eventually happen, and then the lawsuits are going to start flying and the government is going to ban the vehicles from public roads.

That happens right now - and no one bans Uber, or Avis, or Fedex from the roads.

Quote

The other concern is: when one of these cars does eventually kill someone, who is going to be held responsible?

Great question - and that's why we have an NTSB.

Quote

With an automated car, the manufacturer is fully responsible for all aspects of operation of the vehicle under all conditions

Nope.  With level 4 automation (complete control, driver has to be prepared to take over at any time) the driver is always still liable.  We will be at level 4 for about five years - until powerful people who have had their licenses taken away change the law.

2 minutes ago, billvon said:

With level 4 automation (complete control, driver has to be prepared to take over at any time) the driver is always still liable. 

That can't possibly work. The driver will not maintain enough situational awareness to take over in a reasonable time. It is not how we work.

2 hours ago, Westerly said:

Yes, but the main difference is (some) humans have intelligence.

That (some) is the key point here.  Automation will not be as good as the best driver out there.  But at some point it will be better than most - and at that point deploying it will absolutely save lives and reduce wrecks.

Quote

A computer still can't do that because AI isn't actually a real thing. It doesn't exist and won't (hopefully ever). If a computer could legitimately think with the capability of a human

Being a good driver != thinking like a human.  If you are in a hurry because your girlfriend is waiting for you in bed - your humanity will make you a much worse driver.  If you are an alcoholic, and are embarrassed to tell anyone about it, your humanity will make you a much worse driver.  If you see a beer sale in a store by the side of the road, and you have a party to prepare for, your humanity will make you a much worse driver.

The goal is to be better at one task than a human, and that's a much easier thing to accomplish.

Quote

The best we get with 'AI' right now is just pattern matching. It looks for patterns and attempts a best guess at what it thinks is the best output within the confines of its programming.

If you think that - you have no idea what AI is.

1 minute ago, gowlerk said:

That can't possibly work. The driver will not maintain enough situational awareness to take over in a reasonable time. It is not how we work.

There's been a lot of work done on that, and how to reintegrate the driver into the task and/or keep him integrated.  And from testing we've done, you can keep them more integrated than you might expect - especially if you have 2-5 seconds to transfer control (which is often the case).
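The handoff described here is usually modeled as a timed escalation: request the driver, wait a budget of a few seconds, then fall back to a minimal-risk maneuver. A rough sketch - the states, tick interface, and timings are illustrative, not from any production system:

```python
# Rough sketch of a control-handoff escalation - illustrative only.
from enum import Enum, auto

class Mode(Enum):
    AUTOMATED = auto()
    TAKEOVER_REQUESTED = auto()
    DRIVER_CONTROL = auto()
    MINIMAL_RISK_STOP = auto()   # e.g. continue lane-following to a safe stop

TAKEOVER_BUDGET_S = 5.0  # assumed transfer window, per the 2-5 s figure above

def step(mode, elapsed_s, driver_hands_on):
    """Advance the handoff state machine by one tick."""
    if mode is Mode.TAKEOVER_REQUESTED:
        if driver_hands_on:
            return Mode.DRIVER_CONTROL           # driver re-engaged in time
        if elapsed_s > TAKEOVER_BUDGET_S:
            return Mode.MINIMAL_RISK_STOP        # budget blown: fail safe
    return mode

assert step(Mode.TAKEOVER_REQUESTED, 2.0, True) is Mode.DRIVER_CONTROL
assert step(Mode.TAKEOVER_REQUESTED, 6.0, False) is Mode.MINIMAL_RISK_STOP
```

The design choice that matters is the fallback branch: the car never simply gives up control into a vacuum; if the driver does not re-engage within the budget, it degrades to the safest maneuver it can still perform on its own.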

7 minutes ago, billvon said:

There's been a lot of work done on that, and how to reintegrate the driver into the task and/or keep him integrated. 

But the point of a self-driving level 4 vehicle is in opposition to that concept. I'm not sure you can get the general public to spend the money on something that only does part of the job. What is the point? What is in it for the owner?

47 minutes ago, billvon said:

That happens right now - and no one bans Uber, or Avis, or Fedex from the roads.

Great question - and that's why we have an NTSB.

Nope.  With level 4 automation (complete control, driver has to be prepared to take over at any time) the driver is always still liable.  We will be at level 4 for about five years - until powerful people who have had their licenses taken away change the law.

Hi Bill,

Re:  'and that's why we have an NTSB'

Actually, that is why we have courts.

As I have posted before, Alexis de Tocqueville, in his Democracy in America, wrote that ultimately everything in America will be decided in court.

Jerry Baumchen

PS)  The NTSB is not an adjudicating agency.

Edited by JerryBaumchen

7 hours ago, Westerly said:

If a computer could legitimately think with the capability of a human, it probably wouldn't take more than a few seconds for it to realize it would be in its best interest if humans didn't exist.

There is no evidence whatsoever that this prediction is realistic in any way; it's pure conjecture.

Then you use this unproven conjecture to conclude that "real AI" doesn't exist? I don't know - sounds like you're building a conclusion on shaky foundations.

7 hours ago, Westerly said:

The best we get with 'AI' right now is just pattern matching. It looks for patterns and attempts a best guess at what it thinks is the best output within the confines of its programming. That is not real AI and it never will be.

People who say AI is just "pattern matching" are right... if this were the 1980s.

In 2012, this field was revolutionized by deep learning - the year AlexNet's deep convolutional network won the ImageNet competition.

Edited by olofscience

8 hours ago, Westerly said:

All it will take is for one of these cars to accidentally kill someone, and you know it will eventually happen

Already happened. Four times.

8 hours ago, Westerly said:

and then the lawsuits are going to start flying and the government is going to ban the vehicles from public roads.

Yes, lawsuits started flying, yet there's still no ban. Companies have been cleared, and undisclosed settlements changed hands.

5 hours ago, gowlerk said:

That can't possibly work. The driver will not maintain enough situational awareness to take over in a reasonable time. It is not how we work.

Yes but the companies will need a scapegoat in case of an accident. Unfortunately it will be the "monitoring driver" - this is happening, as we speak, to the operator of the Uber self-driving car that killed Elaine Herzberg: https://www.bbc.co.uk/news/technology-54175359

5 hours ago, gowlerk said:

But the point of a self-driving level 4 vehicle is in opposition to that concept. I'm not sure you can get the general public to spend the money on something that only does part of the job. What is the point? What is in it for the owner?

At least for corporate owners, level 4 will give them the ability to remove the driver physically and have a "call center" type of operation where remote drivers connect to vehicles needing assistance and manually drive them through tough spots, after which the local AI then takes over, and the "call center" operator connects to another vehicle, etc.

So instead of having 1000 drivers for 1000 trucks, you only have 100 remote operators for 1000 trucks doing mostly autonomous driving, saving the owner from the salaries of 900 drivers.

3 hours ago, olofscience said:

At least for corporate owners, level 4 will give them the ability to remove the driver physically and have a "call center" type of operation where remote drivers connect to vehicles needing assistance and manually drive them through tough spots, after which the local AI then takes over, and the "call center" operator connects to another vehicle, etc.

In the meantime the truck sits there blocking traffic? Waiting for the call centre operator to get to the next one in the line up? I can certainly see the public tolerating the odd fatal accident. I cannot see the public putting up with immobile trucks in the road waiting for a human operator. Level 5 will be needed before the labour savings become available. Every ship needs a captain. The captain could be a machine, but it will need to be a machine capable of doing the job fully and independently.

1 hour ago, gowlerk said:

In the meantime the truck sits there blocking traffic? Waiting for the call centre operator to get to the next one in the line up?

It's just a possibility - the specifics will be up to the companies.

However, "call center" operations have been done for years already with US Predator drones. The remote "pilots" were based in Nevada, and the drones were in Afghanistan.

They were overworked, underpaid, and suffered the PTSD of Middle Eastern warfare even though they never left the United States. Do you think companies will treat truck drivers better than the Air Force treated its drone pilots? I really hope so, but history suggests otherwise.

Edited by olofscience

1 minute ago, olofscience said:

Do you think companies will treat truck drivers better than the Air Force did?

Will the hours of service regulations apply to them? Only two things are somewhat clear to me on the whole issue of driverless vehicles on public roads. First is that the day will come when it will be the norm; second is that day is further away than most tech company leaders think.

25 minutes ago, gowlerk said:

Will the hours of service regulations apply to them? Only two things are somewhat clear to me on the whole issue of driverless vehicles on public roads. First is that the day will come when it will be the norm; second is that day is further away than most tech company leaders think.

For #1 - I hope so. But I doubt it would be totally enforceable.

For #2 - Look at how long it took for damn near everyone to have a smart phone. I think the tech will come fairly soon. I wouldn't be surprised to see the turnpike line haul trucks without drivers before I retire (at least 10 years away). 

As far as the 'call center' approach: it worked 'sorta ok' for the drones, in part because they're planes. We've had autopilots for planes for a long, long time (over 100 years). Planes don't have to follow roads or avoid other traffic (much). So the drones could be on autopilot from the launch site until near the target. 

A bit different for trucks on the interstates. But, again, the level of AI needed to operate in that environment (limited access, similar speeds, gentle curves, etc.) isn't super high. The level of distraction humans can experience and still operate mostly OK is really high. 
Operating a big truck in an urban environment is not something that is going to happen autonomously for a long time. Maybe not in my lifetime.

12 hours ago, gowlerk said:

But the point of a self driving level 4 vehicle is in opposition to that concept.

If you're thinking about the sort of vehicle where you can go to sleep while the car takes you across the country, that's level 5 - and is farther off.  (Mainly for the legal issues mentioned.)

3 hours ago, gowlerk said:

In the meantime the truck sits there blocking traffic? 

In the meantime the truck continues lane following, waiting for an override from the safety driver.
 

Quote

Level 5 will be needed before the labour savings become available.

Level 4 will be able to handle 99% of the driving.   It then becomes a statistical exercise, similar to Erlang calculations.   Will six drivers be able to handle 100 trucks under that scenario?  Statistically you could determine that over a thousand years of operation there will only be four seconds where there aren't enough drivers to handle the load.  So you get twelve drivers to start with (still a huge savings) then pare back to six over time.
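The "statistical exercise" mentioned above is exactly what the Erlang B formula answers: given an offered load in erlangs, what fraction of intervention requests find every remote operator busy? A rough sketch - the per-truck load (30 seconds of remote help per hour) is an invented assumption:

```python
# Erlang B: probability that an intervention request finds every remote
# operator busy, given offered load in erlangs (illustrative sketch).
def erlang_b(load_erlangs, operators):
    """Iterative Erlang B recursion: B(A, 0) = 1; B(A, m) = A*B/(m + A*B)."""
    b = 1.0
    for m in range(1, operators + 1):
        b = load_erlangs * b / (m + load_erlangs * b)
    return b

# Assume 1000 trucks, each needing ~30 s of remote help per hour (made-up load).
trucks = 1000
load = trucks * 30 / 3600          # ~8.3 erlangs of offered load

for n_operators in (6, 12, 20):
    print(n_operators, round(erlang_b(load, n_operators), 4))
```

With a calculation like this you can make the staffing decision quantitative: start with a generous roster, measure the real intervention load, then pare back until the blocking probability is acceptably small.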

16 minutes ago, billvon said:

Statistically you could determine that over a thousand years of operation there will only be four seconds where there aren't enough drivers to handle the load.  So you get twelve drivers to start with (still a huge savings) then pare back to six over time.

Erlang calculations always leave the customer on hold for the longest period that the accountants think they can get away with, knowing full well that a percentage of them will just give up.


1 hour ago, billvon said:

So you get twelve drivers to start with (still a huge savings) then pare back to six over time.

Then guess who will be blamed for taking jobs? "Immigrants." "It's because all manufacturing has gone to China".

AI will cause a wave of job losses like never before. That will mean a lot of angry people - if you think Trump is bad, the problem is just beginning.

It's going to be a bumpy ride.

16 minutes ago, olofscience said:

AI will cause a wave of job losses like never before. That will mean a lot of angry people -

 

16 minutes ago, olofscience said:

Then guess who will be blamed for taking jobs? "Immigrants." "It's because all manufacturing has gone to China".

I see no reason the remote operators could not be located in say....Afghanistan. After all, drones on the hunt there are operated from North Dakota.

Edited by gowlerk

15 minutes ago, gowlerk said:

I see no reason the remote operators could not be located in say....Afghanistan. After all, drones on the hunt there are operated from North Dakota.

The main issue for controlling things like this will be latency - for drones in Afghanistan, takeoffs and landings are tricky with more than a few hundred milliseconds of delay, so takeoffs and landings are done from the local airbase, and control is then passed to operators in the USA.

However, Elon Musk is also building a satellite internet constellation that in theory *could* connect an operator in Afghanistan to the US with less than 100 milliseconds latency. Coincidence?
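The latency claim is easy to sanity-check from geometry. A back-of-the-envelope sketch, assuming ~550 km LEO altitude and a ~12,000 km ground distance (rough figures for illustration, not Starlink specifications):

```python
# Back-of-envelope one-way latency: ground fiber vs. LEO satellite relay.
C_VACUUM_KM_S = 299_792              # speed of light in vacuum
C_FIBER_KM_S = C_VACUUM_KM_S / 1.47  # light is ~1.47x slower in glass fiber

ground_km = 12_000                   # rough Afghanistan-to-US great-circle distance
leo_altitude_km = 550                # rough LEO constellation altitude

# Fiber path: at least the ground distance (real routes are usually longer).
fiber_ms = ground_km / C_FIBER_KM_S * 1000

# Satellite path: up, across via inter-satellite links, and back down,
# all at vacuum light speed.
sat_path_km = 2 * leo_altitude_km + ground_km
sat_ms = sat_path_km / C_VACUUM_KM_S * 1000

print(round(fiber_ms), round(sat_ms))
```

Under these assumptions the vacuum relay beats fiber on long routes, and both come in under the 100 ms one-way figure mentioned above - though real-world routing, queuing, and processing delays would add to either path.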

Edited by olofscience

