
When a driverless car can't handle the situation

- Single-lane roads where there is often communication between drivers, with hand gestures or flashing lights to signal who goes first
Imagine when autonomous cars can communicate: then one will simply wait at the lay-by for the other to pass.

I'm wondering when the first hit-and-run will happen with a driverless car or truck.
If the car is aware it hit someone, then it won't run; it will simply inform the authorities, owner, and manufacturer, and then shut down.

Accidents will happen; autonomous does not mean perfect, only better than the alternative.
 
Doesn't everyone think the Tesla crew will seek out all these little situations when driving and continually improve the system?!
What about an ambulance or police car driving behind you? You know it's there, but the system doesn't understand that it should pull over.
Various situations in various settings; pretty amazing we have the AP we have now after just a few years of work!
 
Doesn't everyone think the Tesla crew will seek out all these little situations when driving and continually improve the system?!
What about an ambulance or police car driving behind you? You know it's there, but the system doesn't understand that it should pull over.
Various situations in various settings; pretty amazing we have the AP we have now after just a few years of work!

Absolutely not... Unlike other manufacturers, Tesla doesn't have to search out anything. Instead these situations are crowdsourced. Tesla has been collecting information about these situations since AP1 was released.

The system will understand emergency vehicles, I am sure of it.
 
A few questions you can ask yourself are:

  • "Has any human Tesla driver ever driven a Tesla equipped with autopilot in any one of these situations?"
  • "How many additional Tesla drivers with the new AP2 hardware will also encounter these same situations before public rollout?"
  • "Did these humans all crash and die or did they provide valuable training information in how future neural network should behave in these situations?
Not that simple. The AI simply can't learn when context is not determined and cause-effect relationships cannot be associated. That level of sophistication is way beyond where we'll be for many years to come! That's why "strategic" aspects of driving were left out of the autonomous driving standards criteria and the focus was placed on the much more limited "tactical" aspects, which are within the scope of the AI's adaptive ability.
 
If you are interested in the state of the research involved with AI and autonomous driving, watch these videos. It is an hour well spent. These are split into 9 videos of the NVIDIA presentation at the Consumer Electronics Show in Las Vegas early this year. Many believe that this is the direction Tesla will go with their systems. Gets pretty techy at times but even this feeble mind was able to digest most of it! ;)

Many of the concerns stated earlier in this thread are discussed in this presentation.


Dan
 
The AI simply can't learn when context is not determined and cause-effect relationships cannot be associated.
That's just it: context is the current sensor data; it's always determined unless the sensors are not working properly.

I cannot personally think of anything on the road that I reacted to without a reason... The cause-and-effect relationship was determined through my sensors (eyes) and I reacted. If hardware is monitoring my actions, then it saw what I saw (using cameras, radar, possibly ultrasonics) and will associate that data with the reaction that I have provided to the system. I'm basically labeling a data point for it to learn from.
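Roughly speaking, that is the behavioral-cloning setup: logged sensor frames are the inputs and the human driver's recorded control actions are the labels. Here is a minimal sketch of the idea; the feature sizes, the tiny network, and the training loop are all made up for illustration and are not Tesla's actual pipeline:

```python
# Minimal behavioral-cloning sketch (illustrative only): learn to map logged
# sensor frames to the control actions the human driver actually took.
import torch
import torch.nn as nn

# Pretend each logged frame is a flattened feature vector fused from
# camera / radar / ultrasonics (512 synthetic features here).
sensor_frames = torch.randn(1000, 512)      # what the car "saw"
driver_actions = torch.randn(1000, 2)       # what the human did: [steering, throttle/brake]

# Small network that imitates the driver.
policy = nn.Sequential(
    nn.Linear(512, 128),
    nn.ReLU(),
    nn.Linear(128, 2),
)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(10):
    optimizer.zero_grad()
    predicted = policy(sensor_frames)
    # The human's reaction is the label the network learns to reproduce.
    loss = loss_fn(predicted, driver_actions)
    loss.backward()
    optimizer.step()
```

In that framing, every manual correction or disengagement is just another labeled example for the next round of training.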

There are a great many situations never even encountered by a human where the Google car did a fine job.

This is an excellent talk from Chris Urmson, if you haven't seen it already:
some interesting interactions start at 24:53
 
If you are interested in the state of the research involved with AI and autonomous driving, watch these videos. It is an hour well spent. These are split into 9 videos of the NVIDIA presentation at the Consumer Electronics Show in Las Vegas early this year. Many believe that this is the direction Tesla will go with their systems. Gets pretty techy at times but even this feeble mind was able to digest most of it! ;)

Many of the concerns stated earlier in this thread are discussed in this presentation.


Dan
Those videos were awesome and give great insight into where Tesla is headed.
 
My concern is that, when AP2 is fully baked and rolled out, there will be confusion in the used car market at some point. People will buy a Tesla and assume it's fully self-driving, but it might be an AP1 car. There might eventually need to be a more urgent warning than just "keep your hands on the wheel".
 
My concern is that, when AP2 is fully baked and rolled out, there will be confusion in the used car market at some point. People will buy a Tesla and assume it's fully self-driving, but it might be an AP1 car. There might eventually need to be a more urgent warning than just "keep your hands on the wheel".
They could always ask... that's like buying a car and assuming it's a Tesla because it has four wheels. (ok that's a bit extreme)

They can clearly see the AP2 hardware (so they know if it's one of the newer vehicles). If the seller says it has AP2 enabled, then it does, and if not, then it doesn't. If the seller lies, then that's fraud.
 
I think people will do their research on any car purchase, just like they do now. I mean, you don't buy a used Hyundai without knowing whether it is a 2015 model or a 2013 model and the features unique to that model year... do you? I wouldn't think so, at least.

Dan
One would think... but do you remember all those threads with people complaining that they had just bought AP1 cars when AP2 was announced, even though we all knew for months that AP2 was coming before the end of the year? Not everyone does enough research, haha :)
 
One would think... but do you remember all those threads with people complaining that they had just bought AP1 cars when AP2 was announced, even though we all knew for months that AP2 was coming before the end of the year? Not everyone does enough research, haha :)
At some point we have to man up and take some responsibility for our own actions, I guess (heaven forbid!)

There is certainly no intent to defraud. All the information is readily out there for one and all.

Dan
 
That's just it: context is the current sensor data; it's always determined unless the sensors are not working properly.
The point is that some situations require actual reasoning to make sense of complex situations rather than trivial tactical reactions such as those. Avoiding hitting something is brain-dead AI, and can be taken for granted. Interpreting gestures, seeing atypical pedestrian traffic or street construction several blocks ahead, and planning a detour is strategic and requires context identification. The context, of course, is NOT simply the "sensor data" - it is interpreted sensor data, which, of course, isn't automatically done, regardless of millions of miles of "deep learning". These are much higher-level strategic actions. There is no cause and effect being established, because the AI was never programmed to understand that level of driving ability.
 
The point is that some situations require actual reasoning to make sense of complex situations rather than trivial tactical reactions such as those. Avoiding hitting something is brain-dead AI, and can be taken for granted. Interpreting gestures, seeing atypical pedestrian traffic or street construction several blocks ahead, and planning a detour is strategic and requires context identification. The context, of course, is NOT simply the "sensor data" - it is interpreted sensor data, which, of course, isn't automatically done, regardless of millions of miles of "deep learning". These are much higher-level strategic actions. There is no cause and effect being established, because the AI was never programmed to understand that level of driving ability.
I would very respectfully disagree with you here. Based on the NVIDIA videos I have seen, the situations you describe are exactly what they are now learning to interpret through image recognition and neural networking. It is no different from what we do now as humans. We interpret visual signals like hand gestures, vehicle lighting (to recognize emergency vehicles), uniforms on officers, etc., and compare that to the current driving environment to determine the appropriate action to take. A computer will learn to recognize all of this and will be able to respond appropriately, more accurately and faster than we as humans would. Remember, the most recent NVIDIA videos out there on YouTube are from January 2016... 11 months ago. How far have they come since then?
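Just to illustrate the kind of image recognition being described, here is a toy sketch of a classifier over visual cues; the class list, network, and sizes are invented for the example and are not anything NVIDIA or Tesla actually ships:

```python
# Toy image classifier (illustrative only): map a camera frame to a visual cue
# such as emergency-vehicle lighting or an officer's hand signal.
import torch
import torch.nn as nn

CLASSES = ["normal_traffic", "emergency_vehicle", "officer_hand_signal"]  # invented labels

classifier = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),   # RGB camera frame in
    nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(32, len(CLASSES)),                             # one score per cue
)

frame = torch.randn(1, 3, 224, 224)                          # one synthetic camera frame
scores = classifier(frame)
print(CLASSES[scores.argmax(dim=1).item()])                  # the cue the planner would react to
```

The real systems are obviously much larger and feed a planner rather than a print statement, but the principle of mapping pixels to a recognized cue is the same.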

Will it be perfect in every imaginable situation, especially at first? Of course not. Remember, however, they do not have to demonstrate perfection, just the ability to be significantly safer and more accurate than a human would be under the same circumstances. I have no doubt that this will happen, and a lot sooner than we all think.

As always, just my 2 cents.

Dan
 
"NVIDIA"

Interpreting gestures, seeing atypical pedestrian traffic or street construction several blocks ahead, and planning a detour is strategic and requires context identification. The context, of course, is NOT simply the "sensor data" - it is interpreted sensor data, which, of course, isn't automatically done, regardless of millions of miles of "deep learning".

The Google car can interpret gestures of a bicyclist. A connected vehicle even today can plan as strategically as routes on Google Maps without even having to be in visual distance. It can avoid both routes with accidents and known construction zones if it really wanted to. Google Maps is great at detours. Nvidia and Google have demonstrated cars navigating cones. Reading a sign saying "slow" or "stop" is trivial, and the hand gestures typically used are pretty standard.
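To make the detour point concrete, here is a toy sketch of strategic rerouting: ordinary shortest-path search over a road graph where edges flagged with construction (or an accident) carry a heavy penalty, so the planner routes around the block long before any camera could see the cones. The graph, names, and penalty value are invented for the example:

```python
# Toy strategic rerouting (illustrative only): Dijkstra over a road graph where
# edges flagged as construction zones are heavily penalized, producing a detour.
import heapq

# node -> list of (neighbor, minutes, under_construction)
road_graph = {
    "A": [("B", 2, False), ("C", 2, False)],
    "B": [("D", 2, True)],          # flagged construction on B->D
    "C": [("D", 3, False)],
    "D": [],
}
CONSTRUCTION_PENALTY = 30           # treat a flagged block as very expensive

def best_route(start, goal):
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, minutes, construction in road_graph[node]:
            extra = CONSTRUCTION_PENALTY if construction else 0
            heapq.heappush(queue, (cost + minutes + extra, nxt, path + [nxt]))
    return None

print(best_route("A", "D"))         # -> (5, ['A', 'C', 'D']), skipping the construction edge
```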



Even atypical pedestrian and car traffic won't matter... see the Chris Urmson SXSW video where they show atypical pedestrian behavior with the Google car, and it handled it just fine. He also demonstrated atypical vehicle traffic with multiple cars going the wrong way down a one-way street, perpendicular to the car.

It doesn't have to be terribly advanced to be Level 5. You might be giving humans too much credit :)
 
NVIDIA from Jan 2017


Skip to 7:58

Nvidia's own BB8 autopilot demo video at 5:26

It's interesting to compare the small board shown being used versus the Drive PX2 in the videos from CES 2016. They didn't state whether they had the same computational power. What the PX2 can do is nothing short of amazing. And the 7th(?) short video of 10, where they fused all the data while highway driving and colored all the other vehicles, convinced me that controlled highway driving is solved. That was a year ago, though they do use lidars.

In a couple of years, every automaker will either have duplicated this on their own or will be buying the tech from someone else.

It still is amazing that no one has been able to duplicate AP1 in a production car. Tesla has a big lead.

RT