Government regulations of L2 driver assist systems

In which case it's the driver who is at fault. I think the Tesla manual, and the warning screens you have to click through when enabling AP, make it very clear that AP is Level 2, and the nag system reminds you of this every time you use AP. Yet this guy played a game on his phone. How anyone can say the driver isn't to blame here is beyond me.

Anyway, beating a dead horse and all that.
Human error is responsible for over 90% of accidents. Saying that we shouldn't take action to prevent those accidents doesn't make sense to me.
 

And I'm not saying that at all. We need to be careful to distinguish between providing assist mechanisms to try to stop/discourage bad drivers doing bad things, and replacement mechanisms that take over (legal) responsibility for preventing these bad things. I'm all for the former, but the latter carries with it a huge risk of liability for the car maker. And, indirectly, us all, since we all pay one way or another for that.

Some here have argued that a nag system (such as tugging the wheel) implies that the car is taking full responsibility for ensuring that the driver is paying attention at all times. This would imply that any accident caused by driver inattention is the fault of the car, not the driver. In the US at least there is already a culture of shifting blame ("The car SHOULD have stopped me from playing a game ... it's not my fault!"), and this view tends to feed into this culture, with all that it implies.

So, if you view a nag system as "Reminder: YOU, the driver, should be paying attention, because you are still responsible for the safety of the car", then that's fine. But if you view the nag system as "I'm here to make sure you are watching the road, don't worry, I've got you covered" then that's another matter entirely. And not good for any of us, since it encourages recklessness and accidents.
 
Obviously we shouldn't put systems in cars that make them less safe (by less safe I mean real-world safety, not theoretical "perfect use" safety). I've never heard the theory that driver attentiveness monitoring could make people take more risks. The good news is it's a very easy theory to test!
I wonder why Tesla has increased the frequency of nags over time? Are they trying to cause more accidents?
 

Actually, that was not my main point at all; it was to highlight the difference between a system that reminds you that you are responsible vs. one that takes responsibility. One shifts liability, the other does not.

However, I can't tell you the number of times, back when ABS was new, I heard people saying "ABS lets you drive faster on wet and snowy roads". (facepalms)
 
I'm just worried about whether or not driver attention monitoring improves safety. I have confidence that any liability issues will be sorted out as they were for all the other mandated safety features in cars.
ABS is a perfect example of a potential problem with L2 driver assistance features: risk compensation. People can use them as an excuse to not pay attention to the road.
 

Agreed. My fear is that people think they ONLY have to pay attention when nagged, and if they crash between nags the excuse will be "The car should have warned me!". We've had two fatal Tesla crashes where, in both cases, the drivers were ignoring road conditions (one watching a DVD, the other playing a game). Presumably they thought "the car will remind me to look up from time to time".

Several people have argued we need better nag systems. I can't argue against any safety system improvement, but no system, however good, will overcome human stupidity at its finest. What worries me are scenarios where people legislate systems that attempt to allow for stupidity but also mess up the driving experience and pleasure for everyone else.
 
There have been 5 known driver fatalities while using Autopilot. There were also recently two fatalities where someone driving a Tesla ran a red light at the end of a freeway (the NHTSA is investigating whether or not the driver was using Autopilot).
I think it's more likely the drivers were thinking "My car has driven itself a hundred times on this section of road with no problem. I'll just rest my hand on the wheel and look at my phone."
I think if we can make computer vision almost good enough to drive a car we can make computer vision good enough to make sure the driver isn't looking at their phone.
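Not to say any automaker does it this way, but here is a minimal sketch of what camera-based attention monitoring could look like (the `detect_head_pose` helper, the `camera` frame source and the `alert` callback are hypothetical placeholders, and the thresholds are invented):

```python
# Minimal sketch of cabin-camera attention monitoring -- illustrative only.
import time

LOOK_DOWN_PITCH_DEG = 25.0   # head pitched down this far suggests a phone in the lap (made-up value)
MAX_EYES_OFF_ROAD_S = 2.0    # how long a look-away is tolerated before a warning (made-up value)

def monitor(camera, alert):
    eyes_off_since = None
    for frame in camera:                     # assumed: an iterator of cabin-camera frames
        pitch = detect_head_pose(frame)      # hypothetical: face-landmark model returning head pitch, or None
        looking_away = pitch is None or pitch > LOOK_DOWN_PITCH_DEG
        if looking_away:
            eyes_off_since = eyes_off_since or time.monotonic()
            if time.monotonic() - eyes_off_since > MAX_EYES_OFF_ROAD_S:
                alert("Pay attention to the road")   # a real system would escalate further
        else:
            eyes_off_since = None
```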
 
I think if we can make computer vision almost good enough to drive a car we can make computer vision good enough to make sure the driver isn't looking at their phone.
Or you can focus your energy on making computer vision simply "good enough to drive a car" without wasting effort on other vision that becomes obsolete when you reach that milestone anyway?

@diplomat33 :p
 
There's a lot more to driving a car than computer vision. Also, to drive a car you need computer vision good enough to recognize what pedestrians are doing, so there would be minimal wasted effort.
 
Or you can focus your energy on making computer vision simply "good enough to drive a car" without wasting effort on other vision that becomes obsolete when you reach that milestone anyway?

@diplomat33 :p

Lidar is not wasted effort. It is a vital tool needed to do safe autonomous driving. What would be a wasted effort would be correcting everything wrong in your post. :p
 
Lidar is not wasted effort. It is a vital tool needed to do safe autonomous driving.
Here you go again with your false statements. I would like to ask you to stop spreading misinformation.
There's a lot more to driving a car than computer vision.
No, for FSD there are only two tasks:
  1. Vision - KNOW your surroundings.
    • this has to include pedestrians, pets and ALL other typical items you would see from a car
    • this also has to cover knowing the intent of each detected object (with probabilities) -- the closest Tesla example would be cut-in detection
  2. Action - drive through the environment you perceive in Step 1
Step 1 never stops while the car is on.
Step 2 stops when you're at your desired destination. (A rough sketch of this loop follows.)
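To put the two steps in one picture, here is a toy sketch of that loop (`perceive`, `plan_and_act` and `arrived` are placeholders, not anyone's actual stack):

```python
# Toy sketch of the two-task loop described above -- not a real FSD implementation.
def drive_to(destination, sensors, controls):
    while not arrived(destination):                       # Step 2 ends at the destination
        world = perceive(sensors)                         # Step 1: know your surroundings,
                                                          # including each object's likely intent
        controls.apply(plan_and_act(world, destination))  # Step 2: drive through that environment
```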
 
Here you go again with your false statements. I would like to ask you to stop spreading misinformation.

That is NOT misinformation!! That is accepted truth by the entire autonomous industry!!!

You are the only one spreading misinformation! STOP IT!

No, for FSD there are only two tasks:
  1. Vision - KNOW your surroundings.
    • this has to include pedestrians, pets and ALL other typical items you would see from a car
    • this also has to cover knowing the intent of each detected object (with probabilities) -- the closest Tesla example would be cut-in detection
  2. Action - drive through the environment you perceive in Step 1

Step 1 is super hard with just cameras. Mobileye is far ahead of Tesla here, and even their vision is still not safe enough for L5. Tesla is far from even finishing this first step. And no, it is not as easy as just feeding a billion miles into the machine. If it were that easy, Tesla would have it done already.

Ask yourself: why do Tesla cars hit stopped trucks in the middle of the highway, or miss a lane and run straight into crash attenuators, or smash straight into a semi truck crossing in plain sight, or miss a simple exit, or phantom brake for a shadow?

Answer: because computer vision is super hard. There are a ton of complex things that your camera vision has to see, recognize, and measure distance to, all with high accuracy. If you miss just one thing, you won't have safe autonomous driving.


With lidar, you can solve step 1 a lot easier. In fact, Waymo, Cruise and others have already solved step 1 with lidar.

Step 2 is actually the hardest. It is not as easy as "just drive". Drive how? When do you cut in? When do you yield? When is it safe to make an unprotected turn in busy traffic? When do you change lanes? There are a ton of complex driving scenarios that you need to solve in step 2. You need to plan your driving and be able to anticipate complex situations. You also need a driving policy that implements defensive driving rules, knowing when to yield to a car that might try to cut in. There is a lot of software that goes into step 2.
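To illustrate just one of those Step 2 decisions, here is a deliberately crude gap-acceptance rule for an unprotected turn (the thresholds and the helper are invented for illustration, not taken from any real driving policy):

```python
# Crude gap-acceptance check for an unprotected turn -- illustrative numbers only.
TURN_TIME_S = 5.0      # assumed time our car needs to clear the junction
SAFETY_MARGIN_S = 2.0  # extra buffer for a defensive driving policy

def safe_to_turn(oncoming_cars):
    """oncoming_cars: list of (distance_m, speed_mps) pairs from perception (Step 1)."""
    for distance_m, speed_mps in oncoming_cars:
        time_to_arrival = distance_m / max(speed_mps, 0.1)   # avoid divide-by-zero for stopped cars
        if time_to_arrival < TURN_TIME_S + SAFETY_MARGIN_S:
            return False   # gap too small: yield and wait
    return True            # every oncoming car is far enough away in time
```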

You are drastically oversimplifying autonomous driving.

Here is just one common city driving scenario that cars with lidar can handle and Tesla can't handle yet:


Under your theory, you would need computer vision to track everything with high accuracy. If your computer vision does not measure distance with centimeter accuracy, your car will crash into another car. Not to mention, you need excellent driving rules for how to navigate that intersection. And this is not even the worst driving scenario.
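On the distance point specifically, the appeal of lidar is that range falls straight out of time-of-flight rather than out of a learned depth estimate; here is a back-of-the-envelope illustration (the numbers are only examples):

```python
# Lidar range from time-of-flight: distance = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light in m/s

def lidar_range_m(round_trip_time_s):
    return C * round_trip_time_s / 2.0

# A return arriving ~200 nanoseconds after the pulse means a target roughly 30 m away.
print(lidar_range_m(200e-9))  # ~29.98
```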
 
That is NOT misinformation!! That is accepted truth by the entire autonomous industry!!!
The very first line of your reply is false.
Except for all those that have clearly stated that LIDAR is not needed.
It would be accurate to say: That is accepted truth by those in the autonomous industry that use LIDAR!

This is the same mentality that kept so many corporations hostage to IBM mainframes for decades: "we're a Big Blue shop, so if it's not supported by them we don't understand it."


In fact, Waymo, Cruise and others have already solved step 1 with lidar.
The problem is solved in the same way that access to space was solved prior to SpaceX coming on the scene.

First, we have yet to see a Waymo car outside its neat little cage.
We have not seen a Cruise car in the wild either (please don't post their demo videos; those are pretty and shiny).
Second, their solutions are not accessible to a meaningful segment of the global population. That means they will be fed to us from the top down.

No thanks!
 
The very first line of your reply is false.
Except for all those that have clearly stated that LIDAR is not needed.
It would be accurate to say: That is accepted truth by those in the autonomous industry that use LIDAR!

Yes, because every leader in autonomous driving uses lidar.

Ask yourself a simple question: if lidar is so foolish and such a waste of time, how come Waymo, Cruise and dozens of other companies that use lidar actually have real autonomous driving now, while Tesla, which does not use lidar, still has no autonomous driving at all?
 
I think if we can make computer vision almost good enough to drive a car we can make computer vision good enough to make sure the driver isn't looking at their phone.

Actually, reading someone's "attention level" is far harder than driving a car, which is a mechanical process compared to interpreting the subtle cues we call "attention". It's far more than just where the eyes are pointing. You can appear to be looking directly ahead out of the windscreen but be off in some dream world, oblivious to your surroundings. That's why (today) most safety systems are still active: make the human do something that says "yes, I am paying attention, and this proves it".
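A crude sketch of what such an "active" check amounts to (the torque reader, the callbacks and the thresholds are made up for illustration; this is not Tesla's actual nag logic):

```python
# Sketch of an "active" attention check: require a physical action (a small steering-wheel
# torque) within a time window instead of trying to infer attention passively.
import time

NAG_INTERVAL_S = 30.0      # assumed time allowed between hands-on confirmations
TORQUE_THRESHOLD_NM = 0.5  # assumed minimum wheel torque that counts as "hands on"

def hands_on_watchdog(read_wheel_torque, warn, disengage):
    last_confirmed = time.monotonic()
    while True:
        if abs(read_wheel_torque()) > TORQUE_THRESHOLD_NM:
            last_confirmed = time.monotonic()    # driver just proved they are there
        elif time.monotonic() - last_confirmed > 2 * NAG_INTERVAL_S:
            disengage()                          # escalate: hand back control / slow the car
            return
        elif time.monotonic() - last_confirmed > NAG_INTERVAL_S:
            warn()                               # visual/audible nag first
        time.sleep(0.1)
```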
 
Yep. Just like autonomous vehicles, attention monitoring doesn't need to be perfect. Determining whether or not the driver is playing a game on their phone would be a good place to start.
 
Yes they do.

This is not real autonomous driving....

Have you noticed that GM does not emphasize what sensors they use? I could only find this from 2018. They talk about GPS and high-def maps.

Do Cars With Super Cruise Have Lidar Scanners?

No. Super Cruise relies on GM’s lidar-mapped database that’s aboard the car, but it doesn’t add physical lidar sensors to the car, Kelly said.

Are there any soon-to-be production cars with lidar? Audi?