Welcome to Tesla Motors Club

U.S. opens formal safety probe for Autopilot - 2021 Aug 16

All I could find is

Move Over Laws became a thing in the late '90s. The reason was that drivers were striking law-enforcement officers and first responders and their vehicles when parked on the shoulder or in lanes of traffic, long before Tesla or Autopilot was a twinkle in anyone's eye.


The laws make sense and are fine. But there are still a large number of emergency vehicles, especially police cars, that park in a way that makes it hard to tell from a distance that they are partially blocking a travel lane. They need more education on that and should use flares and cones.
 
Sorry, but Tesla sells FSD to customers when you place an order, and later customers find FSD doesn't work. I proved it in front of a Tesla tech. 1. Navigate on Autopilot: not working. 2. Auto Lane Change: not working. 3. Traffic Light and Stop Sign Control: working about 50% of the time. 4. Summon: didn't work 2 times out of 3. 5. Autopark: honestly didn't check. 6. Full Self-Driving computer: installed. Of course they can fool people many times, but sooner or later they will pay the price for it. Conclusion: don't sell a product that doesn't work or is still in the development stage.

The key word here is Capability. And that's what's being sold, if you want the option available to you in the future without buying another product. Just like TVs were sold with capabilities built in that weren't yet usable (i.e., cable-ready). People who need a TV or car now but want it to be more future-proof will spend extra money to buy something that will be more useful as technology becomes available. I know I bought my Model 3 that way.

Also worth pointing out: no one is forced to buy it early. People do so in part thinking it will be more expensive later, so they'll ultimately save money, and some are willing to buy early partly to help "fund" the development.

Sounds to me like this is more a case of buyer beware: know exactly what you are purchasing and understand that going in. Everyone knows the industry has been working on these advanced driver-assistance systems, but no one has reached FSD yet.
 
AEB is not designed to prevent accidents. When an accident is unavoidable, it brakes to reduce impact. That is all. AEB has nothing to do with this discussion. The discussion is about TACC/AP.

Yes, and if people read their manual they'd know this about AEB's capability. People assume it has complete braking ability and won't hit anything, but as pointed out, they are wrong and don't understand speed, stopping distance, and momentum.
 
I will claim defeating is irrelevant in other brands, at least the ones I have owned. The I-PACE, e-tron, and ID.4 will all disengage their AP equivalent without warning sounds if the system feels like it. This only scares you once; after that you definitely keep your eyes on the road and hands on the wheel. Any sane person will never think of trying to defeat those systems, because any lost road marking may, though not always, disengage the system.
So a system that disengages with no warning sounds is safer? Don’t understand that logic.

BTW, I liked the comments on the forum about emergency vehicles being required to set up safety cones. Tesla, and I assume other systems, can see and recognize cones now, so maybe that should be the NHTSA's recommendation when they come to an accident scene.
 
The logic is easy: Tesla has a user interface suitable for level 3 autonomy but is only L2 capable. The competition is level 2 capable with a level 2 UI:
-A torque sensor, not a capacitive hand sensor. Hold too firmly and AP disengages.
-Correcting the steering will disengage it; Autopilot is binary, on/off only.
-An on/off sound indicates the car has control or you have control, not both at the same time.
-AP will try to steer until the bitter end; other systems disengage much earlier.
-When you learn the system does disengage at random, you learn to keep your attention up: no complacency builds.
 
In some countries every car is required to carry a high-visibility safety triangle and put it out if the car has to stop. This doesn't generally happen in the US, however (I've never seen it in California).
 
Back to the comment that other cars' systems that disengage with no warning sounds are somehow safer: presumably you learn the first time out that the system can leave you in a bad situation, so you pay closer attention from then on.

People's minds will wander while driving even if their eyes are being tracked for attention and looking straight ahead. People miss exits while driving normally because of this, as they go "on autopilot of their own." With no audible alerts and nags, people won't realize the system has shut down on them and that they are suddenly responsible for the car's path and actions and in total control.
 
Of course there are.


The issue is this: Teslas make up about 0.3% of all cars in the USA (1M out of 300M). They are not driven on AP that much, maybe 30% of miles. So 0.1% of all miles in the USA are driven on AP. Yet there have been 12 incidents of vehicles on AP hitting first-responder vehicles. This means you'd expect 12,000 impacts out of the general population if the rate were identical on or off AP. I don't see any numbers saying thousands of first-responder vehicles are struck per year, so it does appear at first glance that vehicles on AP hit first-responder vehicles at a higher rate, which does warrant deeper investigation. I mean, AP is supposed to be *safer* than a human alone, right? So we should actually be seeing 20,000+ impacts out of the general population if AP is helping at all.

Maybe with some investigation, we will find AP is fine. But it's also not just clearly some sort of witch hunt.
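The back-of-envelope math in that post can be checked in a few lines. All inputs are the poster's rough assumptions (fleet size, 30% AP share, 12 incidents), not verified statistics:

```python
# Back-of-envelope check of the expected-impact arithmetic above.
# All inputs are the poster's rough assumptions, not verified statistics.
tesla_fleet_share = 1e6 / 300e6       # ~0.33% of US cars are Teslas
ap_mile_share = 0.30                  # assume ~30% of Tesla miles are on AP
ap_share_of_all_miles = tesla_fleet_share * ap_mile_share

ap_incidents = 12                     # first-responder strikes on AP, per the probe
# If non-AP miles had the same strike rate per mile, we'd expect:
expected_non_ap = ap_incidents * (1 - ap_share_of_all_miles) / ap_share_of_all_miles

print(round(ap_share_of_all_miles, 4))  # 0.001
print(round(expected_non_ap))           # 11988, i.e. roughly 12,000
```

Note the estimate is quite sensitive to the assumed 30% AP share: halve that and the expected non-AP count roughly doubles.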
The miles Tesla has logged so far prove that AP+H>H (AP PLUS a generally attentive human driver H is safer than a human driver alone). That’s an important distinction from AP>H (“AP is supposed to be *safer* than a human alone”) and one that is often overlooked. Humans still intervene a lot. Apparently we are pretty good at recognizing lane encroachments, with or without flashing lights.
 
With VW and Audi you never let the system drive. You drive; the system is just there in the background. Sometimes it turns, sometimes you turn. It assists: Travel Assist, not LKA. You can also let it drive by itself, but you ought not to.

I know it is very different from AP. One really needs to try it to appreciate it. The learning threshold coming from a Tesla is high.
 
Again, I will do my harping on V2X tech.

Imagine if all 'important, do not hit me, please!' vehicles were sending out beacons, or master nodes that 'know' about certain hazards sent the beacons out for them. Any car nearby, or even just approaching, could know of the presence of things like this.

All very doable.

IF we would just decide we want it.

(Sigh. I hate there being tech that people refuse to use. It's like..... no.... I won't say it. Nope, not here.)
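For illustration, the beacon idea could look something like this minimal sketch. Everything here (the field names, the JSON format, the helper functions) is invented for the example; real V2X messages follow standards such as the SAE J2735 Basic Safety Message over DSRC or C-V2X, not this toy format:

```python
import json
import time

# Toy sketch of a hazard beacon such a system might broadcast.
# Field names are invented for illustration; real V2X messages follow
# standards like the SAE J2735 Basic Safety Message.
def make_hazard_beacon(vehicle_type, lat, lon, lane_blocked):
    return json.dumps({
        "msg": "HAZARD_BEACON",
        "type": vehicle_type,        # e.g. "police", "fire", "ems"
        "lat": lat,
        "lon": lon,
        "lane_blocked": lane_blocked,
        "ts": time.time(),           # so receivers can ignore stale beacons
    })

def is_relevant(beacon_json, max_age_s=10.0):
    # A receiving car would also check distance and heading; omitted here.
    b = json.loads(beacon_json)
    return b["msg"] == "HAZARD_BEACON" and (time.time() - b["ts"]) < max_age_s

beacon = make_hazard_beacon("police", 37.77, -122.42, lane_blocked=True)
print(is_relevant(beacon))  # True
```

An approaching car receiving such a message a few hundred meters out could slow down or alert the driver long before its cameras ever see the flashing lights.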
Tech is not the problem. It's called 'people'.
You can lead horses to water....
 
Sorry, 'miles logged' isn't enough.
The actuarial tables insurance companies use to establish risk comprise 100 years of experience. There's no equivalent for FSD, AP, or whatever.
The legislation covering responsibilities and liabilities needs updates to cover 'automated driving', i.e., who's at fault?
Is the use of AP, FSD, etc. absolution from moral responsibility? People get maimed and killed. It's not enough to say 'the driver should've been paying attention'.

FSD is a noble goal, but given the consequences (and the exaggerated behaviors of people bragging on their new toy), a higher standard than 'better than human' is required.
Technology is not a religion. It is a tool. Wisdom is required, and that can be difficult to find.
 
I thought NHTSA uniquely hated Tesla and was paid for by big legacy manufacturers to harass them over what is clearly an L2 system where only the idiot drivers can ever be at fault?
For so long as the media and Elon tout AP and FSD as a 'human safety' feature, with fanboys dumping 'hold my beer' videos on YouTube, the legacy manufacturers, through their captured NHTSA, will seek rents, otherwise known as fees, fines, and penalties; in short: MONEY. That's all any of the NHTSA activities are about.
 
That's all any of the NHTSA activities are about.
That is one hot take for the organization that manages the FMVSS for the whole USA and has mandated all sorts of safety technologies and recalls over the years that have cost manufacturers billions. NHTSA exists purely so the "legacy" manufacturers can block newcomers?

Can you show where the NHTSA has ever fined Tesla?

Ironically, NHTSA DOES fine manufacturers that fail to meet emissions / MPG standards...

with fanboys dumping 'hold my beer' videos on YouTube
The irony being that at least here at TMC, it's the fanboys saying "YOU MUST PAY ATTENTION ALL THE TIME" and giving FSD a pass when it does something wrong because it's "clearly an L2 system".
 
To be fair, that's why Elon is targeting "10x better than a human driver alone," which should translate to fewer people being maimed or killed. Even the 10x target may not be enough, because the comparison is fraught with logical flaws:
1. It compares death and injury statistics for Teslas, the safest cars ever built, with ALL cars.
2. It compares mostly highway miles on AP with humans' more accident-prone rural and city driving.
3. The control group's average is dragged down by drivers who are drunk, asleep, enraged, too old or too young, etc. (an unimpaired motorist may already be 10x safer than the statistically "average" human driver).
4. Most important is the issue I mentioned before: so far we are not comparing FSD to human drivers alone; we are comparing FSD PLUS a human overseer who has agreed to remain in control at all times, and who provides an unknown number of interventions to keep FSD working as well as the data shows.

Even so, my take is that the system is getting there and the “March of 9s” will validate Tesla’s vision-only system way before anyone else’s is viable.
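Point 3, the dragged-down average, is easy to see with made-up numbers. All the rates below are hypothetical, chosen only to illustrate the effect:

```python
# Hypothetical crash rates per million miles, invented purely to
# illustrate how a small impaired subgroup drags down the average.
unimpaired_rate = 1.0    # attentive, sober drivers (made up)
impaired_rate = 20.0     # drunk/asleep/distracted subgroup (made up)
impaired_share = 0.20    # fraction of miles driven impaired (made up)

average_rate = (1 - impaired_share) * unimpaired_rate + impaired_share * impaired_rate
print(average_rate)                    # 4.8
print(average_rate / unimpaired_rate)  # 4.8: the "average" driver looks far worse
                                       # than an unimpaired one
```

So a system that merely beats the statistical "average" driver could still be well behind the attentive, sober driver it actually replaces in most cars.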
 
What if only drunk and impaired drivers use FSD? The bar is lower or higher based on what is being replaced.
 
Now, NHTSA is complaining about Tesla not issuing a recall notice for a "safety defect":

In my opinion, it is not a safety defect but rather an improvement of the existing system. No other car in the world has detection of emergency-vehicle lights for its cruise control... If Tesla is now implementing additional safety features in its cars, why would that require a recall notice?

Threats of large fines won't help the auto industry continuously innovate and make cars safer. Somehow, NHTSA has not fully understood that the current systems in a Tesla are driver-assist systems, not fully autonomous ones...

We all need to continue spreading the word that Tesla cars are currently Level 2, requiring full attention by the driver at all times...
 
NHTSA clearly doesn't work in 2-week sprints :D Imagine having to issue a recall every 2 weeks for the entire fleet.
 
It's probably worse than that: I don't think NHTSA is prepared for over-the-air feature additions that increase the safety of the existing fleet of cars. They are still in the mindset that if you "retrofit" anything to a car's safety components, it must be because there was a defect in the previous release. Maybe they need to introduce a new kind of notice that publicly reports extended safety features being implemented and made available to the whole fleet, rather than requesting a notice for a "safety defect".

That would work in Tesla's favour, as they are one of the few car manufacturers continuously improving their cars through over-the-air deliveries.
 