
U.S. opens formal safety probe for Autopilot - 2021 Aug 16

If you look at Euro NCAP videos, no car is able to stop safely for the stationary-car dummy at highway speeds; radar and camera range seem too low. The system should still brake and reduce severity, though, not just plow on.

Then you have the complacency issue. That is Tesla's biggest problem. The user interface is faulty by design and thus makes the driver complacent. The car tells the driver "pling, I drive" until "plong, you drive". But the human can't drive together with the car; it is binary. You can't give small corrections, or the car will abort: "take your hands away or I will cancel". The car takes full control, and people trust it too much. That could work for a Level 3 system, not a Level 2.
That's the way an autopilot in a plane works as well. You generally don't "share" although you can decide what to turn over to the autopilot, steering, altitude, both, etc...
 
I tried to find details for all of the collisions listed. One I couldn't find at all; for another, I found what I think is the right incident, but dated about two weeks earlier...

Incident                          Time            Notes
07/10/2021 San Diego CA           3:09 AM         DUII
05/19/2021 Miami FL               5:15 AM
03/17/2021 Lansing MI             1:10 AM         Suspended license
02/27/2021 Montgomery County TX   1:15 AM         DUII
08/26/2020 Charlotte NC           Early morning   Watching movie
07/30/2020 Cochise County AZ      7/14/2020?      DUII
01/22/2020 West Bridgewater MA
12/29/2019 Cloverdale IN          8:00 AM
12/10/2019 Norwalk CT             12:40 AM
05/29/2018 Laguna Beach CA        11:07 AM
01/22/2018 Culver City CA         8:30 AM

Anybody have any luck finding information on the West Bridgewater incident?

Most of them seem to be in the wee hours of the morning; at least three involve DUII, one a suspended license, and one a driver watching a movie...

Of course it would be helpful to know if each was an AP1 car, or if it was newer... But those details will eventually come out.
 
Those are called effing user errors! Wow, are you kidding me? Where does it say you get to do any of those things and still keep your license, in any car??

Good grief, I love me some witch hunting...
Some people place far too much faith in AP and clearly don't consider/read the caveats about "be ready to take control" seriously. But, they are SERIOUS. I'd never consider watching a movie in my M3 under AP. On the flip side, if you were to fall asleep at the wheel AP could well save your life, something I hope the NHTSA will consider. There are two sides to this equation.
 
This is probably why Tesla is hesitant to send the FSD Beta to the general population.
Sorry, but Tesla sells FSD to customers when they place the order, and later the customers find FSD doesn't work. I proved it in front of a Tesla tech: 1. Navigate on Autopilot: not working. 2. Auto Lane Change: not working. 3. Traffic Light and Stop Sign Control: works maybe half the time. 4. Summon: failed two out of three tries. 5. Autopark: honestly didn't check. 6. Full Self-Driving computer: installed. Of course they can fool people many times, but sooner or later they will pay the price for it. Conclusion: don't sell a product that doesn't work or is still in development.
 
Sorry, but Tesla sells FSD to customers when they place the order, and later the customers find FSD doesn't work. … Conclusion: don't sell a product that doesn't work or is still in development.
SAD - Some Assisted Driving. If you would like to participate in the experiment that will be $10k please.
 
I tried to find details for all of the collisions listed…

08/26/2020 Charlotte NC — Early morning — Watching movie

All charges dropped against Dr. Devainder Goli, who was on Autopilot when he crashed into a police car, injuring two policemen, while "watching TV while driving and failing to move over."
The article does not say if they charged Autopilot criminally. But of course they didn't. I guess nobody is to blame anymore for these incidents, at least for doctors in NC.
 
All charges dropped against Dr. Devainder Goli who was on Autopilot when he crashed into a police car… I guess nobody is to blame anymore for these incidents, at least for doctors in NC.

I think investigations are always needed and should be a good thing, especially when new tech has such great potential for both good and bad.

The problem is the archaic way government bodies do this. It will be YEARS until they reach a conclusion, and at that point the tech will have become obsolete. Unless of course this is political and they just want to throw Tesla under the bus, or fire truck…

Bottom line: there should be no reason a Tesla hits a stationary object. Period. I know about the radar limitations, yes, but the vision and ultrasonic sensors should make it impossible.
It's crazy how my Tesla drives itself so well all over busy roads and cities, but if I'm heading straight into a wall it will just let me.
I'm hoping the new vision-based AP will be better.
 
Of course the main solution is for the driver to pay attention

I disagree with this. Waymo learned *LONG* ago that this would be a major problem... lulling users into a false sense of security will cause fatalities. In my opinion, cars should be either *fully autonomous*, or *fully manual*. The idea that we can let 4 tons of metal drive itself around with "beta" software is really double-plus-ungood.

I really, really, super appreciate my Model 3 and think it might just be humanity's best automobile... but "FSD" is a problem, and it seems inevitable that it will become a major financial problem for Tesla at some point.
 
That's the way an autopilot in a plane works as well. You generally don't "share" although you can decide what to turn over to the autopilot, steering, altitude, both, etc...
Yes, it probably works in aviation, but there it is more like Level 3: you have multiple seconds to take over. Ground traffic is a tad more complex, and Tesla AP is Level 2.

Tesla needs to force more driver vigilance, not just tell drivers to stay alert.
 
Those are called effing user errors! … Good grief, I love me some witch hunting...
They should design the system so that user errors do not result in serious consequences. This is basic modern safety theory, because humans do err.
 
I disagree with this. Waymo learned *LONG* ago that this would be a major problem... lulling users into a false sense of security will cause fatalities. In my opinion, cars should be either *fully autonomous*, or *fully manual*. The idea that we can let 4 tons of metal drive itself around with "beta" software is really double-plus-ungood.

I really, really, super appreciate my Model 3 and think it might just be humanity's best automobile... but "FSD" is a problem, and it seems inevitable that it will become a major financial problem for Tesla at some point.

People used to be similarly scared of autonomous elevators that didn't require an operator, and also of basic cruise control, and even TACC, and now lane-keeping.

It is silly to argue from empty assertions about a "false sense of security". The insurance companies have the real data on the frequencies of crashes and injuries and any correlations with the driver-assistance technologies in the various cars; with a big enough sample size, that is where the real data is. Tesla also has the data, though of course only for their own cars, but at least it will distinguish between using and not using the various technologies.

Anecdotally, my experience with AP (from when it was first deployed in October 2015, and as it has developed) is that even from the beginning, by relieving the driver of the micro-decisions and micro-inputs of lane-keeping and car-following, it allows the driver to be more attentive to other issues on the road and should be safer for those drivers who use it properly. Yes, there will be those who misuse it, but as long as it is safer in the aggregate it is a good technology to have. Some people may get strangled by their seat belts, but that doesn't mean we should get rid of seat belts.
 
It appears to me that all the foofaraw about Autopilot (TACC) and FSD (the unfinished BETA software being touted as the be-all and end-all of automobile driving) once again attracts all the FUD and prejudice against Tesla vehicles, whatever the motivation.
I haven't read the report, but I ask whether it speaks of TACC (the default cruise control and lane-keeping software) or FSD (see above). So many folks still don't understand the difference.
But it appears to me that whatever the system in question, people seem to be missing the main point, repeated everywhere, to paraphrase: the driver MUST keep hands on the wheel and be able to TAKE CONTROL in an instant. Use of either Autopilot or FSD does NOT entitle you to read, play games, take a nap, or ignore the fact that THE DRIVER is still in control and responsible for the safe operation of one of the most potentially deadly weapons on Earth.
I encounter emergency situations weekly, at least. Sirens, flashing lights, barriers, fire engines across lanes, even smoke rising ahead alert me that something untoward is happening and that maybe I just might want to be alert to something amiss.
An automatic response since I started driving in 1962 is to slow down, turn down the radio, and lower the window when I encounter evidence of an untoward happenstance ahead, and nowadays I disengage any automatics as well. An old fuddy-duddy? Mayhap. But safe and still intact, fuddy-duddy or not.
I still rely on the good ol' Mark I Mod I calibrated brain and associated systems first. At 74, it seems to have worked for me.
 
People used to be similarly scared of autonomous elevators that didn't require an operator, and also of basic cruise control, and even TACC, and now lane-keeping. …
And you know what? I have yet to hear of an autonomous elevator which crashed into another elevator...
 
I tried to find details for all of the collisions listed…

Of course it would be helpful to know if each was an AP1 car, or if it was newer... But those details will eventually come out.
The doctor in NC had a 2015 Model S so that would mean it only had AP1.
 