I don't think FSD has ever hit an emergency vehicle. Those crashes were either on Autopilot or claimed to be on Autopilot and later proved false.
I would assume the static object collisions they are seeing are from simulation.

Remember that FSD has logged a relatively small number of miles - just over 1 billion so far, most of which were on the freeway. So we're dealing with extremely small sample sizes, which is why simulation has value. There are even statistical techniques to bound the worst-case scenarios from relatively few trials. Not sure whether they do that, but in any case they can look closely at many static-object scenarios and challenging corner cases with simulation.
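To make the "few trials" point concrete, here's a minimal sketch (not anything Tesla has described; the trial and event counts are made up) of how a rare collision rate can still be bounded even when zero or very few events show up in simulation:

```python
# Toy sketch only: exact (Clopper-Pearson) upper confidence bound on a
# rare per-encounter collision rate from a limited number of simulated
# trials. The event/trial counts below are invented for illustration.
from scipy.stats import beta

def rate_upper_bound(events: int, trials: int, confidence: float = 0.95) -> float:
    """One-sided upper confidence bound on the per-trial event rate."""
    if events >= trials:
        return 1.0
    return float(beta.ppf(confidence, events + 1, trials - events))

# 0 collisions in 1,000 simulated static-object encounters
print(rate_upper_bound(0, 1_000))   # ~0.003, the familiar "rule of three" (3/n)
# 2 collisions in 10,000 encounters
print(rate_upper_bound(2, 10_000))  # ~0.0006
```

With zero events in n trials the 95% bound is roughly 3/n, which is why even a modest number of targeted simulation runs can say something useful about worst-case behavior.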

Presumably the simulation results are what Elon was referencing.
 
I don’t think this is a good example at all since we currently have no idea what was in use.
Given that the caption for the photo in the news article says full self driving mode was in use, I have an idea what was in use. I'm thinking the full self driving mode.

My own experience with FSD is that it is the most likely of the driver assist modes to cause an accident, so I have no reason to disbelieve the news report. The fact that this thread has gone on for about 400 pages, with countless examples of lies and distortions about FSD all along, further suggests that FSD may be a far cry from a functional self-driving (i.e., Level 5) system.
 
Given that the caption for the photo in the news article says full self driving mode was in use, I have an idea what was in use. I'm thinking the full self driving mode.
“The Tesla driver admitted he was operating the vehicle in self-driving mode while using his cellphone”

This does not say anything about full self-driving.

I am extremely skeptical of FSD safety, since no one has ever published any data on how safe it is, but the media and people in general are so completely clueless about the various levels of driver assist that we cannot just take an early report like this as truth.

We will see. I would not be surprised if FSD was in use. I would also not be surprised if only Autosteer was in use.

But it is not a documented example of FSD hitting an emergency vehicle. It’s pointless to discuss further until there is more information available. It will certainly qualify as a report per the SGO.
 
What a curious way to rephrase that statement. It went from "We're reducing collisions" to "We're admitting a known problem". This is why companies use meaningless corporate-speak when they say anything in public.
I think that is a very good point... given a binary choice, what do we want from the CEO: enthusiastic and overly optimistic, or meaningless corporate-speak?
 
Any good educated guesses about what exactly Elon means when he says they're "polishing" or "smooth[ing]" out a point release?
Probably because 12.3 was the first wide release of 12.x. The earlier cycle of 12.0's initial end-to-end training followed by 12.1 and 12.2 polishing is probably similar to 12.4.0's additional data training followed by 12.4.1 and 12.4.2 polishing. Each time there's significant focused training, such as on situations that required 12.3.x disengagements, there's potential for regressions in "don't care" / "not actually an intervention" situations such as just staying in the lane.

The smoothing out of behavior could quite literally be getting 12.4.x to drive more smoothly instead of acting on false positives / hallucinations of "this must be a situation that requires an action" that it was trained on, i.e. overfitting. Just as Tesla doesn't want to overtrain on easy / common-case examples (or the neural network would learn that it should basically always go straight), focused training on problematic scenarios makes the network think it needs to address problems more frequently than it actually does.
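As a toy illustration of that base-rate effect (my own made-up numbers, not anything from Tesla's actual training pipeline): if rare "needs an action" clips are heavily oversampled in the training mix, whatever is fit to that mix ends up expecting problems far more often than they occur in real driving.

```python
# Toy illustration with invented numbers: how oversampling rare
# "needs an action" training clips inflates the base rate a model
# would learn for taking action.

def learned_action_prior(real_action_rate: float, oversample_factor: float) -> float:
    """Fraction of training frames labelled 'take action' after oversampling."""
    action = real_action_rate * oversample_factor
    no_action = 1.0 - real_action_rate
    return action / (action + no_action)

real_rate = 0.001  # pretend 0.1% of real driving frames require an evasive action
for factor in (1, 10, 100):
    print(factor, round(learned_action_prior(real_rate, factor), 4))
# 1   -> 0.001   (matches reality)
# 10  -> 0.0099  (~10x too eager to act)
# 100 -> 0.091   (~90x too eager, i.e. the false-positive behavior described above)
```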

This is also likely where the 5x-10x improvement in miles per intervention came from, as 12.4.x has learned to avoid the problems that previously required interventions with 12.3.x. While unnecessary lane changes are generally annoying, the intervention rate for them is probably lower than for actual safety situations, since 12.4.x can still get into the correct lane, and additional data collection and fine-tuning of the 12.4.x models should help too.
 
That is NOT an example, since there is zero evidence other than an inaccurate-at-best statement by the driver at the scene and clickbait for a news site. The driver doesn't even state it was Full Self Driving, and it is FAR more likely that either nothing was engaged and the driver said this to cover their a$$ (they were just looking at their phone and not paying attention), or it was AP. Also, there is no such thing as a Tesla "Self-Driving Mode," so no matter what, the article is WRONG. You stated FSD hits stopped emergency vehicles, so try again and offer proof that FSD HAS hit a stopped emergency vehicle.

"...The Tesla driver admitted he was operating the vehicle in self-driving mode...."
 
The driver doesn't even state it was Full Self Driving, and it is FAR more likely that either nothing was engaged and the driver said this to cover their a$$ (they were just looking at their phone and not paying attention), or it was AP.
It sounds like hitting a parked emergency vehicle while on TACC or AP is a pass for Tesla. It's not. Of course, unfortunately, people lie, but was some form of driver assist enabled or not? If yes, what can Tesla reasonably do to help mitigate future situations like these?

Yes, the driver is 100% responsible for this accident.
 
It sounds like hitting a parked emergency vehicle while on TACC or AP is a pass for Tesla. It's not. Of course, unfortunately, people lie, but was some form of driver assist enabled or not? If yes, what can Tesla reasonably do to help mitigate future situations like these?

Yes, the driver is 100% responsible for this accident.
If Tesla says that TACC/AP/EAP may not stop for parked vehicles (which I am pretty sure they still say), are people right to complain that it won't stop?

In this case, if they were using Navigate on City Streets, it would be interesting to see what the AP computer was perceiving right before the collision. I cannot speak for everyone, but I've yet to have my car even look like it was going to run into the back of a parked vehicle at night on city streets.
 

NHTSA is (was?) investigating crashes of Teslas on Autopilot into stationary emergency vehicles.
Yes. This is Autopilot.

There is not any question that Autopilot/Autosteer crashes into stationary emergency vehicles. Seems very well established by telemetry.

But not what is being discussed.

I think FSD does as well, on occasion, but don’t recall any evidence supporting that. I assume Tesla has plenty of evidence from simulation and perhaps telemetry.

I cannot remember what info the SGO event summary table includes.
 
It sounds like hitting a parked emergency vehicle while on TACC or AP is a pass for Tesla. It's not...
Where on earth did you get the preposterous idea from my post that I'm giving Tesla a pass? Simple: if it was AP or TACC, then it was NOT FSD. This is about attributing responsibility to the system that was actually in use.

You can't say FSD is hitting parked emergency vehicles if FSD is not in use.
 
Given that the caption for the photo in the news article says full self driving mode was in use, I have an idea what was in use. I'm thinking the full self driving mode.
I'm thinking virtually nobody in the media understands the distinctions between the various levels of driver assistance available on various Teslas, and that they make no distinction between base Autopilot and Full Self Driving. The photo caption is meaningless IMO.
 
I cannot remember what info the SGO event summary table includes.
There's information about the incident month/year, city, roadway, crash partner, injury, automation system engaged, source of reporting, etc. You can download the CSV and the PDF defining the fields from the NHTSA SGO data section. For example, there's an entry for a Tesla in July 2022 in Gainesville: a parking-lot crash with a parked heavy truck and a fatality, reported as ADAS (as opposed to ADS) by telematics, law enforcement and media.
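If anyone wants to poke at it themselves, here's a rough sketch with pandas. The file name and column headers below are approximate / from memory; check the field-definition PDF on the NHTSA SGO page for the exact names.

```python
# Rough sketch: load the SGO ADAS incident-report CSV and summarize the
# Tesla entries. File name and column names are approximate; verify them
# against the field-definition PDF published alongside the data.
import pandas as pd

df = pd.read_csv("SGO-2021-01_Incident_Reports_ADAS.csv", low_memory=False)

tesla = df[df["Make"].str.contains("Tesla", case=False, na=False)]

# Count Tesla-reported incidents by state (adjust "State" to the real header)
print(tesla.groupby("State").size().sort_values(ascending=False).head(10))
```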

Here's an example article about the crash with speculation of Autopilot usage and tags including "#autopilot #full self-driving"

After months of investigation, it turned out this 2015 Model S didn't even have Autopilot equipped.