
SF Bay Accident Surveillance Footage

Thanks for confirming. I see Dirty Tesla claims the same: that if NOA was making the lane change, it would be 6 blinks, but only 1-2 blinks if the driver initiated it.
FWIW - Today when using FSD I counted the number of blinks before my car started changing lanes. The shortest was 1 blink, and then it started changing during the 2nd blink. Oftentimes it completed the entire lane change in less than 6 blinks. In fact, most times it started moving before the 3rd blink. The only time it would take longer is if there was a car in the lane next to me fairly close; closer than in the video shown for this incident.

I also counted on the freeway using NOA. The shortest was 3 blinks before starting the lane change, with no vehicles behind. I did not have a case with cars at about the same distance as in the video, so that may have taken more blinks.
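Just to illustrate the pattern I think I was seeing (this is purely a toy model with made-up thresholds, not anything Tesla has published), the behavior is consistent with a gate that signals longer the tighter the gap in the target lane:

```python
# Toy illustration only: a lane-change "gate" that waits longer when a
# trailing car in the target lane is close. All thresholds are invented.

def blinks_before_commit(gap_to_trailing_car_m, closing_speed_mps=0.0):
    """Return how many blinks to wait before starting the lane change."""
    if gap_to_trailing_car_m is None:       # nobody in the target lane
        return 1                            # commit almost immediately
    # Time until the trailing car would reach us if nothing changes.
    time_gap_s = gap_to_trailing_car_m / max(closing_speed_mps, 0.1)
    if time_gap_s > 4.0:
        return 2                            # comfortable gap
    if time_gap_s > 2.0:
        return 4                            # tight gap: signal longer
    return 6                                # very tight: maximum hesitation

print(blinks_before_commit(None))        # 1: empty lane, like my FSD observations
print(blinks_before_commit(60.0, 5.0))   # 2: big gap behind in target lane
print(blinks_before_commit(8.0, 5.0))    # 6: close car, like the video
```

Something like that would explain both the 1-2 blink changes into empty lanes and the longer waits when a car is close behind in the next lane.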
 
They've lived up to their promises as they were presented??? I do not think that things are moving in the right direction. The moment they started removing all kinds of sensors and going with "vision only" (most likely due to supply shortages and profits and not their belief in vision), they lost me. You do not know that our cars will eventually reach that point. There is zero guarantee that will happen.
Any time you try to fuse data from different types of sensors, you're adding potential for misbehavior. And from what I can tell, a lot of phantom braking was caused by the RADAR data being garbage. So I wouldn't assume that reducing the number of sensor types represents motion away from the goal. From all indications, the reverse is likely true.
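To make that concrete, here's a minimal sketch (my own toy numbers, definitely not Tesla's algorithm) of how one garbage sensor can poison a naive fusion: a stationary radar ghost, say a return bounced off an overpass, gets averaged with a perfectly good vision estimate, and the fused result looks like a stopped object ahead:

```python
# Toy example of naive sensor fusion going wrong. All numbers invented;
# this is not Tesla's algorithm, just the failure mode in miniature.

def fuse_lead_speed(vision_mps, radar_mps, w_vision=0.4, w_radar=0.6):
    """Blindly weighted average of the lead object's speed from two sensors."""
    return w_vision * vision_mps + w_radar * radar_mps

ego_speed = 30.0        # m/s, our car
vision_estimate = 29.5  # vision: lead car moving with traffic
radar_ghost = 0.0       # radar: stationary return (e.g., off an overpass)

fused = fuse_lead_speed(vision_estimate, radar_ghost)
print(f"fused lead speed {fused:.1f} m/s, closing at {ego_speed - fused:.1f} m/s")
# -> fused lead speed 11.8 m/s, closing at 18.2 m/s: the planner brakes hard
#    for a car that vision alone correctly saw moving at traffic speed.
```

Real fusion gates and cross-validates rather than averaging blindly, but every gate is another tuning surface where one sensor's garbage can leak through, which is exactly the "potential for misbehavior" I mean.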

Additionally, Tesla was never going to achieve anything approaching FSD with their existing RADAR. It was moderately useful as a stopgap on the highway, but its resolution is useless for determining the speed of small objects in an urban environment. Coming up with an approach for determining whether small-ish objects are moving towards or away from you visually was always a mandatory step in reaching an appropriate level of safety. Even if their sensor fusion had been perfect, absent adequate vision-based object vector detection, they still would have needed to move to high-definition RADAR or LIDAR to get the job done. And that has been obvious for a long time.
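For a rough picture of what the vision side of that means (again my sketch, not Tesla's method): even a single camera can tell whether something is coming toward you from how fast its apparent size grows, because the classic "looming" time-to-contact estimate is just the object's image height divided by its growth rate:

```python
# Monocular "looming" time-to-contact. Illustrative only; not Tesla's code.

def time_to_contact_s(height_px_prev, height_px_now, dt_s):
    """tau = h / (dh/dt): the faster an object's image grows, the sooner it arrives."""
    growth_px_per_s = (height_px_now - height_px_prev) / dt_s
    if growth_px_per_s <= 0:
        return float("inf")  # shrinking or constant: receding or holding distance
    return height_px_now / growth_px_per_s

# A small object (say, a cyclist) growing from 40 px to 44 px in 0.1 s:
print(f"{time_to_contact_s(40.0, 44.0, 0.1):.2f} s")  # ~1.10 s: closing fast
```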

I am curious about the rumors of Tesla adding RADAR back some time this month. If they're adding HD RADAR, then my guess is that this is trying to make FSD usable in rain. (It currently gives up on me even with the wipers on intermittent, which is way worse than highway FSD, and if V11 brings that level of poor performance to the highway, it is going to be a disaster.) If so, that's probably a reasonable thing to do.

Either way, the existing RADAR was crap, and getting rid of it was no great loss, IMO.

I'm a little bit less confident about their removal of the ultrasonic sensors, but we'll see.

Things are not moving in the right direction. I have seen no plan to get aspirational FSD approved by any regulator.
IMO, that's a bit like complaining when they start closing in the roof of a building that there are no plans for getting an inspector to sign off on the electrical permits.....

The non-city-street stack has always been a temporary band aid, and the plan has always been to get some future version of a city streets stack approved. They've just gotten it to the point where it doesn't completely drive like a drunken 15-year-old with a learner's permit very often, so of course there's no plan to try to get approvals yet. 😁

But that doesn't mean that they aren't moving in the right direction. What they have done, over the course of several years, is add functionality moving them towards the features that will eventually be required for certification.
  • When I got my Model X in late 2017, Autopilot was a disaster. It didn't just have phantom braking. It ping-ponged all over the darn road unless the road was pretty close to straight.
  • Within a year, it had navigate-on-autopilot, but the lane keeping was still just barely passable.
  • Within another year or two, it was mostly handling CA-17 in a survivable manner, and (after a free hardware upgrade) was stopping for traffic lights and stop signs, albeit with assistance deciding when to continue.
  • Now, it can navigate a lot of exits that it couldn't handle before, and has at least some city street driving, including making turns (sometimes), handling roads without lane lines, dodging pedestrians and bicycles, etc.
Each of those things represents a significant step forwards in the technology and in its capabilities. And the underlying hardware and software tech has also made significant steps forwards in that time.

With millions of miles and years of data, a Tesla with FSD is still more expensive to insure. I thought that with fewer parts and giga presses, it would be cheaper to make and repair.
Tesla cars are more expensive to repair largely because Tesla can't even build enough parts to keep up with manufacturing demand. Tesla cars also have a higher rate of break-ins.

The beta test pilots are taking on massive financial risk, or risking hurting or killing innocent victims, with no compensation except that you might one day get the product you paid for years ago. I never scored above 90 in my beta testing, but I was still given access to the beta. I thought it was about safety, or was it just another delay? Early adopters will be left behind at the Tesla meeting later this month.
How do you figure? FSD beta started rolling out to MCU1 cars with HW3 back in December. Mine has been running it for eight days (during which it was raining almost continuously, so I've barely been able to test it at all, because FSD beta basically won't turn on in the rain, but it is still running FSD beta firmware with the feature turned on).

The CEO is the largest seller of Tesla stock! Not a sign of brand confidence, especially when he believes Twitter is a better investment.
The CEO of almost every company is the largest seller. From those to whom much stock is given, much selling is expected. 🤣

When will Full Self-Driving not mean you are 100% responsible for the safe operation of your vehicle?
Three months maybe. Six months definitely. 🤣

But seriously, you're asking the wrong question. The correct question is when FSD will actually have all of the required features for city street driving. Until every required feature is in place, wondering when it will be perfect enough to blindly hand over control is an absurd question to ask. And right now, I'm pretty sure it does not. In particular, I don't think FSD even attempts to handle:
  • Railroad crossings (it can still stop in the middle of the tracks)
  • Many simple posted traffic signs
There are probably a lot of other missing capabilities as well, particularly when it comes to region-specific traffic rules. If they were willing to geofence it, they could probably get something good enough pretty quickly in certain places where they have giant piles of driver data, but that's very different from doing it in the more general case.

There's probably a secret plan somewhere that lists all of those features and prioritizes them. When that list is empty, then you can start talking about how long it will be before all the features are reliable enough to consider allowing the human to not be in control. As long as it can't even avoid stopping in the middle of a railroad track or violating simple posted traffic signs, it isn't even close enough to that point to start asking "when". But it is getting closer with each added feature.
 
This is never happening with existing hardware. And tell that to my vision-only Tesla that would randomly slam on the brakes on the highway for no reason. Tesla is now famous for removing sensors and other parts and claiming it's for a Vision future. Ok.
 
This is never happening with existing hardware. And tell that to my vision-only Tesla that would randomly slam on the brakes on the highway for no reason. Tesla is now famous for removing sensors and other parts and claiming it's for a Vision future. Ok.
Yes. The current phantom braking issues are totally on the AI team and the vision-only design. Although the large-magnitude phantom braking blunders are arguably becoming less frequent, lower-magnitude ones are present for me on almost every drive, and on some drives I experience multiple occurrences.

It's no wonder the team couldn't handle one radar design when, after how many years, they still haven't tamed their AI/NN specialty.
 
I use even the basic AP on the highway and don't have any braking issues so far, even driving in the slow lane with frequent cut-ins and cut-outs. The only issue I do have is the lane centering issue talked about in this subforum, where the lane widens and the car swerves right to stay centered. I assume this will be fixed in v11+ with the single stack.
 
This is never happening with existing hardware.
Based on what?

I mean, I kind of suspect that they might end up needing stereoscopic cameras out both ends of each bumper to gauge cross traffic in directions that the front-facing camera can't adequately see. However, if you ignore unprotected turns and parking lots, I suspect that the existing camera hardware is probably good enough. And I'm not even certain about those; it's just a gut feeling.
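For anyone wondering why stereo pairs would matter there, the geometry is textbook stuff (nothing Tesla has announced, and the numbers below are hypothetical): with two cameras a known baseline apart, depth falls straight out of the pixel disparity instead of having to be inferred from context the way a single camera must:

```python
# Classic stereo depth: Z = f * B / d. Illustrative parameters, not Tesla's.

def stereo_depth_m(focal_px, baseline_m, disparity_px):
    """Depth of a point seen by both cameras of a stereo pair."""
    if disparity_px <= 0:
        return float("inf")   # at infinity (or a matching error)
    return focal_px * baseline_m / disparity_px

# Hypothetical bumper pair: 1000 px focal length, 30 cm baseline.
for d in (30.0, 10.0, 3.0):
    print(f"disparity {d:>4} px -> {stereo_depth_m(1000.0, 0.3, d):6.1f} m")
# 10 m, 30 m, 100 m: disparity shrinks fast with range, which is why
# baseline and resolution limit how far out stereo stays useful.
```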

The existing neural processors might or might not be up to the task, but it is still way too soon to say for sure. I seem to recall reading that HW3 isn't currently able to provide proper redundancy, but that doesn't necessarily mean they won't be able to optimize parts of it to get it to that point. We won't really know for sure until they either A. hit a wall where they can't do something on the existing hardware (which obviously means it isn't good enough) or B. get it working reliably enough that they're ready to focus their efforts on optimizing (at which point they'll either pull it off or conclude that it isn't possible).

And tell that to my vision-only Tesla that would randomly slam on the brakes on the highway for no reason.
So do non-vision-only Teslas. This likely has more to do with incorrect path planning and lane detection/prediction than with anything hardware-related.

Bear in mind also that you're talking about the legacy highway stack. (I'm assuming you're not a Tesla employee, and thus aren't running FSD beta version 11.x.) That basically hasn't changed much at all for several years (apart from occasionally pulling in minor features like traffic light support from the FSD beta stack), because they're spending all of their development efforts on the new FSD beta stack.

When the FSD beta stack replaces the highway stack globally, those problems will probably go away. But that won't happen until FSD beta V11 goes to wide release (and probably not until they've made sure that they don't have a huge uptick in problems with that wide release).

Yes. The current phantom braking issues are totally on the AI team and the vision-only design.
Bearing in mind that the threshold for acceptance was almost certainly "not significantly worse than RADAR-equipped cars". Obviously that meant doing some work on the input portion of the legacy highway stack by creating their pseudo-LIDAR code to figure out the position, direction, and speed of random objects out there in the world. But that would be where they stopped — with the input code. All the path planning of the legacy highway stack is still just as primitive as it is on RADAR-equipped cars.
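As a crude sketch of what that input-side job amounts to (my approximation of the idea, not Tesla's implementation): once vision predicts an object's 3D position each frame, differencing across frames recovers the velocity vector that the RADAR used to supply:

```python
# Pseudo-LIDAR-style velocity from per-frame vision depth. Entirely
# illustrative: assumes some upstream network already produced an (x, y, z)
# position for the tracked object each frame.

from dataclasses import dataclass

@dataclass
class Obs:
    t: float   # timestamp, seconds
    x: float   # lateral offset in ego frame, m
    y: float   # distance ahead in ego frame, m
    z: float   # height in ego frame, m

def velocity(prev, curr):
    """Finite-difference velocity of a tracked object between two frames."""
    dt = curr.t - prev.t
    return ((curr.x - prev.x) / dt, (curr.y - prev.y) / dt, (curr.z - prev.z) / dt)

a = Obs(t=0.00, x=0.0, y=40.0, z=0.0)  # 40 m ahead
b = Obs(t=0.05, x=0.0, y=39.2, z=0.0)  # one frame later
print(velocity(a, b))  # (0.0, -16.0, 0.0): closing at 16 m/s
```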

If the problems don't go away with the V11 unification, then those issues are totally on the AI team and the vision-only design. Up until that point — as long as you're basically just using that work as inputs into a four-year-old highway-only stack — the issues are predominantly on the legacy highway stack, and you shouldn't expect that to improve until V11 gets rolled out and unifies the driving stack.

It's no wonder the team couldn't handle one radar design when, after how many years, they still haven't tamed their AI/NN specialty.
AFAIK, they quite literally haven't touched it since roughly the Tesla Software V9 release in 2018 (beyond integrating the vision-only front end so that they could drop the RADAR hardware). They probably could have solved the problem, but only by stealing programmer resources away from their work on the replacement stack that would eventually turn into FSD beta. I suspect that if they had realized it wouldn't make it into production for four years, they might have done so, but hindsight is 20/20. :cool:
 
Again, nice explanation. However, Tesla has a history of overpromising and underdelivering. If history is any sign, this will not happen with existing hardware.
 
Sounds like the regulators have released documentation confirming that FSD/Autopilot was indeed active when this crash occurred; I haven't tracked down the document myself yet.
I don't think FSD can be in use on I-80, but the driver probably started in FSD. If they have this, though, the bigger question is whether the driver made any inputs, such as a turn signal to initiate the lane change, or any pedal/wheel input to disengage.
 
I don't think FSD can be in use on I-80, but the driver probably started in FSD. If they have this, though, the bigger question is whether the driver made any inputs, such as a turn signal to initiate the lane change, or any pedal/wheel input to disengage.
I’m honestly not even sure it’s beneficial for this to be the mature highway stack rather than Autosteer on City Streets

But I'm just calling it FSD at this point because it's all-encompassing. Or we can call it FSD Capability and then the Beta program; all of them include the system switching itself between stacks and the various functions. But when you order a car and pay the $15k today, you're buying and activating FSD Capability (Beta), and it does the rest.

I would love to see the media or Tesla talk about the distinctions between Autosteer on City Streets vs. TACC and NOA, etc. But regardless, I'm sure the NHTSA would be focusing on the human-machine interface, engagement processes, etc.; they don't expect the system to drive flawlessly in the first place.
 
Sounds like the regulators have released documentation confirming that FSD/Autopilot were indeed active when this crash occurred
It's probably the NHTSA standing general order report just updated today to include incidents up to December 15th (previously November 15th).

Most likely it's the entry for a 2021 Model S with 20,697 odometer miles and ADAS engaged, sourced via Telematics and Media. The incident was recorded at 20:39 in San Francisco, CA, on a dry Highway / Freeway with a 50 mph speed limit, No Unusual Conditions, and Daylight with Clear Weather. The crash happened when the other Passenger Car, Proceeding Straight, made contact with its Front Left, Front, and Front Right, while the Tesla, with "Other, see Narrative" movement at 7 mph, took contact to its Rear Left, Rear, and Rear Right, resulting in no airbags deployed.

I guess the only new information is that it slowed down to 7mph when the person behind crashed into the 2021 Tesla.
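For anyone who wants to check my reading of it, the standing general order incident data is published as a CSV that's easy to filter. A sketch below; the file name is the pattern NHTSA has used for the ADAS incident file, but the column names are from memory and may need adjusting against the actual header:

```python
# Sketch: filter NHTSA's standing general order ADAS incident CSV for the
# entry described above. Column names are my best recollection of the
# published file and may need adjusting against the real header.

import pandas as pd

df = pd.read_csv("SGO-2021-01_Incident_Reports_ADAS.csv", dtype=str)

match = df[
    (df["Make"] == "TESLA")
    & (df["Model Year"] == "2021")
    & (df["City"].str.upper() == "SAN FRANCISCO")
    & (df["Mileage"] == "20697")
]
print(match.T)   # transpose: one long column per incident is easier to read
```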
 
But regardless, I'm sure the NHTSA would be focusing on the human-machine interface, engagement processes, etc.; they don't expect the system to drive flawlessly in the first place.

Yeah, it doesn’t matter in the slightest which system was involved.

The major issue is that assuming a system was engaged, the driver did not respond appropriately and did not know how to respond appropriately based on signals from the system. That’s mostly a problem with the interface that probably needs some work. It’s a hard problem. It’s quite likely that the system wasn’t even on at the time of the collision, but the driver did not know this.

Both systems are going to routinely mess up all the time so it doesn’t matter which one it was.
 
Precisely; any of the modules/stacks messing up will be a surprise to absolutely no one at the NHTSA. All of these functions are Level 2: the driver needs to be attentive and ready to take over at all times, and the manufacturers of such systems need to build in sufficient equipment and processes to ensure that happens and to reduce the risk to other road users.

Not only do they not expect the systems to be perfect right now, I imagine they don’t expect them to become perfect for some time. And they’ll work to ensure stuff like this doesn’t happen in the interim.
 
Yeah, it doesn’t matter in the slightest which system was involved.

The major issue is that assuming a system was engaged, the driver did not respond appropriately and did not know how to respond appropriately based on signals from the system. That’s mostly a problem with the interface that probably needs some work. It’s a hard problem. It’s quite likely that the system wasn’t even on at the time of the collision, but the driver did not know this.

Both systems are going to routinely mess up all the time so it doesn’t matter which one it was.
Hmm, in the slightest? Well, since they are different, we would want to know which one made any errors, the older or the newer. We are also interested in which one the driver thought was engaged (that's FSD), though both are to be driven with supervision. The presence of FSD in the vehicle tells us whether it uses the radar in the vehicle, I think, and also whether the driver monitoring camera is used, does it not?

And in particular, we have the question of how many serious accidents have taken place using AP and FSD. There are many reported for AP, but surprisingly few for FSD -- I think because FSD is so poor at what it does that it enforces better driver diligence.
There may also be different phantom braking patterns for FSD and AP.

So it matters.
 
We should have learned by now that initial media accounts of Tesla accidents are sometimes lacking in key details. For some reason, when either a passenger train derails or a commercial airliner crashes, those same media outlets wait for the preliminary reports by government regulators before condemning either Amtrak or an airline. I wonder why?
 
The presence of FSD in the vehicle tells us whether it uses the radar in the vehicle, I think, and also whether the driver monitoring camera is used, does it not?

Probably not for this vehicle on either count. It’s a 2021 Model S so won’t be using the radar, and will be using the monitoring, regardless. (A little ambiguity on the monitoring of course but that depends on software build, not which system was in use.)

Anyway what ends up being in use (seems almost certain to be AP but as discussed could have been an incorrect mode swap) is a curiosity but not much more.

The phantom braking and accident rates are also just curiosities of not much import. There are going to be phantom brakes and accidents!

All of that info doesn’t really make any difference to the likely actions needed to address this sort of error by the driver (presumed here to be caused by the automation).

They’re not going to be able to build a system that the driver doesn’t have to take over for! It’s not the intent so it won’t happen.
 
Again, nice explanation. However, Tesla has a history of overpromising and underdelivering. If history is any sign, this will not happen with existing hardware.
My guess is that they'll force an upgrade of the front-facing cameras at some point, but hopefully not until they have gotten far enough along that the system will tolerate us blocking the [expletive deleted] cabin camera that they would probably add at the same time.... 🙃


I’m honestly not even sure it’s beneficial for this to be the mature highway stack rather than Autosteer on City Streets
It's kind of important to know, because if the city streets beta kicked in, then the transition between stacks could be implicated, which would be interesting information in and of itself. I haven't been happy with the transition between stacks in terms of weird behavior, and I definitely hope the unified stack rolls out sooner rather than later.
 
I definitely think there’s value in understanding what exactly happened and am sure the NHTSA is digging into the nitty gritty alongside Tesla.

In the end I would picture the regulator as having a single-minded focus towards reducing accidents period. Some of the details being questioned here make me wonder about other aspects of the whole ecosystem.

Let's say this driver accidentally hit the brake rather than the accelerator; is that a human-machine interface problem while using ADAS?

Why do ADAS or passive safety features not protect against slamming on the brakes and causing a multi-vehicle pileup? Where is the car that doesn't crash? How does this happen, and what can prevent it? Where is the 360-degree Vision AI awareness of the vehicles following too closely?
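That last one at least seems computable with the cameras already on the car. Here's a hedged sketch (my own heuristic with invented thresholds, not any shipped feature) of a rear time-to-collision check that could veto or soften hard braking when someone is tailgating:

```python
# Illustrative rear time-to-collision check. Invented thresholds; Teslas do
# not (as far as I know) expose anything like this today.

def rear_ttc_s(gap_m, follower_speed_mps, ego_speed_mps):
    """Seconds until the car behind reaches us at current speeds."""
    closing = follower_speed_mps - ego_speed_mps
    return gap_m / closing if closing > 0 else float("inf")

def safe_to_brake_hard(gap_m, follower_speed, ego_speed, min_ttc_s=2.0):
    """Veto hard braking (where it is legal and safe to do so) if the tailgater is too close."""
    return rear_ttc_s(gap_m, follower_speed, ego_speed) >= min_ttc_s

print(rear_ttc_s(8.0, 30.0, 30.0))          # inf: matching speed, no closure yet
print(safe_to_brake_hard(8.0, 30.0, 25.0))  # False: already closing at 5 m/s
```

A real check would also have to model our own deceleration, since a hard brake instantly creates closure even when the tailgater is currently matching speed.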

It makes me wonder why we're focusing on autosteering around city streets when a car can still borderline just flip into Park in the passing lane on a bridge and cause a pileup. I hate to say it, but the FSD stack applied to this scenario will probably not be that much better, and a fully engaged driver devoid of complacency is the #1 most important factor. Surely the driver can be supplemented by this technology, but it just does not bode well for a range of things.

But the NHTSA is not the enemy, Autopilot and FSD Beta continue operating today despite huge criticism lobbed at the NHTSA and NTSB. They want to allow this stuff to function/develop/innovate AND not have any accidents or undue risk to other road users, that’s the goal. And they want to come up with an implementation that properly leverages the benefits of ADAS and the human behind the wheel where each excel while compensating for each other’s shortfalls. It’s an orchestra, a symphony of human and machine that together can hopefully reduce accidents, injuries, and fatalities.



This is the likely reality for some time, because a machine doing all of this is pretty far away, I think. Cars right now can't even avoid stopping themselves in a situation like this, lol: a blatantly obvious impending pileup with people following too closely and open road ahead. Putting too much trust in this stuff is dangerous in its own right, and these people know it.
 