
SF Bay Accident Surveillance Footage

Hmm, in the slightest? Well, since they are different, we would want to know which one made any errors, the older system or the newer one. We are also interested in which one the driver thought was engaged (that's FSD), though both are to be driven with supervision. Whether FSD is present also tells us, I think, whether the car uses its radar, and whether the driver-monitoring camera is in use, does it not?

And in particular, we have the question of how many serious accidents have taken place using AP and FSD. There are many reported for AP, but surprisingly few for FSD -- I think because FSD is so poor at what it does that it enforces better driver diligence.
There may also be different phantom braking patterns for FSD and AP.

So it matters.
Beat me to it. I don't see why it wouldn't matter which system was engaged. It absolutely matters IMO given that the two systems work very differently. It also matters if this was the result of a system change (for example, the theory that this happened because of a switch to FSD Beta mode while the car's nav had the wrong GPS location).
 

Why does it matter that they work differently?

Obviously one system might have been fine and the other not, but why does that matter?

Regarding the system change - why does this matter?

Of course it matters at some level if FSD was on and braked because it thought it was in the wrong place. They should fix that if so! But it's not the root cause of the accident even if that turns out to be exactly what happened! You could make that piece perfect so it never happens, and accidents like this would still occur. How are you going to prevent a driver from hitting the brake pedal inadvertently, for example (with Joe Mode on and tunes engaged with the volume turned to 11)?

This GPS possibility is like an airspeed pitot tube getting iced up. It happens. Not a big deal.

These are just contributing factors and obviously you are not going to just magically fix all of these things.

In the meantime you need to fix whatever is better defined (if that other thing turns out to be the main cause - but you need to stay vigilant about it regardless).

Anyway we’ll see.

One thing they could consider adding is a blaring rear collision alert. However, there is likely a very good reason they do not have that: it is perhaps hard to communicate it to the driver fast enough without using the screens, and to keep it distinct from other alerts. Also the hardware might not be good enough to avoid false positives. Just going off of FCWs… 😂
 
To determine the factors leading up to the accident, I think it does matter. It also matters in terms of whether Tesla should merge the stacks (or whether further testing is needed).
 
Why do ADAS or passive features not protect against slamming on the brakes and causing a multi-vehicle pileup?
There’s a big difference between a car automatically braking and automatically accelerating.

Braking is always ‘safe’ because it's the responsibility of the driver behind not to hit the car in front (every one of the drivers that piled into the back of this pileup is at fault).

Sudden aggressive acceleration (or interfering with a driver's instruction to brake) is another matter. You'd need your detection of objects in front to be absolutely faultless so there's no risk of the ADAS causing a crash while trying to avoid one (the toy cost model below makes this asymmetry concrete).

More practically, unlike the accelerator the brakes are (I believe) mechanical in operation, albeit with power assistance. I’m not sure it’s even possible for the car to prevent the driver braking.

Cars right now can't even stop themselves in a situation like this lol, a blatantly obvious impending pileup with people following too closely and open road ahead.
Modern cars can, but most cars on the road are at least a few years old, and active cruise control and associated safety features were only in the mid-to-high-end price brackets until recently. If all of those cars were Teslas (or the equivalent from other manufacturers) the crash wouldn't have happened.
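To put rough numbers on that asymmetry, here's a toy decision-theory sketch. All the costs are made-up illustrative values (nothing from Tesla or any real AEB stack): an automatic intervention is worth taking once the expected cost of not acting exceeds the expected cost of acting, and because a false automatic brake is far cheaper than falsely overriding the driver or accelerating, the confidence bar differs enormously.

```python
# Toy model: minimum detection confidence p* at which an automatic
# intervention pays off. A false positive costs C_fa; missing a real
# obstacle costs C_miss. Break-even: p * C_miss == (1 - p) * C_fa,
# so p* = C_fa / (C_fa + C_miss). All costs are hypothetical.
def min_confidence(cost_false_alarm: float, cost_miss: float) -> float:
    return cost_false_alarm / (cost_false_alarm + cost_miss)

# Arbitrary "harm point" ratios for illustration only:
auto_brake = min_confidence(cost_false_alarm=1, cost_miss=100)
override_driver = min_confidence(cost_false_alarm=1_000, cost_miss=100)

print(f"auto-braking pays off above ~{auto_brake:.0%} confidence")       # ~1%
print(f"overriding the driver needs ~{override_driver:.0%} confidence")  # ~91%
```

With any remotely plausible cost ratio the same picture falls out: phantom braking is tolerated as the cheap failure mode, while blocking the brake pedal or auto-accelerating would demand near-perfect perception.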
 
It all really just makes me think about how far away this is: even a car that is merely "smart" enough not to let something like this happen, much less generalized Level 4/5 autonomy where manufacturers take ownership of the DDT and liability for accidents that occur.

If Tesla owned the liability in this case, which they don't because it's a Level 2 ADAS, it would likely not be a good situation for the company even knowing the vehicles behind were following too closely. Just imagine if a global corporation owned the liability for millions of vehicles operating nationally and beyond.

Humans will remain in control for the foreseeable future, I think; we need technology that augments us and makes up for our weaknesses while not introducing new gaps.
 
A few points, considering that Autopilot was confirmed to be engaged:
  • GPS error. This is a double-stacked highway/bridge, and GPS can and does mess up in those scenarios. It happens to me on the GWB when I'm driving on the lower level (it happens in tunnels as well): the nav reroutes all the time when I drive across the GWB, tells me I'm fairly far off my course, etc. The car could react thinking it's on a street at a different location (see the sketch at the end of this post).

  • Map data error.
  • Stack switching. I drive along the BQE section of I-278, and there are certain places where the FSD Beta stack will pop up: the screen changes from the normal Autopilot screen to the FSD Beta screen used on streets. I know the spots like clockwork. GPS is accurate in this scenario, but for some reason FSD Beta kicks in. This could be the case in SF as well.
At the end of the day, vision-only has been a step backward in my experience; I'm hoping v11 corrects a lot. My gut feeling is that Tesla may be relying too heavily on too few sensors with no redundancy (I would miss USS in my Model 3). I've test-driven the Lucid Air and the MB EQS; they've caught up to Autopilot on the highway and, in MB's case, exceeded it. I want to see what Tesla does from here.
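On the GPS point above, a minimal sketch of one standard defense (entirely hypothetical, not how Tesla actually does it): gate each new GPS fix against what the car's own odometry says it could plausibly have done, so a multipath "teleport" onto a parallel street gets rejected instead of trusted.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Fix:
    t: float  # timestamp, seconds
    x: float  # metres east in a local frame
    y: float  # metres north

def plausible(prev: Fix, new: Fix, speed_mps: float, slack_m: float = 15.0) -> bool:
    """Accept a fix only if the implied jump fits the car's own motion.

    speed_mps comes from wheel odometry, which multipath cannot corrupt;
    slack_m absorbs normal GPS noise.
    """
    dt = new.t - prev.t
    max_travel = speed_mps * dt + slack_m
    return hypot(new.x - prev.x, new.y - prev.y) <= max_travel

# A reflected signal under a double-decked bridge "moves" the car 80 m
# sideways in one second while the wheels say 25 m/s: rejected.
assert not plausible(Fix(0.0, 0.0, 0.0), Fix(1.0, 80.0, 0.0), speed_mps=25.0)
```

The double-deck case is harder, of course, because the two roadways nearly overlap horizontally; disambiguating them needs map matching plus altitude or dead reckoning, which is presumably why exactly these spots stay troublesome.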
 
It all really just makes me think about how far away this is: even a car that is merely "smart" enough not to let something like this happen, much less generalized Level 4/5 autonomy where manufacturers take ownership of the DDT and liability for accidents that occur.

If Tesla owned the liability in this case, which they don't because it's a Level 2 ADAS, it would likely not be a good situation for the company even knowing the vehicles behind were following too closely. Just imagine if a global corporation owned the liability for millions of vehicles operating nationally and beyond.

Humans will remain in control for the foreseeable future, I think; we need technology that augments us and makes up for our weaknesses while not introducing new gaps.
Are humans really in control? What is the minimum reaction speed that is required when the car does the wrong thing at the worst time? Is it humanly possible in all cases? What driver ed courses teach rapid engagement of the accelerator when the car decides it's time to do an E-stop? Do they expect that you have a foot over each pedal?
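For a rough sense of the time budget those questions imply, a back-of-envelope sketch. The speed, deceleration, and following gap are assumptions for illustration, not figures from this incident:

```python
# If the lead car phantom-brakes hard and the follower does nothing,
# how long until impact? Both cars start at speed v, t_gap apart.
v = 25.0        # m/s, ~55 mph (assumed)
a = 5.0         # m/s^2, ~0.5 g lead-car deceleration (assumed)
t_gap = 1.5     # s following gap (assumed)

gap = v * t_gap                   # 37.5 m of separation
# While the lead car is still braking, the gap shrinks by 0.5*a*t^2,
# so impact occurs when 0.5*a*t^2 == gap.
t_impact = (2 * gap / a) ** 0.5   # ~3.9 s
assert t_impact < v / a           # lead car is indeed still moving then

print(f"impact after ~{t_impact:.1f} s if the follower never reacts")
```

Subtract a typical 1.0-1.5 s of perception-reaction time and there are only about 2.5 s left to actually brake: tight but coverable for an attentive driver, and gone entirely if the gap is shorter or attention has lapsed.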
 
In this case the driver had plenty of time to react to the car and make the necessary corrections. A split second is all it takes for me to react to wrong steering or occasional phantom braking. I'm paying full attention, not on the phone or doing non-driving activities, when on Autopilot. Still, my commutes are infinitely more relaxing, and Tesla does a pretty damn good job of keeping the car in its lane, unlike the human-driven cars that I share the road with.
 
Are humans really in control? What is the minimum reaction speed that is required when the car does the wrong thing at the worst time? Is it humanly possible in all cases? What driver ed courses teach rapid engagement of the accelerator when the car decides it's time to do an E-stop? Do they expect that you have a foot over each pedal?
Where were your feet when you used old-school cruise control? Probably on the floor just next to the pedals. When you were driving normally, and a dog ran into the street, or a bicyclist swerved into the road, how long did it take you to move your foot off the accelerator and onto the brake? To answer your overly dramatic question: no, it's not humanly possible in all cases. What many people are failing to understand is that the system is not designed to be perfect, nor will it ever be perfect. It's not designed to prevent all accidents. It will cause accidents. It will cause people to get hurt. It will kill people. The goal is that it will cause FEWER of these things than humans have previously. Humans killed nearly 43 thousand people in car crashes in 2021. If ADAS features can save some of those people, it's worth it.
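To put rough numbers on that trade-off, a quick back-of-envelope calculation. Everything below is a hypothetical rate except the ~43,000 figure cited above:

```python
# Net effect of an ADAS that prevents some fatal crashes but causes
# a few of its own. All rates are invented for illustration.
baseline_deaths = 43_000   # ~US traffic deaths in 2021 (figure cited above)
fleet_share = 0.20         # hypothetical: share of driving done with ADAS
prevented_rate = 0.30      # hypothetical: fraction of those fatal crashes avoided
caused = 500               # hypothetical: new fatalities the ADAS introduces

net_saved = baseline_deaths * fleet_share * prevented_rate - caused
print(f"net lives saved per year: {net_saved:,.0f}")  # ~2,080
```

The point isn't the specific numbers; it's that the system can fail, even kill, and still be clearly worth deploying if the prevention term dominates.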
 
Are humans really in control? What is the minimum reaction speed that is required when the car does the wrong thing at the worst time? Is it humanly possible in all cases? What driver ed courses teach rapid engagement of the accelerator when the car decides it's time to do an E-stop? Do they expect that you have a foot over each pedal?
Sounds like a good spot to supplement with technology to offset our weaknesses.

Replacing human drivers with technology, however, is nowhere even close to reality IMO in generalized terms.
 
It's probably the NHTSA Standing General Order report, just updated today to include incidents up to December 15th (previously November 15th)… I guess the only new information is that it slowed down to 7 mph when the person behind crashed into the 2021 Tesla.
This CNN article from Tuesday also mentions "7 mph," so it seems likely the Standing General Order update on Tuesday is what triggered the articles written two days ago:

At the very bottom of the article: "This story has been updated to reflect that a driver-assist system was active within 30 seconds of the crash." Which again matches up with the NHTSA data requirement:
Level 2 ADAS: Entities named in the General Order must report a crash if Level 2 ADAS was in use at any time within 30 seconds of the crash and the crash involved a vulnerable road user or resulted in a fatality, a vehicle tow-away, an air bag deployment, or any individual being transported to a hospital for medical treatment.
So some places are saying Autopilot and/or FSD Beta was definitely active when the crash happened, but as CNN corrected, given that the surveillance videos from the original post were 22 and 26 seconds long with the crash happening at 00:07, Autopilot could have been disengaged well before the tunnel and still have required the NHTSA report.
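Writing that requirement out as a predicate makes the ambiguity obvious: a report tells you an ADAS was active somewhere in a 30-second window, not that it was active at impact. A minimal sketch of the rule as quoted above:

```python
def sgo_reportable(seconds_since_adas_active: float, *,
                   fatality: bool = False, tow_away: bool = False,
                   airbag_deployed: bool = False,
                   hospital_transport: bool = False,
                   vulnerable_road_user: bool = False) -> bool:
    """Level 2 ADAS crash reporting per the Standing General Order text above."""
    qualifying_outcome = (fatality or tow_away or airbag_deployed
                          or hospital_transport or vulnerable_road_user)
    return seconds_since_adas_active <= 30.0 and qualifying_outcome

# Autopilot disengaged 25 s before impact, before the footage even starts,
# and the crash is still reportable:
assert sgo_reportable(25.0, tow_away=True, hospital_transport=True)
```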
 
I won't be surprised if the driver had disengaged Autopilot before changing lanes, but panicked for some reason and slammed on the brakes (brakes instead of accelerator)?
 
I would be a little surprised. I think the lane change was automated (it could have been user-initiated or automatic) and there was phantom braking. Maybe the driver did not intervene, or maybe they hit the brake for some reason (red wheel of death? thought the car must be braking for a good reason?).
 
Didn't someone say the number of blinks shows it wasn't AP making the lane change?

I think it is possible that when changing lanes the forward collision warning honked to alert the driver that he was heading towards the wall, and that caused the panic braking?

PS: In general it is a bad idea to change lanes within a tunnel. IIRC it is not allowed (or at least advised against) in many states.
 
NHTSA confirms FSD Beta was engaged and may have caused the accident:

A Tesla Model S that braked sharply, triggering an eight-vehicle crash on I-80 in San Francisco last November, had the EV maker's FSD Beta engaged seconds before the crash. The information was confirmed by data the federal government released Tuesday.

 
They are probably referring to the accident reporting data that Tesla has to report. Nothing from the NHTSA investigation. And all that says is that assisted driving features were active within 30 seconds of the accident. So they could have been on up to the accident or turned off before any of the video footage we have.

Of course the article just mentions that that is what CNN reported, but no link to that report to look at the details.
 
Just to play devil's advocate in the general case - not a comment on this particular accident. Just because AP was active up to 30 seconds (or 5 seconds) before a crash does not mean AP is to blame. Imagine a driver on AP turning off AP and accidentally swerving into a guardrail: not AP's fault at all. Even if AP turned itself off, the driver may still be the actual and sole cause of the accident.

I don't know if the car logs capture enough telemetry parameters to determine clearly what happened. Possibly in a Tesla, if they record steering, braking, etc. at a fast enough resolution. If it's once every few seconds, or some parameters aren't stored, then it may be less useful. Not sure whether other brands record precise telemetry either.
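To illustrate the resolution problem, a sketch over made-up log rows (times are seconds before impact; no real Tesla log format is implied). With sparse samples you can only bracket the disengagement moment, never pinpoint it:

```python
# Hypothetical samples: (seconds_before_impact, autopilot_engaged)
samples = [(28.0, True), (22.0, True), (16.0, True), (9.0, False), (2.0, False)]

on_times = [t for t, on in samples if on]
off_times = [t for t, on in samples if not on]

last_on = min(on_times)                               # engaged sample closest to impact
first_off = max(t for t in off_times if t < last_on)  # next sample, already off

print(f"disengaged between {last_on:.0f} s and {first_off:.0f} s before impact")
# -> between 16 s and 9 s: reportable under the 30-second rule either way,
#    and AP was already off at impact, but a 7 s sampling gap leaves the
#    exact disengagement moment (and what the driver did) unknown.
```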
 
They are probably referring to the accident reporting data that Tesla has to report. Nothing from the NHTSA investigation. And all that says is that assisted driving features were active within 30 seconds of the accident. So they could have been on up to the accident or turned off before any of the video footage we have.

Of course the article just mentions that that is what CNN reported, but no link to that report to look at the details.
Yep, someone posted another report that quotes the CNN one that mentioned the same thing (the CNN report has since been updated to clarify the 30-second window, so the errors compound). That's why I hate a lot of third-party sources; they add their own embellishments when they are too ignorant to know the details.