
SF Bay Accident Surveillance Footage

Great points.

SURELY Tesla has logged a TON of data from this accident that would answer a LOT of questions...as they have done before, posting log data to refute claims in prior accidents where drivers said FSD was active.

The only action Tesla seems to have taken in this incident so far is Elon's censorship on the matter.
People need to learn how to search on Twitter. It is extremely easy to find all of Ken’s Tweets.

(Regular search is broken by design so you should never use that anyway.)
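For example, the advanced search operators still work from the regular search box. A from: query with a date window pulls up one user's tweets directly (the handle and dates below are placeholders, not necessarily Ken's actual account):

    from:SomeHandle since:2023-01-10 until:2023-01-31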
 
Or, perhaps it was simply Twitter suffering from various outages and impacts due to Elon's lack of forethought before firing so many key persons, then having to ask for them back...


 
No, Twitter Search has been broken for years. Best not to use it.
 
I have a theory: Autopilot could have phantom-braked while changing lanes (the turn signal is on) and the driver could have then hit the brakes harder. Some drivers' instinct when something is out of sorts is to hit the brakes. It's not the right reaction in all circumstances, but panicked drivers might not react appropriately. This could have been the driver's first encounter with phantom braking or other misbehavior, and they didn't know how to react (not every Tesla owner is on this forum and well advised on this, like owners were in 2012).

Now from my own experience: I regularly disconnect Autopilot or FSD the second I see construction, any uncontrolled left turn, fully stopped traffic on the highway (traffic going from moving fast to a complete stop), jerky behavior in stop-and-go traffic, or any non-perfect exit from the highway. It's not worth the risk in my opinion. There are drivers far more trusting than I am who let the system get itself into trouble.
 
In general someone presses the brake pedal when there is an obstacle in front of the car, but that was not the case here.

If the driver did press the brake pedal, then Autopilot would have disconnected.
 
I don't think navigation would be responsible. Navigation chooses where the car wants to go.
I'm giving an example of a user-visible symptom of the GPS being inaccurate. People who have used FSD Beta a lot are probably aware of odd-behavior situations that match up with cases where navigation was routing the wrong direction or believed it was on a different street. The underlying issue would be the car thinking it was somewhere else.

Yes, I know Autopilot drives based on what it sees, but there are aspects that rely on position. One of them is determining whether or not the vehicle is on a controlled-access highway, to select Navigate on Autopilot vs. plain Autopilot vs. FSD Beta. Another is using map data to prepare for upcoming exits/merges/intersections and traffic controls.

I'm pointing out that even the nominal case of taking the Treasure Island exit and switching over to city streets needs something like 10 ft of GPS accuracy. Has anybody had situations where their GPS was inaccurate by that much?
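As a rough sanity check on that 10 ft figure, here is a back-of-envelope sketch; the error values are generic consumer-GNSS assumptions, not anything measured from this car:

    import math

    # How does 10 ft compare to a lane width and to typical GNSS error?
    LAT_SF = 37.8                      # approx. latitude near the Bay Bridge
    M_PER_DEG_LAT = 111_320.0          # meters per degree of latitude
    m_per_deg_lon = M_PER_DEG_LAT * math.cos(math.radians(LAT_SF))

    ten_ft_m = 10 * 0.3048             # 10 ft is about 3.05 m
    lane_width_m = 3.7                 # typical US freeway lane
    typical_gnss_err_m = (3.0, 5.0)    # assumed open-sky horizontal error

    print(f"10 ft = {ten_ft_m:.2f} m vs. lane width {lane_width_m} m")
    print(f"10 ft of error = {ten_ft_m / m_per_deg_lon:.6f} deg of longitude")
    # With 3-5 m of routine error (worse under a bridge deck), raw GPS can
    # easily be off by a full lane, which is why lane-level localization
    # leans on map matching, odometry, and the cameras rather than GPS alone.

So yes, being off by 10 ft is well within routine GPS behavior, especially under the upper deck of a bridge.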
 
I have a theory: Autopilot could have phantom-braked while changing lanes (the turn signal is on) and the driver could have then hit the brakes harder. Some drivers' instinct when something is out of sorts is to hit the brakes. It's not the right reaction in all circumstances, but panicked drivers might not react appropriately. This could have been the driver's first encounter with phantom braking or other misbehavior, and they didn't know how to react (not every Tesla owner is on this forum and well advised on this, like owners were in 2012).

Now from my own experience: I regularly disconnect Autopilot or FSD the second I see construction, any uncontrolled left turn, fully stopped traffic on the highway (traffic going from moving fast to a complete stop), jerky behavior in stop-and-go traffic, or any non-perfect exit from the highway. It's not worth the risk in my opinion. There are drivers far more trusting than I am who let the system get itself into trouble.
The fact that you are having to do this in 2023...can't be what folks expected after hearing this last year. Can't be...

"Full Self-Driving. So, over time, we think Full Self-Driving will become the most important source of profitability for Tesla. It’s — actually, if you run the numbers on robotaxis, it’s kind of nutty — it’s nutty good from a financial standpoint. And I think we are completely confident at this point that it will be achieved. And my personal guess is that we’ll achieve Full Self-Driving this year, yes, with data safety level significantly greater than present."


Also: Note what it's called repeatedly in that statement. Repeatedly...
 
Looking at the rear-view camera footage, the car started changing lanes AFTER the brake lights were on, so it seems it was the driver who made the lane change, not Autopilot?

Other hypothesis: the driver wanted to change lanes but got a vibration on the steering wheel indicating that a car was approaching in the left lane? And the driver was surprised and stepped on the brakes?

[Attachments: SF - Rear View 01.jpg, SF - Rear View 02.jpg, SF - Rear View 03.jpg]
 
Car direction might have been changed enough by AP prior to disengagement that it still ended up just over the lines in the other lane when the car came to a halt. Would have to look more closely.
 
Looking at the rear-view camera footage, the car started changing lanes AFTER the brake lights were on, so it seems it was the driver who made the lane change, not Autopilot?

- Maybe the driver realized he had missed the exit on the left?

[Attachments: SF - Rear View 01.jpg, SF - Rear View 02.jpg]

Thinking back, I've had this happen during a lane change, where Autopilot slowed down (though not this dramatically) in a similar situation with an overpass. The shadows could make the car think there's something either in the lane or in front of it. This looks like a brand-new Model S, so I'm going to get controversial for a moment.

I've test-driven a new Model S (I hate the yoke, hate it; bring back the stalk for switching gears), and something I noted during my drive is that it's far less intuitive to engage and disengage Autopilot beyond using the brake pedal to turn it off. Capacitive buttons without a shape etched into the material are poor design. This less-than-ideal design is overcome with familiarity.

A driver new to the Model S, panicking while Autopilot is doing something unexpected, may jam on the brakes to stop the system, because they may not take the time to look at the yoke to figure out which button to press. During my test drive I did disengage Autopilot with a tap of the brakes; you and I know that's enough, but a new-to-Tesla driver might assume a full brake press is needed. The driver is still at fault, but given the system and its aberrations every so often... I wouldn't lay 100% of the blame on the driver. 95%, sure.
 
Looking at the rear-view camera footage, the car started changing lanes AFTER the brake lights were on, so it seems it was the driver who made the lane change, not Autopilot?

Other hypothesis: the driver wanted to change lanes but got a vibration on the steering wheel indicating that a car was approaching in the left lane? And the driver was surprised and stepped on the brakes?

[Attachments: SF - Rear View 01.jpg, SF - Rear View 02.jpg, SF - Rear View 03.jpg]
Query: where can one find that camera's video?
 
Ignoring specifics about the technology and whatnot, essentially what I am seeing is a car moving fast into a dramatically different driving environment. Specifically, it is going from bright, sunlit open lanes into what is basically a dark tunnel.

Good automation would be cautious. But that's counter to how people actually behave.

I know that personally, when I am driving and approach something like this at speed, I hold my breath and proceed, basically ignoring the risks (it always feels like a good glutes-strengthening exercise!).

So I think what I am seeing is a difference in behavior. And I think we see evidence of that in just how many cars continued to crash into the vehicles in the tunnel. (For the record, the Tesla didn't crash; the other cars behind it did. I do wonder how many of those cars had any safety/driver-assistance features.)
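Some rough arithmetic on why a sudden stop at a tunnel mouth cascades; the values are assumptions (roughly 55 mph traffic, 1.5 s perception-reaction time, hard braking at 7 m/s^2), not measurements from the footage:

    # Illustrative stopping-distance arithmetic with assumed values.
    speed_mps = 55 * 0.44704       # 55 mph is about 24.6 m/s
    reaction_s = 1.5               # perception-reaction time; longer while
                                   # eyes are still adapting to the dark
    decel_mps2 = 7.0               # hard braking on dry pavement

    reaction_dist = speed_mps * reaction_s             # ~37 m before braking
    braking_dist = speed_mps ** 2 / (2 * decel_mps2)   # ~43 m while braking
    print(f"total stopping distance ~ {reaction_dist + braking_dist:.0f} m")

That is roughly 80 m of clear road needed per car, and a driver whose eyes haven't adjusted yet has even less usable warning, which fits how many cars kept arriving faster than they could stop.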
 
It looks like he might not have responded to the Autopilot nags and the car was simply pulling itself over and stopping, as it's supposed to? That seems like the simplest explanation.

Regardless, there is something wrong with the driver. He could have just corrected and gotten on the gas to avoid the entire situation.

I have not read this thread or other articles (just watched the video from the rear angle), so this is admittedly just a hot take.
 
The fact that you are having to do this in 2023...can't be what folks expected after hearing this last year. Can't be...

"Full Self-Driving. So, over time, we think Full Self-Driving will become the most important source of profitability for Tesla. It’s — actually, if you run the numbers on robotaxis, it’s kind of nutty — it’s nutty good from a financial standpoint. And I think we are completely confident at this point that it will be achieved. And my personal guess is that we’ll achieve Full Self-Driving this year, yes, with data safety level significantly greater than present."


Also: Note what it's called repeatedly in that statement. Repeatedly...
I agree. Vision-only has been a huge step back, and there needs to be redundancy from a non-vision-based system for when vision fails. With the rumor that Tesla is bringing radar and USS back to its cars sometime this year, it's clearly the case that vision-only has its limitations.
 
Ignoring specifics about the technology and whatnot, essentially what I am seeing is a car moving fast into a dramatically different driving environment. Specifically, it is going from bright, sunlit open lanes into what is basically a dark tunnel.

Good automation would be cautious. But that's counter to how people actually behave.

I know that personally, when I am driving and approach something like this at speed, I hold my breath and proceed, basically ignoring the risks (it always feels like a good glutes-strengthening exercise!).

So I think what I am seeing is a difference in behavior. And I think we see evidence of that in just how many cars continued to crash into the vehicles in the tunnel. (For the record, the Tesla didn't crash; the other cars behind it did. I do wonder how many of those cars had any safety/driver-assistance features.)
The system should take into account that the other cars are behaving normally and continue to behave that way itself. It's the only vehicle panic-braking. Poor system design mixed with poor driver choices leads to accidents.
 
Car direction might have been changed enough by AP prior to disengagement that it still ended up just over the lines in the other lane when the car came to a halt. Would have to look more closely.

When extracting the front-view images, using the same timing and positions of the 3 cars to see when the Tesla had its brake lights on, you can see that the Tesla was still in its own lane when the brake lights came on.

- So did Autopilot cancel the lane change because a car was approaching in the left lane? If so, I don't see any reason why Autopilot would have also activated the brakes.

- It seems the driver is the one who may have pressed the brake. But why change lanes manually while still pressing the brake pedal at the same time?

[Attachments: SF - Front View 05.jpg, SF - Front View 06.jpg, SF - Front View 07.jpg]
 
It looks like he might not have responded to the Autopilot nags and the car was simply pulling itself over and stopping, as it's supposed to? That seems like the simplest explanation.
I thought Tesla's documentation says that in a case where the car has to do that (because the driver ignored all nags), it does so when/where it is safe?

I'm thinking a complete stop in an active traffic lane isn't what most would consider a safe place to stop?
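For what it's worth, the publicly described escalation is: visual nag, then audible alerts, then the car slowing to a stop with hazards on in its current lane. Here is a minimal sketch of that escalation; the states, tick logic, and reset behavior are illustrative guesses, not Tesla's actual implementation:

    from enum import Enum, auto

    class APState(Enum):
        ACTIVE = auto()         # AP engaged, driver considered attentive
        VISUAL_NAG = auto()     # "apply slight turning force" style banner
        AUDIBLE_ALERT = auto()  # escalating chimes
        SLOW_TO_STOP = auto()   # hazards on, decelerating in the current lane
        DISENGAGED = auto()

    def step(state: APState, torque_detected: bool, brake_pressed: bool) -> APState:
        """One illustrative timeout tick of the nag escalation."""
        # A brake press disengages immediately, consistent with the point
        # made earlier in the thread.
        if brake_pressed:
            return APState.DISENGAGED
        # Detected steering torque resets the escalation (guess: no reset
        # once the car has committed to stopping).
        if torque_detected and state is not APState.SLOW_TO_STOP:
            return APState.ACTIVE
        escalation = {
            APState.ACTIVE: APState.VISUAL_NAG,
            APState.VISUAL_NAG: APState.AUDIBLE_ALERT,
            APState.AUDIBLE_ALERT: APState.SLOW_TO_STOP,
            APState.SLOW_TO_STOP: APState.SLOW_TO_STOP,
        }
        return escalation.get(state, APState.DISENGAGED)

Note the end state: as far as I know, the car stops in the travel lane with hazards on rather than hunting for a shoulder, so "when/where it is safe" may be generous wording.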
 
It looks like he might not have responded to the Autopilot nags and the car was simply pulling itself over and stopping, as it's supposed to? That seems like the simplest explanation.

Regardless, there is something wrong with the driver. He could have just corrected and gotten on the gas to avoid the entire situation.

I have not read this thread or other articles (just watched the video from the rear angle), so this is admittedly just a hot take.
What's strange is that it's the Bay Area, and so thousands of Teslas have gone through this same path, and likely a bunch of them on AP. What's different here?