
SF Bay Accident Surveillance Footage

You are arguing that it’s not called FSD…in a thread that calls it …FSD.

😂🤣

(Moderator note: This thread never “calls” anything FSD. The thread was never titled FSD, and neither was the article linked in the first post. Because of this and the fact that your comment wasn’t that funny, I hereby penalize you one of your laugh emojis, so only one laugh and one cry remain. /s

For the record, FSD does not exist. FSD Beta does exist. And you can buy or subscribe to FSD capability, so there is an argument that the capability exists even if FSD doesn’t exist yet. Even more confusingly, buying FSD Capability does not always get you into the beta.)
Krash you are 100% correct. I stand corrected.

@TresLA is arguing that it’s not called FSD…in a SUBFORUM that has “FSD” IN THE TITLE.
😅
 
The speed limit on the SF Bay Bridge is 50 mph.


I wonder at what speed the vehicles on the videos were going?

If the Tesla was using FSD, and if the vehicle was going faster than 50 mph, the driver must have been pressing the accelerator, so can this situation be considered Full Self Driving?

If the driver was pressing the accelerator, could phantom braking have occurred?
I drive on that road all the time; people routinely go 60-70 mph there, although the Tesla didn't appear to be going very fast in the video. Also, as others pointed out, you can set the set speed much higher than the speed limit (try it yourself if you have AP), so that does not necessarily point to accelerator application.
 
Krash you are 100% correct. I stand corrected.

@TresLA is arguing that it’s not called FSD…in a SUBFORUM that has “FSD” IN THE TITLE.
😅
My goodness, how hard is it for you to understand (or just stop twisting others’ words)? The forum is called “AI, Autopilot, & Autonomous/FSD”. (Generalized) AI, Autonomous, and FSD are 3 of the 4 features/products listed that don’t exist yet. Why aren’t you complaining about this forum talking about AI and autonomous vehicles? (Hint: because it doesn’t fit your narrative.) Tesla doesn’t say they’ve solved generalized AI, nor do they purport to have any autonomous vehicles. These are (difficult) problems, future hopes, and products in development… just like FSD. Of the 4 features/products listed in this forum’s title, the only one that exists is Autopilot, yet another precursor on the march to FSD.

When you purchase the FSD Capability package (or subscribe monthly), you receive the current/existing features in the suite that are on the march towards FSD, but they aren’t FSD. Unlike subscribing monthly, purchasing FSD Capability buys that future product at a locked price, meaning you don’t have to pay any more as Tesla raises the price of the FSD Capability package (and eventually sells FSD one day when/if that’s solved). Just as Elon Musk is quoted saying in your previous link, we have purchased a future product that hasn’t been delivered yet and doesn’t exist yet. It’s your choice whether or not to believe the prediction that it’ll one day exist (during your ownership of your vehicle).

I’ve already said many times (in response to your antagonist comments and posts) that I’ve long questioned Elon Musk’s (Tesla’s) optimistic timelines. Back in 2018, when we purchased the FSD Capability package on the “cheap”, I both believed FSD was further down the road and was willing to pay the extra to join that journey and experience it first hand as features kept getting added and existing ones improved. Their timing may be off (laughable even), but Tesla is pushing for safer vehicles and regained commute time, in my opinion way ahead of anyone else (i.e., when also considering cost/accessibility to mass consumers, aka a better part of the world and not just an elite few). It would behoove you to stop being the class clown/bully, throwing whatever obstacles might stick just because you can.

It’s worth mentioning yet again: couch quarterbacking is easy, but predicting the future (and getting us there) is hard.

EDIT: I just realized a moderator is editing or adding context to some of our posts. I appreciate it and agree with the edits so I’ve gone ahead and replaced “trolling” with “antagonist” in my comment as well. 😂
 
You are arguing that it’s not called FSD…in a thread that calls it …FSD.

😂🤣

(Moderator note: This thread never “calls” anything FSD. The thread was never titled FSD, and neither was the article linked in the first post. Because of this and the fact that your comment wasn’t that funny, I hereby penalize you one of your laugh emojis, so only one laugh and one cry remain. /s

For the record, FSD does not exist. FSD Beta does exist. And you can buy or subscribe to FSD capability, so there is an argument that the capability exists even if FSD doesn’t exist yet. Even more confusingly, buying FSD Capability does not always get you into the beta.)
I saw this and thought, “finally! @2101Guy is getting it! (Or at least admitting it)”, but then I saw it was the moderator’s edit/added context. Le sigh.

And yes, it is unfortunately very confusing even if it’s not illegal.
 
I saw this and thought, “finally! @2101Guy is getting it! (Or at least admitting it)”, but then I saw it was the moderator’s edit/added context. Le sigh.

And yes, it is unfortunately very confusing even if it’s not illegal.
I personally just treat the name as a marketing name. FSD is the name of their ADAS feature. Just like Kleenex is not the generic name for facial tissue but has become the de facto name for facial tissues. Or Coke is what people ask for when they mean cola. I've also noticed that other AV companies don't call their products FSD, such as Waymo and Cruise. They're "autonomous driving" or "fully autonomous driving". But that's just me. :)
 
I personally just treat the name as a marketing name. FSD is the name of their ADAS feature. Just like Kleenex is not the generic name for facial tissue but has become the de facto name for facial tissues. Or Coke is what people ask for when they mean cola. I've also noticed that other AV companies don't call their products FSD, such as Waymo and Cruise. They're "autonomous driving" or "fully autonomous driving". But that's just me. :)
Understandable. It’s how much of the public views the name “FSD” as well, but it’s just not how Tesla uses the name. They’re (relatively consistently) using it to refer to the future product (i.e. - where FSD beta is hoping to get to eventually), while reserving “FSD Capability” package to refer to the suite of features beyond “basic Autopilot” including future planned feature releases on the way while developing that future FSD product. But to each their own. Hopefully, just like your Kleenex and Coke examples, people will use the “FSD” label however they want while also understanding that Tesla doesn’t use it the same way in some contexts.
 
I personally just treat the name as a marketing name. FSD is the name of their ADAS feature. Just like Kleenex is not the generic name for facial tissue but has become the de facto name for facial tissues. Or Coke is what people ask for when they mean cola. I've also noticed that other AV companies don't call their products FSD, such as Waymo and Cruise. They're "autonomous driving" or "fully autonomous driving". But that's just me. :)
Waymo actually used to use "self-driving", but they made a big deal about ditching it back in 2021.

Mercedes also did the same back in 2016, when even Tesla didn't, although they haven't taken it any further.

 
My goodness, how hard is it for you to understand (or just stop twisting others’ words)? The forum is called “AI, Autopilot, & Autonomous/FSD”. (Generalized) AI, Autonomous, and FSD are 3 of the 4 features/products listed that don’t exist yet. Why aren’t you complaining about this forum talking about AI and autonomous vehicles? (Hint: because it doesn’t fit your narrative.) Tesla doesn’t say they’ve solved generalized AI, nor do they purport to have any autonomous vehicles. These are (difficult) problems, future hopes, and products in development… just like FSD. Of the 4 features/products listed in this forum’s title, the only one that exists is Autopilot, yet another precursor on the march to FSD.

When you purchase the FSD Capability package (or subscribe monthly), you receive the current/existing features in the suite that are on the march towards FSD, but they aren’t FSD. Unlike subscribing monthly, purchasing FSD Capability buys that future product at a locked price, meaning you don’t have to pay any more as Tesla raises the price of the FSD Capability package (and eventually sells FSD one day when/if that’s solved). Just as Elon Musk is quoted saying in your previous link, we have purchased a future product that hasn’t been delivered yet and doesn’t exist yet. It’s your choice whether or not to believe the prediction that it’ll one day exist (during your ownership of your vehicle).

I’ve already said many times (in response to your antagonist comments and posts) that I’ve long questioned Elon Musk’s (Tesla’s) optimistic timelines. Back in 2018, when we purchased the FSD Capability package on the “cheap”, I both believed FSD was further down the road and was willing to pay the extra to join that journey and experience it first hand as features kept getting added and existing ones improved. Their timing may be off (laughable even), but Tesla is pushing for safer vehicles and regained commute time, in my opinion way ahead of anyone else (i.e., when also considering cost/accessibility to mass consumers, aka a better part of the world and not just an elite few). It would behoove you to stop being the class clown/bully, throwing whatever obstacles might stick just because you can.

It’s worth mentioning yet again: couch quarterbacking is easy, but predicting the future (and getting us there) is hard.

EDIT: I just realized a moderator is editing or adding context to some of our posts. I appreciate it and agree with the edits so I’ve gone ahead and replaced “trolling” with “antagonist” in my comment as well. 😂
That’s far too much explaining.

But Tesla calls it FSD and Full Self Driving on their website.

FSD Chip

“Build AI inference chips to run our Full Self-Driving software, considering every small architectural and micro-architectural improvement while squeezing maximum silicon performance-per-watt. Perform floor-planning, timing and power analyses on the design. Write robust tests and scoreboards to verify functionality and performance. Implement drivers to program and communicate with the chip, focusing on performance optimization and redundancy. Finally, validate the silicon chip and bring it to mass production in our vehicles.”


And the chip is used in the feature you can buy.

 
Usual anti-Tesla Verge stuff: (a) FSD wasn't engaged since the Bay Bridge is a freeway, so this was NoA/AP. (b) The whole "70% of crashes on ADAS systems" figure has been shown (and admitted by the NHTSA) to be because only Tesla has a mechanism for gathering that data; other makers simply rely on owner reports, which are spotty at best. (c) Not comparing the number of accidents to the rate for human drivers, just presenting an absolute number as if it were shocking. Unfortunate about the accident of course, but twisting the thing the way they did is gutter reporting imho.
 
It's pure semantics that even we cannot distinguish, IMO; automated lane changes like the one the car is seen making are only available through purchasing the FSD option (or, if you get super technical, the recently re-introduced EAP, unless the car is older than ~early 2019, in which case it could also have been ordered with EAP but without FSD). From the consumer perspective, it is the driver assist features offered by the FSD package that appear to have malfunctioned.

Also, outside the Autopilot team no one can distinguish whether the behavior the car demonstrated was caused by code or neural networks that are considered “FSD”, even by the definitions mentioned in some of these threads - for all we know, a bunch of these things are already shared by now.
 
It's pure semantics that even we cannot distinguish, IMO; automated lane changes like the one the car is seen making are only available through purchasing the FSD option (or, if you get super technical, the recently re-introduced EAP, unless the car is older than ~early 2019, in which case it could also have been ordered with EAP but without FSD). From the consumer perspective, it is the driver assist features offered by the FSD package that appear to have malfunctioned.

Also, outside the Autopilot team no one can distinguish whether the behavior the car demonstrated was caused by code or neural networks that are considered “FSD”, even by the definitions mentioned in some of these threads - for all we know, a bunch of these things are already shared by now.
The biggest benefit of single stack is that we'll never have to see "actually that is NoA (FSD capability!) not FSD" posts again.
 
I don't think the general public would get much out of discussing this in terms of modules; heck, most people who know the technology still talk about "FSD Beta" rather than Autosteer on City Streets.

In general, IMO it's not inaccurate to say that this was FSD, because FSD takes the AP + EAP modules and then adds Traffic and Stop Sign control plus Autosteer on City Streets in the Beta program. Being precise would be saying this wasn't City Streets, but FSD is the whole kit and caboodle, including its ability to switch between stacks depending on context.


Don't know if I've ever even heard/seen Elon or anyone at Tesla utter/type the words Autosteer on City Streets
 
The author, @bradtem, is active on TMC btw.
I am here, especially when you Voldemort-invoke me by using the userid in a way that tags the thread. There have been many AP crashes of course, but this one was interesting because it was one of the few where I felt an ordinary driver, even one used to AP, might run into some trouble: the combination of an automatic lane change from Nav-on-AP and a phantom brake at the same time. Unlike the typical mistakes AP makes, where you do things like grab the wheel and hit the brakes, with phantom braking you intervene by pressing the accelerator -- which does not disengage AP or TACC. It's not a natural reaction and not something you ever want to do without thinking. Which makes it harder.

Clearly this guy should have said, "Hey, that's strange, I am lane changing and braking" and intervened. But I can also see it surprising a typical driver into inaction. Of course we don't know if he's just lying and wasn't paying attention -- but lying on a police statement can get a lawyer disbarred, so he's a bit less likely to say other than what he at least thinks is the truth. I understand why he thinks he was in FSD; that is presumably what he activated before getting on the highway, and not everybody knows it switches to AP on the highway.

Mostly, Tesla drivers do a decent job of supervising when they are putting their attention on it. Of course some don't put their attention on it, and some even put in defeat devices. But this is a case where Tesla could improve the software to make a mistake like this less likely.

It's also quite odd that the system decided to lane change here. He wasn't gummed up by traffic ahead of him, though I suspect he had AP set to travel well above the limit, making it favor the left lane. Phantom braking in a tunnel doesn't surprise me as much. While the 2021 Model S has radar, I believe if you have FSD you are on full Tesla Vision, ignoring your radar. Is that true even in AP?
 
Unlike the typical mistakes AP makes, where you do things like grab the wheel and hit the brakes, with phantom braking you intervene by pressing the accelerator -- which does not disengage AP or TACC. It's not a natural reaction and not something you ever want to do without thinking. Which makes it harder.

As has been mentioned, the best hypothesis I have heard is that the driver hit the brake rather than the accelerator, to help with merging into traffic during the AP-initiated (or user-initiated) lane change. That would explain the incomplete lane change, but one would have to study the precise signal-canceling behavior of AP in the presence of brake disengagement concurrent with a lane change to see if it all fits.

We’ll never know without data from the car of course.

We don’t know how NOA was configured, etc. So no way to know why the car changed lanes.
 
As has been mentioned, the best hypothesis I have heard is that the driver hit the brake rather than the accelerator, to help with merging into traffic during the AP-initiated (or user-initiated) lane change. That would explain the incomplete lane change, but one would have to study the precise signal-canceling behavior of AP in the presence of brake disengagement concurrent with a lane change to see if it all fits.

We’ll never know without data from the car of course.

We don’t know how NOA was configured, etc. So no way to know why the car changed lanes.
Tesla used to be fairly open about this. Particularly if it puts more blame on the driver. So if this driver hit the brakes, disengaging AP, Tesla might be happy to point that out. But they have not. However, this is still not such a simple issue. If Tesla's UI is such that people do get confused in that situation and sometimes disengage without being aware, that may be something they can improve. Of course it makes beeps.

The lawyer driving this car made a statement to police that FSD was on and malfunctioned. Of course, if he was unaware he had hit the brake this could be an honest statement. If this was a user initiated lane change, he could be in trouble.

The big question that we all need to answer, though, is: "Is it safe to have tools like Autopilot with driver monitoring?" Tesla publishes deliberately misleading numbers to say that it's very safe, but it bothers me that they do this when they know the true numbers -- why do they not just tell us if those numbers are good? Why play games to make it look better than it is, if what it is is still good? They don't answer, of course.

The question of “is it safe” involves how well it does with good, diligent drivers; how it does with negligent drivers and whether that can be reduced with driver monitoring; and also whether there are situations where even a good driver will screw up in a way that wouldn't happen in manual driving. This story might be -- might, not definitely -- an example of that.
 
Tesla publishes deliberately misleading numbers to say that it's very safe, but it bothers me that they do this when they know the true numbers -- why do they not just tell us if those numbers are good?
While I agree that Tesla publishes misleading numbers, calculating the true numbers seems very difficult. Even if they were to correct for where AP is used, there are so many other variables. You also need to correct for when people use AP. Do people use it more or less at night? Do people use it more or less in bad weather? What if there is a difference between drivers who use AP often and those that don't?

I'd like to see the TACC only numbers. I bet those are the best.
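To make the confounding concrete, here's a rough sketch (in Python, with entirely made-up numbers -- not Tesla's or NHTSA's actual data) of why an aggregate crashes-per-mile figure can flatter AP even when the per-road-type rates are no better than manual driving:

```python
# Hypothetical illustration (made-up numbers): Autopilot miles are mostly
# highway miles, and highways are far safer per mile than city streets,
# so AP can look much safer overall even when it is no safer than manual
# driving on the same road type.

crashes = {
    # (mode, road_type): (crash_count, miles_driven)
    ("autopilot", "highway"): (15,  50_000_000),
    ("autopilot", "city"):    (3,    2_000_000),
    ("manual",    "highway"): (30, 100_000_000),
    ("manual",    "city"):    (150, 100_000_000),
}

def per_million_miles(crash_count: int, miles: int) -> float:
    """Crash rate per million miles driven."""
    return crash_count / miles * 1_000_000

# Aggregate rates -- the kind of headline number usually quoted.
for mode in ("autopilot", "manual"):
    total_crashes = sum(c for (m, _), (c, _) in crashes.items() if m == mode)
    total_miles = sum(mi for (m, _), (_, mi) in crashes.items() if m == mode)
    print(f"{mode:9s} overall : {per_million_miles(total_crashes, total_miles):.2f} / M miles")

# Stratified rates -- the comparison that actually answers "is it safer?"
for (mode, road), (c, mi) in crashes.items():
    print(f"{mode:9s} {road:7s} : {per_million_miles(c, mi):.2f} / M miles")
```

In this made-up example AP comes out roughly 2.6x safer overall even though the highway and city rates are identical to manual driving, which is exactly why corrections for road type, time of day, weather, and driver population matter before drawing any conclusion.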
 