The next big milestone for FSD is version 11. It is a significant upgrade, with fundamental changes to several parts of the FSD stack, including a totally new way to train the perception NN.

From AI Day and the Lex Fridman interview, we have a good sense of what might be included.

- Object permanence both temporal and spatial
- Moving from “bag of points” to objects in NN
- Creating a 3D vector representation of the environment all in NN
- Planner optimization using NN / Monte Carlo Tree Search (MCTS) (a generic MCTS sketch follows this list)
- Change from processed images to “photon count” / raw image
- Change from single image perception to surround video
- Merging of city, highway and parking lot stacks a.k.a. Single Stack
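
For anyone curious what the MCTS bullet refers to in general terms, here is a minimal, generic sketch of Monte Carlo Tree Search over a toy driving state, just to show the selection / expansion / rollout / backpropagation loop. The state, actions, transition model, and reward below are made up for illustration; this is not Tesla's planner.

```python
# Minimal, generic MCTS sketch over a toy driving state -- NOT Tesla's planner.
import math
import random

ACTIONS = ["keep_lane", "brake", "accelerate"]  # toy discrete action set

class Node:
    def __init__(self, state, parent=None, action=None):
        self.state = state          # toy state: (speed_mph, gap_ft)
        self.parent = parent
        self.action = action        # action that led to this node
        self.children = []
        self.visits = 0
        self.value = 0.0

def step(state, action):
    """Toy transition model: adjust speed, shrink/grow gap to a lead car."""
    speed, gap = state
    if action == "brake":
        speed = max(0, speed - 5)
    elif action == "accelerate":
        speed = min(45, speed + 5)
    gap = max(0, gap + (30 - speed))  # lead car assumed to move at 30 mph
    return (speed, gap)

def reward(state):
    """Toy reward: keep speed up, but closing the gap to zero is a crash."""
    speed, gap = state
    return -100.0 if gap == 0 else speed / 45.0

def ucb1(child, parent_visits, c=1.4):
    if child.visits == 0:
        return float("inf")
    return child.value / child.visits + c * math.sqrt(math.log(parent_visits) / child.visits)

def mcts(root_state, iterations=500, horizon=10):
    root = Node(root_state)
    for _ in range(iterations):
        node = root
        # 1. Selection: descend by UCB1 while the node is fully expanded
        while node.children and len(node.children) == len(ACTIONS):
            node = max(node.children, key=lambda ch: ucb1(ch, node.visits))
        # 2. Expansion: add one untried action as a new child
        untried = [a for a in ACTIONS if a not in {ch.action for ch in node.children}]
        if untried:
            a = random.choice(untried)
            child = Node(step(node.state, a), parent=node, action=a)
            node.children.append(child)
            node = child
        # 3. Rollout: random playout to a fixed horizon
        state = node.state
        for _ in range(horizon):
            state = step(state, random.choice(ACTIONS))
        r = reward(state)
        # 4. Backpropagation: update statistics up to the root
        while node is not None:
            node.visits += 1
            node.value += r
            node = node.parent
    # Pick the most-visited action at the root
    return max(root.children, key=lambda ch: ch.visits).action

print(mcts((35, 80)))
```

In a real planner the rollouts and node scoring would be guided by learned networks rather than random playouts, which is presumably what "planner optimization using NN / MCTS" refers to.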

Lex Fridman's interview of Elon, starting with the FSD-related topics.


Here is a detailed explanation of Beta 11 in "layman's language" by James Douma; the interview was done after the Lex podcast.


Here is the AI Day explanation, in 4 parts.




Here is a useful blog post asking Tesla a few questions about AI Day. The useful part is the comparison of Tesla's methods with Waymo's and others' (detailed papers linked).

 
The thing is, all of those apply or have applied to Teslas in some form, but the majority of the complaints are with FSDb, and as some people here seem to forget, the 'b' stands for beta, meaning it's software that is known to be incomplete and buggy and is not expected to be perfect. The acceleration is just fine if I do it myself (and honestly is just fine with the latest version of FSDb), and I've never had it change lanes for no reason, nor has it run over any children when I'm driving, so I don't know what other people's problem is!

Some of them are simply things that Tesla does not do well - the auto wipers are generally agreed to be bad in Teslas, and phantom braking has clearly been an issue for many people so those are legitimate complaints IMO.

Edit: I've also had plenty of cars that had an A/C that stank - that's an A/C maintenance issue common to many cars.
"Beta" classification is misused and abused by Telsa - Defects rates fit more a pre-alpha classification. Also, many Telsa non FSD beta features have been in the "Beta" state" for nearly a decade. Fact.
 
Here's another case of an unsafe, inconsiderate FSDb action. This is on the new v11.4.3 rollout.

Watching the screen for how close the car came to being hit was interesting. In this case traffic behind was only shown once the car was slowing dramatically in the straight through lane.

While I was driving (on TACC, not FSDb), I braked for an amber light, and my passenger (who was recently rear-ended in the same situation) asked whether the car would refuse to let me brake if braking meant I would be hit by the driver behind me.

I don't know the answer (I assume it would let me be hit rather than run a now-red light, even though currently FSDb happily runs reds for no reason), but then we discussed how the ego representation on the screen doesn't show me traffic behind me, which is actually really important. I happened to 'know', from scanning my mirrors and the speed of the road, that no one was on my bumper in the seconds before I used the brakes to stop, but I did not specifically look before hitting the brakes. It was one of those tough-call lights, but it was also a red-light camera light, so making the wrong call would have had a literal cost. I was going the speed limit (it was also a road with speed cameras) and judged I would have had to speed up to get through on the amber, so I chose to stop instead.
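
For what it's worth, that stop-or-go judgment is basically a stopping-distance vs clearing-distance comparison. A back-of-envelope sketch, where every number is an assumption for illustration and not the actual intersection in question:

```python
# Back-of-envelope stop-or-go check for an amber light. Every number here is
# an assumption for illustration (not the actual intersection in the post):
# 35 mph approach, a 4-second amber, ~1 s reaction time, ~0.35 g braking.
MPH_TO_MPS = 0.44704

speed = 35 * MPH_TO_MPS              # approach speed in m/s
amber_s = 4.0                        # assumed amber duration
reaction_s = 1.0                     # assumed perception-reaction time
decel = 0.35 * 9.81                  # assumed comfortable braking, m/s^2

# Distance needed to stop: reaction distance plus braking distance v^2 / (2a)
stop_dist = speed * reaction_s + speed**2 / (2 * decel)

# Distance coverable at constant speed before the light turns red
clear_dist = speed * amber_s

print(f"need ~{stop_dist:.0f} m to stop, can cover ~{clear_dist:.0f} m during the amber")
# If the stop line is closer than stop_dist but farther than clear_dist when
# the light changes, you can neither stop comfortably nor clear the
# intersection without speeding up -- the "tough call" described above.
```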
 
"Beta" classification is misused and abused by Telsa - Defects rates fit more a pre-alpha classification. Also, many Telsa non FSD beta features have been in the "Beta" state" for nearly a decade. Fact.
On one hand, you state that beta is more of an alpha or pre-alpha software because of all the defects, and then in the next sentence you complain that Tesla uses "beta" for software that's stable and usable. So which is it? Have you ever been in one of the beta programs for iOS? System crashes and data loss are not at all unheard of. Every single person who signs up for FSD beta knows that it's beta, has been told that it's beta, and even clicks Accept after reading a statement from Tesla saying it can do the worst thing at the worst possible time. If you are using the software, you know this too, yet you continually complain about the fact that the software is not perfect and call it junk because it's not perfect. (moderator edit)
 
I agree it was a close call, but when I look at the speedo, the slowdown was gradual. It didn't slam on the brakes. Its only "mistake" is that it initially stopped too far behind the lead car, but there very well may have been another car there in another instance.
I played the video second by second.

25:41 - 35mph (speed limit)
25:42 - 34
25:43 - 33
25:44 - 31
25:45 - 28
25:46 - 25, then 21 and turn signal comes on at entrance to turn lane
25:47 - 16 Not fully in turn lane
25:48 - 11 Not fully in turn lane
25:49 - 9 Near collision, then 7 and finally fully in turn lane
25:50 - 4 at end of turn lane
25:51 - 0 at end of turn lane
starts creeping
NINE seconds later....
26:00 - 0 behind lead car in turn lane.

So the car decelerated 10 mph before its turn signal came on, and then came close to a full stop before turning into the turn lane. Turn signal to near collision was 3 sec (so the car that nearly hit it was NOT tailgating), and anyone following would never expect the car to come to an almost complete stop, 26 mph below the speed limit, when there was room for the car to leave the thru-lane.
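
To make the numbers concrete, here is a trivial bit of arithmetic over the second-by-second speeds listed above. The speeds are the ones read off the video; everything else is just subtraction, and it shows how you get either a 10 mph or a 7 mph drop before the signal depending on whether you count to the 25:46 or the 25:45 reading.

```python
# Trivial arithmetic over the second-by-second speeds read off the video above.
speeds = [35, 34, 33, 31, 28, 25, 16, 11, 9, 4, 0]   # 25:41 through 25:51, in mph
signal_on = 5            # turn signal comes on during the 25:46 sample

# Per-second slowdowns (mph per second) between consecutive readings
decel = [speeds[i] - speeds[i + 1] for i in range(len(speeds) - 1)]
print("per-second slowdowns (mph/s):", decel)

# Speed shed before the turn signal: 10 mph measured to the 25:46 reading (25 mph),
# or 7 mph if you only count to the 25:45 reading (28 mph).
print("drop to 25:46 sample:", speeds[0] - speeds[signal_on], "mph")
print("drop to 25:45 sample:", speeds[0] - speeds[signal_on - 1], "mph")

# Peak single-second slowdown, also expressed roughly in g (1 mph/s is about 0.045 g)
peak = max(decel)
print("peak slowdown:", peak, "mph/s, roughly", round(peak * 0.045, 2), "g")
```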

Cars further behind would have also had to slow as dramatically for the erratic behaviour of the Tesla, as the brake-checking chain reaction played out.

The near collision was 100% caused by the unexpected behaviour of the Tesla. The Tesla created the situation; it was only the actions of the drivers behind, swerving around an obstacle that was unnecessarily blocking part of their lane, that robbed a body shop of some work.

To compound all this misbehaviour, it then took another 11 seconds before anyone else could enter the turn lane without blocking the straight-thru lane (which had a green light, so anyone behind the Tesla wanting to turn left would have been blocking thru-traffic).

Sure, had the turn lane had other cars in it, the Tesla would have been sitting there anyway, but then all traffic would be aware of why no more cars could fit in the turn lane. No one could reasonably expect or comprehend why the Tesla basically parked at the end of the turn lane.
 
Watching the screen for how close the car came to being hit was interesting. In this case traffic behind was only shown once the car was slowing dramatically in the straight through lane.

While I was driving (on TACC, not FSDb), I braked for an amber light, and my passenger (who was recently rear-ended in the same situation) asked whether the car would refuse to let me brake if braking meant I would be hit by the driver behind me.

I don't know the answer (I assume it would let me be hit rather than run a now-red light, even though currently FSDb happily runs reds for no reason), but then we discussed how the ego representation on the screen doesn't show me traffic behind me, which is actually really important. I happened to 'know', from scanning my mirrors and the speed of the road, that no one was on my bumper in the seconds before I used the brakes to stop, but I did not specifically look before hitting the brakes. It was one of those tough-call lights, but it was also a red-light camera light, so making the wrong call would have had a literal cost. I was going the speed limit (it was also a road with speed cameras) and judged I would have had to speed up to get through on the amber, so I chose to stop instead.
In general I’ve found the braking behavior for yellow lights in V11.3.6+ to be very appropriate. I’ve noticed the representation of surrounding traffic on the screen is not terribly accurate and part of driving is monitoring your mirrors to know what the surrounding traffic is doing. (Like you were doing.) If the yellow light was a close call for you then any car following you should definitely have been expecting/starting to brake anyway and your braking should not have been a surprise.
 
On one hand, you state that beta is more of an alpha or pre-alpha software because of all the defects, and then in the next sentence you complain that Tesla uses "beta" for software that's stable and usable. So which is it? Have you ever been in one of the beta programs for iOS? System crashes and data loss are not at all unheard of. Every single person who signs up for FSD beta knows that it's beta, has been told that it's beta, and even clicks Accept after reading a statement from Tesla saying it can do the worst thing at the worst possible time. If you are using the software, you know this too, yet you continually complain about the fact that the software is not perfect and call it junk because it's not perfect. (moderator edit)
Or... and I hate to say this, but I'm a refugee from the old Tesla.com forums. A bunch of us over there had come to the conclusion that there were paid trolls hanging out: one user name, for example, with multiple writing styles, that would post 24/7 or close to it. Some of them, it was discovered, posted on other EV forums attempting the same malarkey, wording, and images. The activities of these people, some of whom probably were just garden-variety trolls, eventually led to Tesla shutting down those forums.

These... entities... probably didn't go away just because Tesla shut down their main play space. It is noted that Tesla has famously (until very recently) never done advertising. This means that potential buyers search the web to get information about the car. Google and other search engines index forums like this one. And what would be better for a corporate entity tasked with slowing down BEV adoption than to have the prime forums discussing the car filled with posters screaming, "Junk! It's Dangerous! Don't Use It If You Value Your Life!"?

OK, I get it. Some people do have excessive problems with FSD-b. Heck, I've posted here with info on near-accidents, red-light running, and all that. But... it is a beta. And, frankly, my old (relatively speaking) 2018 M3 should probably be performing worse than everybody else's cars, yet the experience of some of the more vociferous posters around here, whose cars appear to be death traps, doesn't match mine. That is definitely not the case with my daily driver on Beta, which has been all over the eastern half of the U.S. Bugs, yes. Death and destruction? Nope.

We, unlike the Tesla.com forums, do have moderators with teeth, so the level of egregious conduct that was de rigueur over at the Tesla.com hangout wouldn't be tolerated here. Which implies that, if we do have paid-for trolls hanging about, they'd definitely have to tone it down quite a bit. But it wouldn't necessarily stop them.

So, no proof, either way. But sometimes a little paranoia is in order, because sometimes they are out to get you.
 
So the car decelerated 10 mph before its turn signal came on

7 mph (35 -> 28)

I disagree about it being FSDb's fault. Some car slowing down in front of you isn't grounds for rear-ending them.

I agree that it wasn't done perfectly, but in this case, FSDb was basically driving like someone who isn't from that neighborhood (trying to figure out where to go). This isn't any worse than a typical human driving somewhere they aren't used to (imo).

(Right now) It's unreasonable to expect FSDb to drive everywhere like it's familiar with the area.

Many humans drive with hypocrisy. They expect all others to drive like them, but when they themselves drive in an unfamiliar area, they make many of the same "mistakes."
 

That someone posted a BS FUD story? Not surprising at all.

Your source said:
Then a Tesla Model Y approached on North Carolina Highway 561.
The car — allegedly in Autopilot mode — never slowed down.

NC 561 is a one-lane-each-way, undivided, non-controlled-access road.

Exactly the type of road the manual specifically states Autopilot is not intended for.

Hilariously, later apparently without realizing it, the story even admits that- "Autopilot, largely a highway system, operates in a less complex environment than the range of situations experienced by a typical road user."

So...user error.

In fact, so far every time NHTSA investigated an autopilot crash they concluded...user error- no fault in the system.

Oh, in fact, it's even worse than that! Later in the story they ALSO admit the driver had "fixed weights to the steering wheel to trick Autopilot into registering the presence of a driver’s hands"

So 1000% driver error. Not only was he using the system somewhere it's not intended for use, he actively defeated one of its safety systems... yet they want to blame the tech. Stupid.
 
You can imagine, as a pedantic professor, I cringe when members make these mistakes. That said, it could be worse; people could be using "breaked" as the past tense instead of the correct "brake" and "braked". LOL.

ok I used to think people casually misused "break" for "brake" because they are pronounced exactly the same. But now it appears some people actually think "break/broke" is the correct word.

It's "brake/braked".
 
"Beta" classification is misused and abused by Telsa - Defects rates fit more a pre-alpha classification. Also, many Telsa non FSD beta features have been in the "Beta" state" for nearly a decade. Fact.
Our Toyota Sienna has a "lane assist" feature with as many or more bugs - and yet it is a "production" version. So does our doorbell, which will recognize a waving shrub as a person and send us a "human alert". Whether it's called alpha or beta or production does not really matter. Don't get me started on Alexa or Siri.

I think it is just a matter of some people not being able to come to terms with state-of-the-art NN and associated issues.
 
That someone posted a BS FUD story? Not surprising at all.



NC 561 is a one-lane-each-way, undivided, non-controlled-access road.

Exactly the type of road the manual specifically states Autopilot is not intended for.

Hilariously, later apparently without realizing it, the story even admits that- "Autopilot, largely a highway system, operates in a less complex environment than the range of situations experienced by a typical road user."

So...user error.

In fact, so far every time NHTSA investigated an autopilot crash they concluded...user error- no fault in the system.

Oh, in fact, it's even worse than that! Later in the story they ALSO admit the driver had "fixed weights to the steering wheel to trick Autopilot into registering the presence of a driver’s hands"

So 1000% driver error. Not only was he using the system somewhere it's not intended for use, he actively defeated one of its safety systems... yet they want to blame the tech. Stupid.
As much as FSD is a giant pile of stink right now, I don’t disagree with your post.

For the record, and on another note, my 2016 VW Golf R had RADAR and adaptive cruise and it never, ever, ever had a single instance of phantom braking. How come Teslas, even with RADAR before it was disabled, struggle here?
 
Did I miss anything?
Running stop signs.

- Not all, mind you, but for me, one very consistently.

11.4.2 and 11.3.6 demonstrated a 100% intent to run it (earlier versions had the occasional random success). On 11.4.2, it even moved to pass a car that was properly stopped, blinker on, waiting to turn left. All failures have been reported.

Intent to drive past left turning car:
2023-06-05_go-around-leftturning-car-CROP.jpg


Example intent to run stop sign. I drop approach speed to 15mph to make braking easier when it fails to stop. The stop sign is visualized in both images:
2023-06-06_run-intent_crop.jpg


p.s. fave 11.3.6 improvement - quick slowdown using wheel! This is huge!

Most annoying regression:
* IMPROVEMENT: 11.3.6 finally tracked right on narrower unmarked road, to the point of even adding imaginary centerlines to some sections. [downside was car getting hit by more overhanging branches]
* REGRESSION: 11.4.2 back to the rough center of the unmarked road, accompanied by wandering.