The next big milestone for FSD is v11. It is a significant upgrade, with fundamental changes to several parts of the FSD stack, including a totally new way to train the perception NN.

From AI Day and the Lex Fridman interview, we have a good sense of what might be included:

- Object permanence both temporal and spatial
- Moving from “bag of points” to objects in NN
- Creating a 3D vector representation of the environment all in NN
- Planner optimization using NN / Monte Carlo Tree Search (MCTS) (toy sketch after this list)
- Change from processed images to “photon count” / raw image
- Change from single image perception to surround video
- Merging of city, highway and parking lot stacks a.k.a. Single Stack
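For anyone wondering what "planner optimization using MCTS" might look like in practice, here is a toy sketch in Python. Everything in it (the grid of lane offsets, the obstacle cells, the reward function) is invented for illustration; this is the generic UCT algorithm from the literature, not anything from Tesla's actual stack. It searches a short horizon of discrete lane nudges and tends to pick a first action that steers around the marked obstacle cells.

import math
import random

ACTIONS = [-1, 0, 1]           # nudge left, keep lane, nudge right (lane offsets)
OBSTACLE = {(2, 0), (3, 0)}    # (step, lane_offset) cells to avoid; made up
HORIZON = 4                    # planning depth in steps

def rollout(step, lane):
    """Random playout to the horizon; penalize obstacle cells and drifting."""
    reward = 0.0
    while step < HORIZON:
        lane += random.choice(ACTIONS)
        step += 1
        if (step, lane) in OBSTACLE:
            reward -= 1.0
        reward -= 0.1 * abs(lane)          # mild penalty for leaving center
    return reward

class Node:
    def __init__(self, step, lane):
        self.step, self.lane = step, lane
        self.children = {}                 # action -> Node
        self.visits, self.value = 0, 0.0

def uct_select(node):
    """Pick the child maximizing the UCB1 score (exploit + explore)."""
    return max(
        node.children.items(),
        key=lambda kv: kv[1].value / kv[1].visits
        + math.sqrt(2 * math.log(node.visits) / kv[1].visits),
    )

def search(root, iterations=2000):
    for _ in range(iterations):
        node, path = root, [root]
        # Selection: descend while fully expanded and not at the horizon.
        while node.step < HORIZON and len(node.children) == len(ACTIONS):
            _, node = uct_select(node)
            path.append(node)
        # Expansion: add one untried child if not at the horizon.
        if node.step < HORIZON:
            a = random.choice([a for a in ACTIONS if a not in node.children])
            child = Node(node.step + 1, node.lane + a)
            node.children[a] = child
            node = child
            path.append(node)
        # Simulation + backpropagation.
        reward = rollout(node.step, node.lane)
        if (node.step, node.lane) in OBSTACLE:
            reward -= 1.0
        for n in path:
            n.visits += 1
            n.value += reward

root = Node(0, 0)
search(root)
best_action = max(root.children.items(), key=lambda kv: kv[1].visits)[0]
print("first action chosen:", best_action)   # should tend to steer off lane 0

A real planner would search over continuous trajectories scored by comfort, safety, and progress, but the select/expand/simulate/backpropagate loop is the same idea.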

Lex Fridman's interview of Elon, starting with the FSD-related topics.


Here is a detailed explanation of Beta 11 in layman's language by James Douma, from an interview done after the Lex podcast.


Here is the AI Day explanation, in 4 parts.




Here is a useful blog post asking Tesla a few questions about AI Day. The useful part is the comparison of Tesla's methods with Waymo's and others' (detailed papers are linked).

 
No… the rectangular clipped region that is displayed is almost certainly not the full image used by FSD.
The impact point was on the front of the car, certainly out of view of the B-pillar camera, and certainly out of view of the wide-angle camera (limited by the enclosure, so clipping is not the issue), so the actual obstacle was certainly not visible.

Of course, anyone looking at the video would have no problem extrapolating from what was visible (wide: car; right B-pillar: car!) to a very solid rear bumper that the car was about to hit. But the actual impact point was not visible.
 
The arguments about front blind spots will continue until Tesla adds cameras to every model's front bumper (or something equivalent).


TBF, this specific type of issue could ALSO be fixed if Tesla ever actually added an understanding of object permanence (and had accurate vision-only distance measurements)... things they've repeatedly claimed they're going to do, but this accident shows they still haven't.

It won't fix the WHOLE thing as well as a couple of low-mounted cameras would, though. If you park the car, it can't know whether new things have appeared in its blind spots by the time you leave that weren't there originally. There are probably also a few very-tight-turn situations where objects would never appear on the cameras to be clocked and remembered while the car is moving. And lastly, if something IN a blind spot starts moving, it can't pick that up.
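To make the object-permanence point concrete, here is a minimal sketch of what a persistent obstacle memory could look like, assuming detections get fused into a world frame and the car's pose comes from odometry. The class and method names are hypothetical; nothing here is from Tesla's code.

import math

class StaticObstacleMemory:
    def __init__(self):
        self.obstacles = []        # world-frame (x, y) of static detections

    def add_detection(self, ego_pose, obj_xy_ego):
        """Convert an ego-frame detection to world frame and remember it."""
        ex, ey, eth = ego_pose     # ego x, y, heading in the world frame
        ox, oy = obj_xy_ego
        wx = ex + ox * math.cos(eth) - oy * math.sin(eth)
        wy = ey + ox * math.sin(eth) + oy * math.cos(eth)
        self.obstacles.append((wx, wy))

    def obstacles_near(self, ego_pose, radius=1.5):
        """Remembered obstacles within `radius` meters of the car, even if
        no camera can see them right now (e.g. low, just ahead of the bumper)."""
        ex, ey, _ = ego_pose
        return [(x, y) for (x, y) in self.obstacles
                if math.hypot(x - ex, y - ey) < radius]

memory = StaticObstacleMemory()
# A bollard seen 5 m ahead while approaching...
memory.add_detection(ego_pose=(0.0, 0.0, 0.0), obj_xy_ego=(5.0, 0.0))
# ...after creeping 4 m forward it sits in the front blind spot, but is still known:
print(memory.obstacles_near(ego_pose=(4.0, 0.0, 0.0)))   # -> [(5.0, 0.0)]

Note that this is exactly why it can't cover the parked-car case above: the memory only holds what was detected while driving in, and anything that arrived afterward was never seen.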
 
Maybe this is why v11 stops so far back from cars in front. Only a minor issue/non-issue, but it seems to have been adjusted in recent releases.

Speaking of blind spots, I wish v11 did not insist on changing lanes into people’s blind spots, or even into their non-blind spots. Why not just change lanes into a space into which another car cannot move? Isn’t this basic defensive driving? Really bugs me on the freeways, and everywhere to be honest. On the freeways especially though, because I fear a PIT maneuver.

Maybe v12 will learn this behavior! Would be a good case of emergent capability. Overcomes an impossible programming problem.:rolleyes:
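For what it's worth, the check being asked for doesn't seem conceptually hard. A hedged sketch of a defensive gap test, with made-up blind-zone geometry and thresholds (this is not how FSD's planner actually works):

def lane_change_is_defensive(ego_speed, neighbors,
                             blind_zone=(-6.0, 1.0), ttc_min=3.0):
    """neighbors: list of (s, speed), where s is the neighbor's position
    along the road relative to ego in the target lane (negative = behind).
    Reject the change if ego would land beside or just behind a neighbor
    (their assumed blind zone), or if a faster car behind would close the
    gap in under ttc_min seconds."""
    for s, speed in neighbors:
        rel = -s                          # ego's position relative to that car
        if blind_zone[0] < rel < blind_zone[1]:
            return False                  # ego would sit in their blind spot
        if s < 0:                         # car behind ego in the target lane
            closing_speed = speed - ego_speed
            if closing_speed > 0 and -s / closing_speed < ttc_min:
                return False              # they'd reach us in < ttc_min seconds
    return True

# Car 4 m ahead of ego in the target lane: we'd merge into its blind zone.
print(lane_change_is_defensive(30.0, [(4.0, 30.0)]))    # False
# Car 40 m behind and slightly slower: a space it can't quickly move into.
print(lane_change_is_defensive(30.0, [(-40.0, 28.0)]))  # True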
 
To be clear: more sensors MUST come in combination with improvements in software. Waymo has dozens of sensors all over the car and still crashes into things because of the planner. Tesla can add cameras everywhere and still run into things.


Has Waymo hit anything recently it could actually sense?

AFAIK most of their accidents were people hitting THEM, not the other way around...

The only recent accident in the other direction I'm aware of is this one:

Where the biker was hidden behind a truck and, as soon as he ceased being obscured, the Waymo braked heavily, but too late to stop. Had nothing to do with the planner and everything to do with no way to sense the obstacle until it was physically too late to avoid.

You can't ever prevent accidents where braking time is greater than the time from when you first sense the object.
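Back-of-envelope, for anyone who wants numbers. The latency and deceleration below are generic assumptions, not measurements of any particular vehicle:

def min_sensing_distance(speed_ms, reaction_s=0.5, decel_ms2=8.0):
    """Distance at which an obstacle must first be sensed to stop short of it."""
    reaction_dist = speed_ms * reaction_s            # covered before braking starts
    braking_dist = speed_ms ** 2 / (2 * decel_ms2)   # v^2 / (2a)
    return reaction_dist + braking_dist

for mph in (25, 45, 65):
    v = mph * 0.44704                                # mph -> m/s
    print(f"{mph} mph: obstacle must be sensed {min_sensing_distance(v):.1f} m out")

Anything that first becomes visible inside that distance gets hit no matter how good the planner is.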

But you can certainly improve your odds of not hitting things at very low speed in blind spots by at least adding sufficient sensors so you don't HAVE blind spots directly adjacent to the vehicle.
 
You mean other than the truck that was being towed that two different Waymos ran into on the same day?


That was a couple months before the incident I mention (hence why I said that one was the most recent I'm aware of) and they already issued a SW update to fix the issue that caused the one you cite...

The cause there was cited as "due to the persistent orientation mismatch of the towed pickup truck and tow truck combination, the Waymo AV incorrectly predicted the future motion of the towed vehicle."
 
As far as I know there has never been a case of a Waymo hitting a stationary object. All their collisions in California are reported here: https://www.dmv.ca.gov/portal/vehic...ehicles/autonomous-vehicle-collision-reports/
They claim in their Arizona safety report that they have never hit a stationary object.

I really wish we could see the video of this towed vehicle they hit (twice!). It should be noted that they did brake for the towed vehicle but not in time.
 
That was a couple months before the incident I mention (hence why I said that one was the most recent I'm aware of)
Not what you said...

The only recent accident in the other direction I'm aware of is this one:
https://www.reuters.com/world/us/dr...-francisco-causes-minor-scratches-2024-02-07/


and they already issued a SW update to fix the issue that caused the one you cite...
And what is your point? They could sense it, and yet still hit it. Sure, they fixed that situation, but how many other situations are there where they can sense something but will hit it anyhow? We just don't know.

But in that case it ignored what they could sense, and instead trusted its hallucination.
 
I wish v11 did not insist on changing lanes into people’s blind spots, or even into their non-blind spots.
After further consideration, I think this is the number one issue with FSD, Freeway Edition, v11. It also doesn’t add buffer for other vehicles in adjacent lanes. Why only for trucks? Bizarre, completely unnatural.
Had nothing to do with the planner and everything to do with no way to sense the obstacle until it was physically too late to avoid.
Curious about that case. Never will get video of that either. What a cluster. There should be laws.
 
But in that case it ignored what they could sense, and instead trusted its hallucination.


That's simply not correct.

See Daniel's clarifications above: Waymo doesn't hit stationary things the way Teslas do. It thought the vehicle in question would be moving out of its way by the time it got there, and the fact that it was facing a different direction than the tow truck actually moving it was what caused the issue. It DID brake when it realized it was wrong, but too late to stop in time.
 
Sounds like it hallucinated that the truck was moving in the opposite direction from the one it was actually moving in, when the lidar and radar would have been indicating which way it was moving (i.e., it ignored what it was sensing until it was too late).
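A toy illustration of that failure mode: a heading-based motion model assumes a vehicle moves the way its body points, and a pickup being towed backwards violates that assumption, so the model and the measured displacement disagree. Purely illustrative; this is not Waymo's code or their actual model.

import math

def predict_with_heading_model(pos, heading_rad, speed, dt):
    """Predict the next position assuming motion along the body heading."""
    return (pos[0] + speed * dt * math.cos(heading_rad),
            pos[1] + speed * dt * math.sin(heading_rad))

# Towed pickup faces backwards (heading = pi) but is dragged in +x at 5 m/s.
pos_t0, pos_t1 = (0.0, 0.0), (5.0, 0.0)      # positions measured 1 s apart
heading = math.pi

measured_velocity = (pos_t1[0] - pos_t0[0], pos_t1[1] - pos_t0[1])  # (5.0, 0.0)
predicted = predict_with_heading_model(pos_t1, heading, 5.0, dt=1.0)

print("measured velocity:", measured_velocity)   # says: coming toward us
print("heading model predicts:", predicted)      # says: it will back away

Trusting the second output over the first, frame after frame, is the "persistent orientation mismatch" in Waymo's wording.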
 
Speaking of blind spots, I wish v11 did not insist on changing lanes into people’s blind spots, or even into their non-blind spots. Why not just change lanes into a space into which another car cannot move? Isn’t this basic defensive driving? Really bugs me on the freeways, and everywhere to be honest. On the freeways especially though, because I fear a PIT maneuver.
Yes! I have had to take over multiple times because the other vehicle began changing lanes, not knowing of my presence, after FSD Beta changed lanes into its blind spot. I've had to swerve many times to avoid being hit. When FSD Beta turns on the signal and I see that I would be changing lanes into a blind spot, I usually use the turn-signal stalk to cancel the lane change.
After further consideration, I think this is the number one issue with FSD, Freeway Edition, v11. It also doesn’t add buffer for other vehicles in adjacent lanes. Why only for trucks? Bizarre, completely unnatural.

Curious about that case. Never will get video of that either. What a cluster. There should be laws.
This is another issue: when an adjacent car is hugging the line, it should also move over. My wife often complains about this when FSD Beta is driving on the freeway.
 
In a different but similar scenario: when I have my Model S parallel park (v11 with USS), it will just whip into the spot faster than I could ever react. It always makes me nervous about the front-left corner of my car. The good news is that it hasn't screwed up yet...
Two things:

First, I'm going to second @AlanSubie4Life: if the car is doing things faster than you can react, then you shouldn't be doing those things with that car.

Second: in the v12/v11 threads there have been a couple of people mentioning that the car, running FSD, "does things really fast, faster than I can keep up".

I've been plowing around in the various variants of FSD-b since getting into the program back, I think, in 2022, when it was opened up a bit wider than to those with 100% safety scores. And back then, yeah, one really had to keep an eye on what the car was doing, because it would do insane things: running red lights, running stop signs, changing lanes directly into somebody to one's left; and those are just the ones that come to mind after ten seconds of thought. Tesla said that the car would do the wrong thing at the worst time, and they absolutely weren't kidding.

But, having said that: I never had a problem with intervening before the car did some cockamamie thing. In fact, it was almost weird how much time I had to stop the car from doing something. The car's going to run a red light? It wouldn't just floor it and take off at high acceleration; it would start to move forward with Plenty Of Time to put one's foot on the brake. If it tried to change lanes directly into the car to one's left (which only happened, I think, twice in all that time, and was memorable enough that I've remembered it), it would start moving over and, with my hands on the wheel and a what-the-heck reaction, there was no problem stopping the car before it had moved much more than a foot.

If it was coming up on a stop sign and not slowing down, hitting the brakes wasn't exactly hard, nor did it require superhuman reaction time. And the other one that I Really Remember was a place where some construction was going on and the right lane was closed off with Jersey barriers; until an appropriate FSD load finally showed up after roughly a year, the car would move into the right lane and then halt, with the end of a Jersey barrier in front of it. (Or, rather, it would try to do that; after the first time, I'd just intervene. And, of course, sometimes it wouldn't do that.)

But never, and I mean never, did the car do something so blamed abrupt that, given human reaction time, there wasn't time to intervene.

Let's be clear here: I'll be the first to say that it's physically possible for a Tesla, under FSD, to do something like drive into a tree or a fire hydrant without giving the human occupant time to correct. It takes a human roughly a quarter of a second to notice something and start to do something about it; the car's maneuvers generally unfold tons slower than that.
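For a rough sense of scale (generic numbers, not measurements of any particular car or driver):

# How far the car travels during a human perceive-and-react delay.
# 0.25 s is roughly an alert driver's simple reaction time; 1.0 s is
# closer to a typical perception-plus-braking delay.
for mph in (10, 30, 55):
    v = mph * 0.44704                     # mph -> m/s
    for reaction_s in (0.25, 1.0):
        print(f"{mph} mph, {reaction_s} s reaction: "
              f"{v * reaction_s:.1f} m traveled before you even act")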

Now, if one is playing a video game while driving or reading War and Peace, then, yeah, if one isn't paying attention the car can do something deadly. But that's why the eyes-front checks, the torque on the steering wheel, and the deathly warnings from Tesla to pay attention are all there.

So, what's going on here? Are your cars really whipping the steering wheel, gassing it, or braking faster than you can react, or are you doing the Internet Thing of trying to create a controversy where there isn't one?