Welcome to Tesla Motors Club

Users that mistake NoA for FSD hit a wall (literally)

Technically, those are freeway conditions, which are covered by the "legacy" AP software, not FSD Beta. There are still two software packages running, and Elon promised to merge them in the future. AP used to have real problems with freeway curves that have Jersey barriers along them; it didn't follow them very well. I've noticed that's improved lately.
 

It's really about hope:

AP -> No hope left
FSD -> Some hope still left

But that's really not good, is it? The FSD folks are mostly focused on getting their bug rate under control, especially regressions.

Can't a company like Tesla find a few good engineers to get AP stable enough? Just because FSD is a priority doesn't mean it has to be the ONLY thing being worked on, especially since the two stacks are supposed to be merging. That means the AP stack won't be thrown away; it will be incorporated into the larger stack with FSD.
 


It's probable AP is close to as good as it can be with its current strategy and sensor set. Fixing the problem in this instance would require the full video 3-D perception stack, which tries to estimate 3-D point clouds and objects; that's part of FSD. Or a high-resolution radar.

ML systems are often not simple enough for easy bug reports or patches. There's no special subroutine for driving near barriers that can be tweaked. Adding more training examples to improve performance on some cases can end up hurting performance on others.
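A toy sketch of that last point (nothing to do with Tesla's actual training pipeline): even in a trivially simple "model" like 1-nearest-neighbor, adding one example to fix a missed case shifts the decision boundary and breaks a query that was previously handled correctly.

```python
# Toy 1-nearest-neighbor "model" on 1-D inputs. Adding a training example
# to fix one case changes the decision boundary and regresses another case.

def predict(train, x):
    """Return the label of the training point closest to x."""
    return min(train, key=lambda point: abs(point[0] - x))[1]

train = [(0.0, "clear"), (10.0, "barrier")]

# A query at 4.0 is closer to 0.0, so it reads "clear".
assert predict(train, 4.0) == "clear"

# Add one example to fix a missed barrier at 3.0 ...
train.append((3.0, "barrier"))

# ... and the query at 4.0 now regresses to "barrier".
assert predict(train, 4.0) == "barrier"
```

Real neural nets are vastly more complicated, but the same trade-off applies: there is no isolated "barrier subroutine" to patch, only a shared model whose behavior everywhere depends on all the data.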
 
If they can't get Smart Summon to work correctly, how are they going to get Level 4 or 5 to work?

There are two answers I have for this.

Smart Summon was done before they had the necessary Tesla Vision components to do it well.

They'll never have L4/L5 in HW3 vehicles because they lack the ability to sense a human near the car, and there is too much risk of running someone or something over.
 
AP should have been using the ultrasonic sensors' data to center the vehicle between the Jersey barriers along that lane and prevent hitting them. It might be a good idea to have that system and its log data checked; perhaps something is wrong with it.
 
Can't help but love the new title. 😂 Thought it was a new thread debunking the original title.

The "old" AP stack doesn't have the ability to identify lots of things, which is why it can hit objects like parked fire trucks or fixed barriers. The wall probably falls into this same category: the car made a slight mistake drifting over the line, not understanding the wall.

This would be fixed in Beta 11.x when we get the integrated stack. We should get 11.x on Dec 28, a FULL 3 days to test before Robotaxis become operational. o_O🤣
 
While I agree that this was AP/NoA and not the new FSD Beta stack, I also have to wonder a few things about the console at the time. I want to see what the screen was showing before and during the incident. I also want to know about the ultrasonics - even if the cameras didn't notice the wall for some reason, why didn't the car react when the ultrasonics showed the car inches from the barrier? When I'm on the freeway, ultrasonics show yellow and red when cars/trucks/semis drift a little over their lane lines and close to me. I've never had a collision, but I've often wondered how ultrasonics are incorporated into the stack.
 

It's likely that the ultrasonics don't have the range, or time to react, for something coming up from the front at speed. I think they might be used for detecting lane intrusions, but they would probably need a consistent signal for some time before it's deemed real (otherwise, frequent phantom braking), and they don't have enough range or precision pointing forward to detect rapidly approaching barriers.
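That "consistent signal for some time" idea is just debouncing. A minimal sketch (purely hypothetical; nothing here reflects Tesla's actual sensor fusion) of requiring several consecutive positive readings before reacting, so a one-off noise spike doesn't trigger phantom braking:

```python
# Hypothetical debounce filter: react only after N consecutive positive
# sensor readings, trading a little latency for fewer false alarms.

class DebouncedDetector:
    def __init__(self, required_hits=3):
        self.required_hits = required_hits  # consecutive positives needed
        self.streak = 0

    def update(self, detected: bool) -> bool:
        """Feed one reading; return True once the streak is long enough."""
        self.streak = self.streak + 1 if detected else 0
        return self.streak >= self.required_hits

det = DebouncedDetector(required_hits=3)
readings = [True, False, True, True, True]  # one noise spike, then a real signal
outputs = [det.update(r) for r in readings]
assert outputs == [False, False, False, False, True]
```

The cost of the filter is reaction time: at highway speeds, three extra sensor cycles of delay is exactly why short-range ultrasonics can't save you from a barrier closing in from the front.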

Ultimately, stereoscopic vision or direct radar/lidar is going to be needed. I'd prefer both stereo/trinocular vision and high-resolution radar. Also, the current vision sensors are not at all "retina quality"; they're remarkably low resolution (1280x960) and fixed ahead, unlike eyeballs.

I conclude real FSD will need a new hardware stack: higher-resolution and more cameras with overlapping fields of view, which means many more bits to push through the vision nets, which means new and expensive computation hardware, plus high-resolution imaging/scanning radar.
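Back-of-the-envelope arithmetic on why "more bits" means new compute hardware. Only the 1280x960 sensor resolution comes from the post above; the camera count, frame rate, and the upgraded configuration are made-up illustrative numbers:

```python
# Illustrative pixel-throughput arithmetic, not actual Tesla specs.
# Assumed: 8 cameras at 36 fps today; a guessed upgrade to 12 cameras
# at double the linear resolution (2560x1920).

def pixels_per_second(width, height, cameras, fps):
    return width * height * cameras * fps

current = pixels_per_second(1280, 960, cameras=8, fps=36)
hypothetical = pixels_per_second(2560, 1920, cameras=12, fps=36)

print(f"current:      {current / 1e9:.2f} Gpix/s")
print(f"hypothetical: {hypothetical / 1e9:.2f} Gpix/s")
print(f"ratio:        {hypothetical / current:.1f}x")  # 4x pixels * 1.5x cameras = 6x
```

Even under these rough assumptions, the vision nets would have to ingest several times the pixel rate, which is the argument for new computation hardware rather than a software update.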

This cost should be included in the $12,000 price.
 