FSD rewrite will go out on Oct 20 to limited beta

Clearly they have rewritten, and continue to improve, every part of their system, and when they do, they show a clear improvement based on statistics (e.g. VectorNet). The problem with Tesla is that their so-called “rewrite” has no basis in actual statistical improvement; it’s all conjecture and hype: “quantum leap,” “1000x labeling,” “this time it’s right.”

When you actually look at it, this isn’t a rewrite. This is actually the beginning of their autonomous driving development, which is why Andrej categorizes it as networks for FSD. It actually showcases how behind they are. Another thing that’s avoided is that while Andrej focuses on their alternative to maps, they don’t discuss the elephant in the room, which is prediction and planning of dynamic actors. Currently they only have a cut-in prediction network. If mapping were the crucible of self driving, then those other dozens of companies should have hundreds of driverless fleets roaming around.

How will the FSD release handle double-parked cars, cars backing up in front of it, or people doing u-turns or k-turns in front of it? Etc.

Remember, what one driver does affects what another does, which affects what another does. So you have to predict ahead, know what will happen, and be prepared.

What I find funny is that, in a sense, the end of the FSD feature set (on the order page) is actually the beginning.

The reason is that everything under FSD doesn't actually work on a consistent basis. It doesn't work because Tesla doesn't have the foundation in place for those things to work.

As far as I can tell, what this "re-write" does is provide that foundation, so the existing EAP/FSD features work a lot more consistently.

Is it going to be L4? Probably not, even in a very limited area, but will it vastly improve L2 driving? I hope so.

It will get us to the next limitation whatever that might be.
 
They don’t discuss the elephant in the room which is prediction and planning of dynamic actors

I believe the core problem is sensing, which is even more basic. The system must have high confidence as to all the potentially relevant aspects of the surrounding environment. Everyone but Tesla decided it takes a couple hundred thousand dollars of hardware to ensure that objective.

The reason enhanced Summon drives like a legally blind 90 year old grandmother is that only the ultrasonic sensors provide sufficient certainty of the surrounding environment. Until Summon drives like Waymo, Tesla is signaling that their camera/radar system doesn't produce a sufficiently reliable interpretation of the environment.

The two good enough sensor systems in a Tesla today are ultrasonic at a crawl and the human driver.
 
I believe the core problem is sensing, which is even more basic. The system must have high confidence as to all the potentially relevant aspects of the surrounding environment. Everyone but Tesla decided it takes a couple hundred thousand dollars of hardware to ensure that objective.

The reason enhanced Summon drives like a legally blind 90 year old grandmother is that only the ultrasonic sensors provide sufficient certainty of the surrounding environment. Until Summon drives like Waymo, Tesla is signaling that their camera/radar system doesn't produce a sufficiently reliable interpretation of the environment.

The two good enough sensor systems in a Tesla today are ultrasonic at a crawl and the human driver.

I would rank the ultrasonics below the cameras, because they tend to miss a lot of things depending on the height of the object.

The entire point of the rewrite is to improve sensing/localization, especially with things like semi-trucks that extend beyond one camera's range. Have you ever noticed how difficult it is to do an auto lane change into the middle lane when a semi-truck is in the right-most lane? The car will attempt it, but then it will suddenly think the semi-truck is in the destination lane and cancel.

My expectation is the re-write will allow this kind of lane change to happen in a repeatable and reliable manner, where it's far less likely to embarrass the driver.

Is it good enough for anything beyond L2? Probably not, because of the lack of redundancy. A lot of the reason you want additional sensors like rear corner radar is redundancy, and it allows the system to keep correcting itself over the long term. For example, a disagreement between the two sensors can act as a trigger to upload data to the mothership for a fix.
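
Just to make the idea concrete, here is a minimal hypothetical sketch of what such a disagreement trigger could look like. None of this is Tesla's actual code; the sensor names, lane indices and confidence threshold are assumptions for illustration only.

Code:
# Hypothetical sketch, not Tesla's implementation: use disagreement between two
# overlapping sensors as the trigger to upload a data snapshot for later review.
from dataclasses import dataclass

@dataclass
class Detection:
    lane: int          # lane index the sensor thinks the object occupies
    confidence: float  # 0.0 to 1.0

def should_upload_snapshot(camera: Detection, radar: Detection,
                           min_confidence: float = 0.6) -> bool:
    """Flag a clip for upload when two confident sensors disagree about
    which lane an object (e.g. a semi-truck) is in."""
    both_confident = (camera.confidence >= min_confidence
                      and radar.confidence >= min_confidence)
    return both_confident and camera.lane != radar.lane

# Example: camera places the truck in the destination lane, radar does not.
if should_upload_snapshot(Detection(lane=2, confidence=0.8),
                          Detection(lane=3, confidence=0.9)):
    print("Sensor disagreement - queue snapshot for upload")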

My hope for the rewrite is it will finally teach Elon that the existing sensors are insufficient. Thankfully it's timed to coincide with a massive reduction in the cost of other sensors, and not just a reduction but a simplification.
 
My hope for the rewrite is it will finally teach Elon that the existing sensors are insufficient

I think changing sensors is years away. I think the plan is to 1) continue to make the current system more useful and 2) make it good enough to blame regulators for not allowing the system to go beyond Level 2.

Since a camera-based system will eventually be good enough, Tesla can probably stay on that path until/if other carmakers release privately owned FSD cars.

Tesla's risk here may be more with FSD on the Semi. So perhaps Tesla will achieve Level 3 on the highway.

Where I think Tesla runs into a medium term problem is how to explain HW4.
 
Where I think Tesla runs into a medium term problem is how to explain HW4.
HW4 will be on the Cybertruck (I will need to find the source video, but if I remember correctly even Elon alluded to it).
After it is on the Cybertruck in the next ~18 months, that hardware will propagate through the rest of the product line.

But, before the "sensor change" gestapo shows up.

I disagree wholeheartedly that the fundamental approach will change.
This will be a "Tesla Vision" system based on cameras + radar + ultrasonics.
The compute power will definitely double or triple from HW3.

An additional note.
Why does Tesla have to explain HW4?
They would have to explain why there isn't a HW4 after 3 to 4 years of HW3 being in production.
Just taking one thing: going from the 14 nm process to a (hopefully) sub-10 nm process.
The power efficiency gains alone are worth going down that road, but you gain more than just power: you get to fix the weak spots in the old design and address any additional on-chip redundancy that you might need.
 
I believe the core problem is sensing, which is even more basic. The system must have high confidence as to all the potentially relevant aspects of the surrounding environment. Everyone but Tesla decided it takes a couple hundred thousand dollars of hardware to ensure that objective.
2010 called, they want their prices back. Self-driving suites don't cost hundreds of thousands. Literally, iPhones have lidar in them, for crying out loud.

The reason enhanced Summon drives like a legally blind 90 year old grandmother is that only the ultrasonic sensors provide sufficient certainty of the surrounding environment. Until Summon drives like Waymo, Tesla is signaling that their camera/radar system doesn't produce a sufficiently reliable interpretation of the environment.

The two good enough sensor systems in a Tesla today are ultrasonic at a crawl and the human driver.

As has been stated, the ultrasonics have heavy false positives and false negatives, and a range so small it's useless for self driving.

But, before the "sensor change" gestapo shows up.

I disagree wholeheartedly that the fundamental approach will change.
This will be a "Tesla Vision" system based on cameras + radar + ultrasonics.
The compute power will definitely double or triple from HW3.

Ultrasonics are not a self-driving sensor; people need to stop lumping them in with cameras/radar. And their front radar is practically useless because of its low FOV, low resolution, and the fact that it's not heated. It also doesn't help you in 95% of driving scenarios, and those scenarios are actually the hardest. There's nothing hard about driving forward, but even when driving forward it has major problems detecting and differentiating stationary objects. So at the end of the day, they basically have a camera-only system.
 
2010 called, they want their prices back. Self-driving suites don't cost hundreds of thousands. Literally, iPhones have lidar in them, for crying out loud.

Sure. Provide a link to an inexpensive self-driving system. A better mobile mapping system, which is analogous to a portion of what Waymo has on a car, is half a million dollars.

....range so small it's useless for self driving

Which is exactly why Summon creeps around like an old lady. It's useless except for parking.
 
Sure. Provide a link to an inexpensive self-driving system. A better mobile mapping system, which is analogous to a portion of what Waymo has on a car, is half a million dollars.

Mobileye's system costs $15,000 with compute, cameras, lidars and radars.
Waymo built their most powerful lidar in-house and made it 90% cheaper than the Velodyne variant: from $75k down to $7.5k three years ago. It's been several years since, which means it's most likely even cheaper now, as other lidar manufacturers' prices have dropped as well.

They also sell their Honeycomb, which is their short-range lidar, and we already know that this lidar is way cheaper than their powerful 360° lidar that cost only $7.5k in 2017. The Honeycomb probably costs around $3k to make.

Their system has 4x Honeycomb short-range lidars and 1x 360° long-range lidar. When you actually do the math on their lidar system in 2020, it's most likely around $20k. We know cameras and radars are very cheap in contrast to lidar, so their overall sensor suite with lidars, radars and cameras would cost well under $40k.

Waymo to start selling standalone LiDAR sensors – TechCrunch

EDIT: $40k is a conservative number; the true number is probably around $20k. With Waymo's 5th gen they halved the cost of their sensors again, so the $7.5k sensor now costs around $3k.

"With each generation of our custom hardware we’ve been able to bring down the cost of our sensors while delivering even more capabilities and compute power. With our fifth-generation hardware, we’ve simplified the design and manufacturing process so it can be production-ready, and our latest sensors deliver more performance than ever before, at half the cost of our previous generation."

Waypoint - The official Waymo blog: Introducing the 5th-generation Waymo Driver: Informed by experience, designed for scale, engineered to tackle more environments
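
Rough back-of-envelope math on those figures (every number below is an estimate from this thread, not a published Waymo cost):

Code:
# Back-of-envelope estimate using the figures discussed above (all assumptions).
long_range_lidar = 7_500     # est. cost of the 360-degree long-range unit (2017)
short_range_lidar = 3_000    # assumed cost of one Honeycomb short-range unit
num_short_range = 4

lidar_4th_gen = long_range_lidar + num_short_range * short_range_lidar
print(f"4th-gen lidar suite: ${lidar_4th_gen:,}")        # $19,500 -> "around $20k"

# The 5th-gen blog post says the hardware is "half the cost" of the previous gen.
lidar_5th_gen = lidar_4th_gen / 2
print(f"5th-gen lidar suite: ${lidar_5th_gen:,.0f}")      # ~$9,750

# Cameras and radar are cheap by comparison, so the full sensor suite plausibly
# stays well under the $40k ceiling (and closer to $20k) mentioned above.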
 
Sure. Provide a link to an inexpensive self-driving system. A better mobile mapping system, which is analogous to a portion of what Waymo has on a car, is half a million dollars.
The Cruise Origin, on the other hand, will spend most of its life in motion, working 10 times harder than your average car, day in and day out. It’s engineered to include everything you need and nothing you don’t. It doesn’t require a driver, of course. Because it’s modular, it will have a lifespan of over 1 million miles — six times more than the average car. And since GM has committed to producing millions of electric vehicles, we’ll build it for roughly half the cost of what a conventional electric SUV costs today.
Sort of an odd statement since they seem to be comparing manufacturing cost to retail cost. Way lower than $500k though.
The Cruise Origin Story
 
Mobileye's system costs $15,000 with compute, cameras, lidars and radars

Which is not an FSD system.

Systems actually delivered at scale will need to be $10K to $50K. The DEVELOPMENT systems are much more expensive. Tesla developers have had to work with very inexpensive in-car systems. Meanwhile, all the other developers did not have to work around these cost limitations.

Waymo isn't driving their production system yet because it took a very expensive development system to actually make a robotaxi.
 
Which is not an FSD system.

Systems actually delivered at scale will need to be $10K to $50K. The DEVELOPMENT systems are much more expensive. Tesla developers have had to work with very inexpensive in-car systems. Meanwhile, all the other developers did not have to work around these cost limitations.

Waymo isn't driving their production system yet because it took a very expensive development system to actually make a robotaxi.

First of all, this is the cost of their dev systems, both for Mobileye and for Waymo. Secondly, Waymo's cars are their production system.
 
Which is not an FSD system.

I think the Mobileye system that Blader is describing is the FSD system.

Waymo isn't driving their production system yet because it took a very expensive development system to actually make a robotaxi.

False. Waymo is driving their production system on the Jaguar I-Pace and the Chrysler Pacifica. What you see on the Chrysler Pacifica robotaxi in Chandler taking passengers is the production system.
 
Mobileye beat Waymo to a commercialized Level 4/5 system? At the price you quote, all car manufacturers must be rushing to integrate this system into their cars. Tesla may as well drop their development efforts, as they have lost badly.

No, Mobileye has not commercialized their L4 yet. I merely meant that the system that Blader is describing is their L4 suite that they are testing and that will most likely go on consumer cars when the system is commercialized.
 
Mobileye's system costs $15,000 with compute, cameras, lidars and radars.
Waymo built their most powerful lidar in-house and made it 90% cheaper than the Velodyne variant: from $75k down to $7.5k three years ago. It's been several years since, which means it's most likely even cheaper now, as other lidar manufacturers' prices have dropped as well.

They also sell their Honeycomb, which is their short-range lidar, and we already know that this lidar is way cheaper than their powerful 360° lidar that cost only $7.5k in 2017. The Honeycomb probably costs around $3k to make.

Their system has 4x Honeycomb short-range lidars and 1x 360° long-range lidar. When you actually do the math on their lidar system in 2020, it's most likely around $20k. We know cameras and radars are very cheap in contrast to lidar, so their overall sensor suite with lidars, radars and cameras would cost well under $40k.

Waymo to start selling standalone LiDAR sensors – TechCrunch

EDIT: $40k is a conservative number; the true number is probably around $20k. With Waymo's 5th gen they halved the cost of their sensors again, so the $7.5k sensor now costs around $3k.

"With each generation of our custom hardware we’ve been able to bring down the cost of our sensors while delivering even more capabilities and compute power. With our fifth-generation hardware, we’ve simplified the design and manufacturing process so it can be production-ready, and our latest sensors deliver more performance than ever before, at half the cost of our previous generation."

Waypoint - The official Waymo blog: Introducing the 5th-generation Waymo Driver: Informed by experience, designed for scale, engineered to tackle more environments

$15 thousand is a HUGE cost! That's $15k for their own cost. Even $3k is a ton. That's over twice the cost of a higher-volume ICE engine.

Put any of these multi-thousand-dollar costs through the supply/retail chain and it's several times that. You should see how much trouble the auto industry goes through to save 10 or 20 dollars here and there.

A few hundred dollars in cost is a gigantic amount of money; thousands is totally unworkable for the mass market. The auto industry has a profit margin of about 5%, give or take, after all the markups.

Maybe if it's a $150 cost to the OEM (not the supplier) you might get mass adoption. This is why Elon keeps saying "Lidar is doomed." He thinks they can get a vision-based system working faster than lidar and other sensor suites can come down in price.
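
To put numbers on that markup argument, here's a toy calculation; the 3x multiplier is an illustrative assumption standing in for "several times," not industry data:

Code:
# Illustrative only: how a component's cost to the OEM can balloon at retail.
def retail_impact(component_cost: float, markup_multiplier: float = 3.0) -> float:
    """Assume the sticker price grows by roughly 'several times' the part cost."""
    return component_cost * markup_multiplier

for cost in (150, 3_000, 15_000):
    print(f"${cost:>6,} part -> roughly ${retail_impact(cost):>8,.0f} on the sticker")

# With ~5% net margins, a few thousand dollars of added sensor cost per car is
# very hard to absorb without a price increase several times larger than the part.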

It took something like 30 years for anti-lock brakes to become standard (by law), if that tells you anything.
 
....


False. Waymo is driving their production system on the Jaguar I-Pace and the Chrysler Pacifica. What you see on the Chrysler Pacifica robotaxi in Chandler taking passengers is the production system.

The Waymo I-Pace isn't a production car by my definition. Half the vehicle isn't even used. What Waymo is doing now is testing the whole ecosystem, including paying customers.

When Waymo gets serious about making money they will scale, and likely use a purpose-built taxi. Tata could build a taxi on the I-Pace skateboard. At that point most or all of the sensors will likely be integrated and less expensive than what they use today. Certainly Waymo and the other major efforts are pursuing sensor and other hardware IP.

I think Waymo built the Firefly and then had a failure to launch based on not meeting development criteria. So I think the Frankenvehicles they are currently running are a conservative response to previous over-optimism.
 
He thinks they can get a vision-based system working faster than lidar and other sensor suites can come down in price.

So far, that does not appear to be happening. The price of lidar is coming down fast, and companies like Cruise and Waymo are deploying driverless robotaxis on public roads. Meanwhile, Tesla is not developing Tesla Vision fast enough to match. Tesla is just now *maybe* achieving a basic FSD prototype with driver supervision (we will know more when we see the FSD rewrite).