Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

FSD rewrite will go out on Oct 20 to limited beta

I agree for the most part. Save for a driverless L4 geofenced service, the monitoring of the driver via camera or nags for alertness will likely be mandatory on any partially-autonomous system in the near future.

As for what they're targeting? If we take just Elon's word for it, it's ultimately L5.

It's most definitely going to be mandatory for L3, and there's a question as to whether L3 will even be allowed at freeway speeds. I also doubt L3 will ever be allowed on city streets.

The problem with driver monitoring is that you can only go so far with it. It's great at detecting someone falling asleep and at curbing cell phone usage. In fact, I think Europe is going to make it mandatory not just for partially autonomous systems, but for manually driven vehicles as well.

With partial automation there is only so far driver monitoring can go. If I had a Cadillac Super Cruise vehicle, I would almost certainly stare off into the void in front of me without really paying much attention at all.

I think L2 is mostly a rigged game against human drivers: the car does the driving, but we take the fall. The manufacturer of the L2 system doesn't have to take responsibility, and that's why most of them are garbage.

The existing FSD beta on city streets is light years ahead of any of the L2 systems on the market, including Tesla's own NoA, which is kind of hilarious: a person with the FSD beta will have a system that works great on city streets, but is still plagued by the same NoA issues that have been reported by people using NoA (like myself). That likely won't be the case for much longer, and I look forward to seeing NoA leap in capability.

My fear is that FSD will be so good that it kills itself off because it doesn't have anywhere to go. It can't leap from L2 to L4, because L2 gets so good that drivers get complacent and edge cases kill a few people. The media erupts in anger, and the whole thing is shut down.

Or the FSD beta stays stuck in a limited release for a really long time (years), so it never has the mass involvement of hundreds of thousands of vehicles. But that is the only way to avoid bad publicity, because the odds of a string of deaths happening are much lower.

Even if Tesla does somehow achieve a non-geofenced, but still L4 (due to weather limitations), system, it might never be allowed to turn on due to the lack of sensor redundancy.

So to me the destination of FSD is nothing less than the creation of the destination itself. In other words, the end game for FSD is not to do autonomous driving, but to create the possibility of FSD.

It's basically the sacrifice needed to determine what exactly the public will be okay with. It forces the public to realize this is here, and it's their move.
 
If you look at my list of requirements for an L4 infrastructure, you'll see that I have what's necessary for those situations.

It's connected to the mothership so remote takeover can happen. The car can basically call home, and ask for assistance.

It has V2X capabilities, so the cop can communicate instructions not just to the autonomous car, but to all the other cars that are equipped with V2X. In addition, the geofencing means the cop can have the training necessary to communicate with the car, whether it's through digital means or through hand signals.
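To make the idea concrete, here is a purely hypothetical sketch of the kind of V2X broadcast described above: an officer's unit sends an instruction that the autonomous car and any other V2X-equipped vehicle in range can act on. All message fields, names, and the zone-radius scheme are invented for illustration; real V2X stacks (e.g., the SAE J2735 message set) define their own formats.

```python
# Hypothetical V2X traffic-control message. Field names and the
# zone-radius compliance rule are illustrative inventions, not any
# real standard's format.
from dataclasses import dataclass
from enum import Enum, auto

class Instruction(Enum):
    STOP = auto()
    PROCEED = auto()
    DETOUR_LEFT = auto()
    DETOUR_RIGHT = auto()

@dataclass(frozen=True)
class TrafficControlMessage:
    sender_id: str        # the officer's roadside or handheld unit
    zone_radius_m: float  # how far from the sender the instruction applies
    instruction: Instruction

def should_comply(msg: TrafficControlMessage, distance_m: float) -> bool:
    """A receiving vehicle acts on the message only inside the control zone."""
    return distance_m <= msg.zone_radius_m

msg = TrafficControlMessage("unit-42", 150.0, Instruction.DETOUR_LEFT)
print(should_comply(msg, 80.0))   # a vehicle 80 m away is inside the zone
```

The point of the sketch is only that a broadcast instruction plus a well-defined compliance rule lets one officer direct every equipped vehicle at once, which is exactly what a geofenced L4 deployment could train officers to use.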

So I would argue this exact type of scenario is why L4 makes more sense to me than L5.

I simply don't see an L5 vehicle being able to handle situations of this nature.
I guess you could call it an L4 system because it is still fenced, not by location but by situation.
So in essence it's not a 'geofenced' system but a 'theofenced' one, as in theoretically fenced.
But I agree that it would be ridiculous to prevent a robotaxi service over these 1-in-a-million scenarios, especially as the robotaxi is not harming anyone.
 
I guess you could call it an L4 system because it is still fenced, not by location but by situation.
So in essence it's not a 'geofenced' system but a 'theofenced' one, as in theoretically fenced.
But I agree that it would be ridiculous to prevent a robotaxi service over these 1-in-a-million scenarios, especially as the robotaxi is not harming anyone.
Geofencing robotaxis away from specific, difficult (such as very narrow) segments of road is very likely IMO.
 
The Audi system was never released anywhere. As far as I know, Mercedes and Hyundai still plan on releasing systems. Not sure if Mercedes' recent cancellation of their autonomous vehicle program changes that.

So much fake news in this thread. Do you really believe that MB will cancel their autonomous vehicle program?
 
So much fake news in this thread. Do you really believe that MB will cancel their autonomous vehicle program?

It seemed like a credible source, but you're right that it was false: UPDATED: Mercedes-Benz throws in the towel on self-driving efforts: 'We can no longer win'

Update: Head of Digital Transformation at Daimler AG Sascha Pallenberg has noted on Twitter that the report from RedaktionsNetzwerk Deutschland (RND) is false, and that Mercedes-Benz’s autonomous program is still ongoing. A quote previously misattributed to the Mercedes-Benz head has also been corrected.
 
L5 is just a fantasy designed to convince people we don't have serious work to do in getting this all to work.
I'm sorry, but you've decided that L5 is a fantasy because you do not have enough creativity/vision to see how to solve it, so you do nothing about it.
That is not the case for others that actually tackle these problems head on.
 
It's most definitely going to be mandatory for L3, and there's a question as to whether L3 will even be allowed at freeway speeds. I also doubt L3 will ever be allowed on city streets.

It's allowed right now in multiple US states.

Just nobody sells a car that does it.

Same with L4 and L5: nobody sells a car that does it, but it's legal right now in those places so long as it can obey all the same traffic laws a human must.

No specific monitoring requirement needed either.
 
It's allowed right now in multiple US states.

Just nobody sells a car that does it.

Same with L4 and L5: nobody sells a car that does it, but it's legal right now in those places so long as it can obey all the same traffic laws a human must.

No specific monitoring requirement needed either.

I'm kinda amused by the duality of criticisms against FSD right now. Both arguments have merit, but when they come from the same sources they seem a bit contradictory.

On one hand, people have been saying "Elon is lying about regulatory limits. It's already legal and the regulations don't hold Tesla back from deploying anything."

And on the other, I'm hearing now "Elon is unleashing an untested autonomy beta on unsuspecting drivers and pedestrians. It's irresponsible, and the NHTSA is going to shut them down any day now."

For some reason, these critics are perfectly okay with Cruise/Waymo betas where there isn't a driver in the seat, but are terrified of an FSD beta where the driver is still responsible for monitoring and controlling the vehicle.
 
I needed to be somewhere at a specific time so I planned my sleep schedule around it with a nice buffer. The next day I checked again just before leaving, and it was suddenly much longer and the distance seemed to be quite a bit more.
One thing about the SAE driving automation levels is that I don't think there's any timeliness or comfort aspect. As long as the car can complete the trip without requiring any driver interaction, even if it takes, say, twice as long as you would have, it could be L5. On a smaller scale, if a vehicle yields and waits "too long" at a 4-way stop sign or a roundabout entrance but eventually makes it through without disengagement, I believe that is still L5. Even more basic: the vehicle driving slower than the speed limit, or slower than you normally would, could be "acceptable."

Addressing comfort will be especially tricky, as people have different levels of acceptance. Some simple aspects such as "hard" acceleration, braking, or turning (limited to 3 m/s² lateral in the EU) can be measured, but some people just like to go fast. Tesla does like touting "silky smooth," so it seems likely they'll address this, but even without it, the system could still be L5 (even with phantom braking). A different type of comfort is how close the vehicle gets to other vehicles or pedestrians, e.g., driving down narrow streets: even assuming the computer correctly knows exactly how close it is to another object, a human passenger might still feel uneasy squeezing by (at least initially; eventually people may not care).
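The lateral-acceleration limit mentioned above is a measurable comfort bound, since lateral acceleration in a curve follows directly from speed and curve radius (a = v²/r). A minimal sketch, assuming the 3 m/s² figure from the post; the function names and example values are my own:

```python
# Illustrative check against a 3 m/s^2 lateral-acceleration comfort limit.
# The limit value comes from the discussion above; everything else here
# is an invented example, not any manufacturer's actual logic.

EU_LATERAL_LIMIT = 3.0  # m/s^2

def lateral_acceleration(speed_mps: float, radius_m: float) -> float:
    """Centripetal acceleration a = v^2 / r for a curve of the given radius."""
    return speed_mps ** 2 / radius_m

def max_curve_speed(radius_m: float, limit: float = EU_LATERAL_LIMIT) -> float:
    """Highest speed (m/s) that keeps lateral acceleration at or under the limit."""
    return (limit * radius_m) ** 0.5

# A 100 m radius ramp taken at 20 m/s (72 km/h):
a = lateral_acceleration(20.0, 100.0)   # 4.0 m/s^2, over the 3 m/s^2 limit
v = max_curve_speed(100.0)              # ~17.3 m/s (~62 km/h) stays within it
print(f"a = {a:.1f} m/s^2, max comfortable speed = {v:.1f} m/s")
```

This is why a hard comfort cap translates directly into slower cornering: the car must shed speed until v² / r drops under the limit, which some passengers will read as "smooth" and others as "too slow."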
 
You mean the same Mercedes that just announced it was abandoning its efforts at autonomous driving? :)

Apparently that was a false report. Mercedes-Benz announced they're still going forward with their self-driving car program.


Anyway, here's a video of @kimpaquette with the car slowing down for speed bumps. Others report it ignores them and speeds right through. I recall some discussion that it seems to do so successfully when the speed bumps are marked with a sign. If that's the case, two possibilities come to mind: 1) The signed speed bumps are mapped (HD or non-HD maps). 2) The warning signs increase the confidence the NN has in correctly identifying a speed bump.

Edit: Brandon's video linked above also shows slowing at marked speed bumps (with chevrons, not a sign) at 4:39. Since that's on a residential-type street, I'm more inclined to think scenario #2 above is what's going on.
 
I'm kinda amused by the duality of criticisms against FSD right now. Both arguments have merit, but when they come from the same sources they seem a bit contradictory.

On one hand, people have been saying "Elon is lying about regulatory limits. It's already legal and the regulations don't hold Tesla back from deploying anything."

And on the other, I'm hearing now "Elon is unleashing an untested autonomy beta on unsuspecting drivers and pedestrians. It's irresponsible, and the NHTSA is going to shut them down any day now."

From the SAME people?

Because I'm certainly, correctly, stating the first- but never even suggested the second.


For some reason, these critics are perfectly okay with Cruise/Waymo betas where there isn't a driver in the seat, but are terrified of an FSD beta where the driver is still responsible for monitoring and controlling the vehicle.


AFAIK Waymo and Cruise did not do no-driver testing until much further along in the process (and much, much, much more heavily domain-restricted, in specific areas with very HD mapping).

Teslas "it should work anywhere" approach will be inherently riskier pretty much by definition. Though obviously with a vastly better scalability if they get it working safely.

FWIW I'm not "terrified" of either- but I certainly recognize it's not remotely an apples to apples comparison between the two approaches.
 
For some reason, these critics are perfectly okay with Cruise/Waymo betas where there isn't a driver in the seat, but are terrified of an FSD beta where the driver is still responsible for monitoring and controlling the vehicle.
I don't know if it is the same people saying all those things, but I have heard this criticism as well, and I think there is some merit to it.

Cruise/Waymo/etc. are using on-payroll, trained safety drivers. Tesla, on the other hand, is using unpaid, untrained volunteers as safety drivers for their system. These cars are interacting with other vehicles and pedestrians in the real world, and there is a chance that could end badly. Remember those videos of people on AP1 climbing into the back seat while the car was driving? Wait until those types of people get hold of this software and decide to do something outrageous for their YouTube clicks.

That's the quickest way to get additional regulation enacted that prevents Tesla owners from operating their FSD software.

Even Uber, with a paid and trained driver, had their development halted due to lack of attention by said driver.
 
I'm kinda amused by the duality of criticisms against FSD right now. Both arguments have merit, but when they come from the same sources they seem a bit contradictory.

On one hand, people have been saying "Elon is lying about regulatory limits. It's already legal and the regulations don't hold Tesla back from deploying anything."

And on the other, I'm hearing now "Elon is unleashing an untested autonomy beta on unsuspecting drivers and pedestrians. It's irresponsible, and the NHTSA is going to shut them down any day now."

For some reason, these critics are perfectly okay with Cruise/Waymo betas where there isn't a driver in the seat, but are terrified of an FSD beta where the driver is still responsible for monitoring and controlling the vehicle.
There's no contradiction: an autonomous vehicle needs to be tested safely until it is safe enough to deploy.
As long as Tesla can ensure that it is being tested safely then it is fine. I think the DMV should require Tesla to prove that it is being tested safely though. Waymo and Cruise are subject to many regulations regarding AV testing here in California.
 
I do believe that they might cancel their program and license the technology from someone else. However, it appears that I have fallen for fake news about that actually happening!

The Germans will never admit defeat. Also, they just rolled out a bunch of homegrown autonomy features in the new S-Class, which they are developing with Nvidia. A lot of the big automakers have multiple autonomy programs ongoing at the same time. Even if MB does drop one, there is no way they will come out and abandon what is essentially the future of the automobile.
 
Also, the concern is from a person who doesn't even have anything to lose (chose not to buy FSD).
But who has buyer's remorse for Smart Summon. :p
I like to think I have everything to gain. Credit card is at the ready!
Do you really worry about this? I mean, it certainly could happen for existing vehicles (though it's not a certainty), but surely long term it's not just going to be a viral feature? You may be concern tr***ing. ;)
Well it may be wishful thinking. My real concern is that people will use it and that it won't improve safety. I'll be happy to be wrong.
 
For some reason, these critics are perfectly okay with Cruise/Waymo betas where there isn't a driver in the seat, but are terrified of an FSD beta where the driver is still responsible for monitoring and controlling the vehicle.

Waymo and Cruise only remove the safety driver AFTER thoroughly validating that it is safe to do so, and only in a geofenced area. They do a lot of testing and validation with professional safety drivers before going driverless. They are not just releasing "beta" software without a driver and without testing.