FSD rewrite will go out on Oct 20 to limited beta

I've never considered Full Self Drive to mean autonomous. I was surprised later when I found out about the L1 to L5 system and that the FSD on our car was supposed to someday mean driverless. A Level 3 system would, for me, count as full self drive, since I always expected to have to be able to take control. Hence the steering wheel and pedal thingies.

I know. I'm a cheap date too. :)
 
You must have a beta version. The release version looks like this:
[Image: Tesla-Model-3-Interior-No-Steering-Wheel-01.jpg]
 
This little gem was sent to my DMs on Twitter.
It would seem that our friend @verygreen has a real habit of redefining terms... this time it is "FSD Rewrite"
And when Vladimir called him out on it, Vladimir got to experience the "strange way" that Green talks about things.
Follow this thread (up and down, wherever Vladimir Grinshpun replies: https://twitter.com/VGrinshpun/status/1321174517335601152 )
[Screenshot attachment]



Note what precipitated Vladimir to ask @verygreen point blank, "Do you acknowledge the progress in functionality?"
[Twitter screenshots]
 
Didn't I see somewhere that Elon said to the pilot, "...your job will be obsolete"? The pilot seemed speechless.

No doubt Elon may have said this. I admire him very much for what he has accomplished, and for his vision and commitment; I truly believe he is one of the greatest visionaries we have seen in the last two centuries. But I do take some things he says with a grain of salt.

I doubt the pilot was truly speechless, unless he was a brand new pilot; we pilots have been hearing that we are going to be obsolete pretty much since, and probably even before, I got my license. It could happen, of course; it's probably even likely that it will.

I am curious as to how many people here who believe in totally driverless cars, and would get in one now without a thought, would also get on a pilotless aircraft?

No judgement from me on this; if you believe they are safe enough, that is entirely up to you. I know it's a slightly different proposition in that when things go wrong in an aircraft the consequences can easily be more than a fender bender, but I'm curious.
 
Pilotless aircraft (i.e. drones) already exist (General Atomics MQ-1C Gray Eagle - Wikipedia). No geofence! :p
It's a way easier engineering problem. I'd wait to fly in one until they work out all the bugs just like I do for all new planes. I've never been in a 787 or a 737 MAX. I'd be willing to ride in a 787 now but I'm going to hold off a bit on the 737 MAX.
 

Together with your previous posts in this thread, it is obvious that you have some kind of personal problem with green (among others in this thread). How is this relevant at all to the past few pages of discussion?
 
Together with your previous posts in this thread, it is obvious that you have some kind of personal problem with green (among others in this thread).
No personal problem with green at all.

My problem is with people directly contradicting what Tesla (Karpathy, Elon, etc.) say and then, when confronted about it, trying to weasel out on some BS technicality.

Example #1:
  • Karpathy says "we do not use HD maps"
  • Green says, "they use HD maps"
Example #2 is above:
  • Elon says - "FSD rewrite will be out October 20th" -- then proceeds to release to the beta testers a build which shows a bunch of functionality all at once that we did not expect to get all at once.
  • Green says - "Like I said before, this is really not 'the rewrite' I've been waiting for"
If you think that is okay behavior for you, that is fine by me.
That is not okay behavior for me, especially for someone that I looked to for information and would constantly send people to his videos and twitter posts.


As for how it is relevant:
Well, these two examples are from after the FSD rewrite release, and we are in a thread discussing the rewrite release. And his comments are specifically about the FSD rewrite (not some random setting). That seemed like enough to make it relevant.

At the end of the day, there is a clear pattern, but only if you want to see it.
 
Those SAE levels really need to change. To say intent dictates level is a joke. Everyone's goal is Level 5, and that is much more true for Waymo with its many sensors than for Tesla with its known blind spots.
To me this FSD beta is pretty clearly a prototype Level 5 system. Just because you don't think it will ever be safe enough doesn't change that. I'm not sure what the alternative taxonomy would be.
 

360-degree video from TeslaRaj. Looks like he got the update too. No annoying pretentious commentary this time; he did a much better job just explaining what he was doing and pointing out some problem areas. He should NOT be allowing the car to turn from the wrong lane all the time, though. It also annoys me when people just let the car meander and block lanes "just to see what the car will do". No: just override and hit the camera button to send the data to the mothership for Tesla to look over and try to improve on.
 
  • Elon says - "FSD rewrite will be out October 20th" -- then proceeds to release to the beta testers a build which shows a bunch of functionality all at once that we did not expect to get all at once.
  • Green says - "Like I said before, this is really not 'the rewrite' I've been waiting for"
What's wrong with that? He seems to believe that there are more fundamental architecture changes coming.
 
Realistically capable in the next few years? Realistically possible within a couple of years?
Who would judge that? Tesla could be right.
I was thinking there would be no way they could build the Cybertruck at the announced price but then they announced a plan to reduce battery costs 50% and now I think it's possible. Maybe project Dojo will work?
I'm probably even more skeptical than you.
 
What's wrong with that? He seems to believe that there are more fundamental architecture changes coming.
Please go read the thread; it is all there: https://twitter.com/VGrinshpun/status/1321174517335601152
Tesla on the Q3 earnings call says officially "we have released the FSD rewrite to a small group and will keep expanding."

Green says that is not the FSD rewrite; it would be the FSD rewrite if:
A notable change would be when Tesla changes their recommendations to a Waymo-style "do NOT keep your hands on the steering wheel, trust the car"
How is that not re-defining what Tesla released?
Since when is the definition based on what level of user input the solution currently accepts/requires, e.g. "do not keep your hands on the steering wheel"?

Either Tesla is doing what it says it is doing, or they are committing fraud.
 
Who would judge that? Tesla could be right.
I was thinking there would be no way they could build the Cybertruck at the announced price but then they announced a plan to reduce battery costs 50% and now I think it's possible. Maybe project Dojo will work?
I'm probably even more skeptical than you.
I was thinking self-selected. I'd be surprised if Elon thinks they can get to Level 5 this decade. There are plenty of decisions in driving that require knowledge of the world that no system, except perhaps remotely driven ones, will be capable of. It will be interesting to see when Level 5-intended systems are able to pass on two-lane highways, drive on ice, handle snow, know to slow down in heavy rain, ford a stream, etc.
There are signs in the Phoenix area that say do not cross when there is a chance of a flash flood. When will a Level 5 car understand that and other signs that require higher-level reasoning?
My point is you need general A.I. to reach Level 5, and that isn't happening in the foreseeable future. It may not happen for 50 years.

Wikipedia said:
Level 5 - Full Automation: under all roadway and environmental conditions that can be managed by a human driver

I had no problem accepting the Cybertruck price. They are saving money on paint and paint issues. No need to pay for expensive stamping equipment either. When other Tesla vehicles' body parts are stamped, there is a large team getting rid of imperfections. I saw this myself during a Tesla factory tour.

I don't consider myself skeptical. I've always said that Tesla will be first with wide release of FSD, even before Waymo. Level 3 and 4 will be very impressive and do much to help the world.
 
I think Tesla is going to have to start having their nav calculate at least one, preferably two or three, "backup" routes in advance for FSD in case the car misses a turn or can't get into a turn lane due to traffic. Every time there is a turn ahead, the car should already have a backup route or two calculated so it can seamlessly transition to plan B if necessary. The car should never, ever try to force itself into a packed turning lane (I hate people who do that, blocking a through lane and screwing with everyone behind them) or try to turn from the wrong lane. Far better to transition to the next option, even if it's to make a U-turn at the next intersection or drive around a block.
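
Something like this rough sketch is what I have in mind. Everything in it (the route data, the lane names, the pick_route helper) is made up just to show the fallback logic; it is obviously not Tesla's actual nav code:

# Rough sketch of the precomputed "plan B" idea from the post above.
# All of the route data and the blocked-lane check are invented for
# illustration; only the control flow matters here.

PRIMARY  = {"label": "primary",  "next_turn_lane": "left-turn", "extra_minutes": 0}
BACKUP_1 = {"label": "backup-1", "next_turn_lane": "through",   "extra_minutes": 2}  # U-turn at the next light
BACKUP_2 = {"label": "backup-2", "next_turn_lane": "through",   "extra_minutes": 4}  # around the block

ROUTES = [PRIMARY, BACKUP_1, BACKUP_2]  # all computed in advance, cheapest first


def pick_route(routes, blocked_lanes):
    """Return the first precomputed route whose next maneuver is still
    reachable; never force the car into a packed or already-missed turn lane."""
    for route in routes:
        if route["next_turn_lane"] not in blocked_lanes:
            return route
    return routes[-1]  # worst case: take the slowest fallback


# Turn lane is packed solid -> seamlessly fall through to plan B.
print(pick_route(ROUTES, {"left-turn"})["label"])  # backup-1
# Nothing in the way -> stay on the primary route.
print(pick_route(ROUTES, set())["label"])          # primary

The whole point is that the alternates already exist before the turn comes up, so "missing" the turn costs a couple of extra minutes of driving instead of a dangerous shove into a full lane.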
 
Green says that is not the FSD rewrite; it would be the FSD rewrite if...
How is that not re-defining what Tesla released?
Since when is the definition based on what level of user input the solution currently accepts/requires, e.g. "do not keep your hands on the steering wheel"?

Either Tesla is doing what it says it is doing, or they are committing fraud.
Green didn't say "that is not FSD rewrite". What he did say is that what was released to beta isn't what he is waiting for. There is a difference and people are trying to infer something that isn't reflected in this series of tweets.

Why is this controversial for you? Green is entitled to his own expectations.
 
One thing about the SAE driving automation levels is that I don't think there's any timeliness or comfort aspect.
Ha, just on the drive home today, there was a situation that touches on both timeliness and comfort.

[Photo attachment: passing.jpg]


This is a residential street with a 25 mph speed limit and a dashed yellow center line. It turns out this passing car could not wait behind someone going exactly the speed limit and moved over as soon as possible, even with an oncoming vehicle that ended up pulling off to the side to allow this person to finish overtaking.

Would Autopilot be L5 if it never overtakes a slow-moving vehicle on a two-way road? How about on a two-lane highway if you're "stuck" behind a slow truck that is driving at its speed limit? If it does support overtaking, how much of a gap would be acceptable and comfortable?

Also, had Autopilot been the oncoming vehicle, what actions would be acceptable or expected? At minimum it could slow down or even stop. But if the vehicle actually needed to pull off the road to prevent a head-on collision, is that behavior required for L5?
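
For what it's worth, the "how much gap" question can be put into rough numbers. Here's a back-of-the-envelope sketch; every speed, distance, and margin in it is an assumption I picked just to show the arithmetic (and the helper functions are made up for this post), not anything Autopilot actually does:

# Back-of-the-envelope arithmetic for an overtake on a two-way road.
# All numbers are illustrative assumptions, nothing more.

def overtake_time_s(own_kph, slow_kph, relative_distance_m=40.0):
    """Seconds needed to clear the slow vehicle: the relative distance to cover
    (gap behind + both car lengths + gap to merge back) divided by the speed delta."""
    delta_mps = (own_kph - slow_kph) / 3.6
    return relative_distance_m / delta_mps


def clear_road_needed_m(own_kph, slow_kph, oncoming_kph, margin_s=2.0):
    """Oncoming gap needed: while we overtake, we and the oncoming car close
    the distance at the sum of our speeds, plus a comfort margin."""
    t = overtake_time_s(own_kph, slow_kph) + margin_s
    closing_mps = (own_kph + oncoming_kph) / 3.6
    return closing_mps * t


# Passing a 25 mph (~40 km/h) car at ~35 mph (56 km/h), with oncoming traffic at 40 km/h:
print(f"{overtake_time_s(56, 40):.0f} s to get around")          # ~9 s
print(f"{clear_road_needed_m(56, 40, 40):.0f} m of clear road")  # ~290 m

Needing close to 300 meters of visibly clear road just to pass on a 25 mph residential street is exactly why I suspect a cautious L5 system would simply wait, which circles right back to the timeliness and comfort question.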