
Blog: Musk Touts “Quantum Leap” in Full Self-Driving Performance



A “quantum leap” improvement is coming to Tesla’s Autopilot software in six to 10 weeks, Chief Executive Elon Musk said in a tweet.

Musk called the new software a “fundamental architectural rewrite, not an incremental tweak.”






Musk said his personal car is running a “bleeding edge alpha build” of the software, which he also mentioned during Tesla’s Q2 earnings call.

“So it’s almost getting to the point where I can go from my house to work with no interventions, despite going through construction and widely varying situations,” Musk said on the earnings call. “So this is why I am very confident about full self-driving functionality being complete by the end of this year, is because I’m literally driving it.”

Tesla’s Full Self-Driving software has been slow to roll out against the company’s promises. Musk previously said a Tesla would drive from Los Angeles to New York using the Full Self-Driving feature by the end of 2019. The company didn’t meet that goal. So, it will be interesting to see the state of Autopilot at the end of 2020.

 
No, it is not a great point. Lidar is used a lot outside of autonomous driving. Here is a list of 100 applications that lidar is used for today:
100 Real-World Applications of LiDAR Technology

So yes, lidar has already proven to be very useful in autonomous driving and outside of autonomous driving.
I will not waste my 6th disagree on you, seeing that you got my fifth. I recommend you buy some TSLA - you will see much more clearly.

We are lacking a great deal of data on what Tesla are doing with FSD. When it happens it will blow your mind. FSD is even more important to Elon than batteries. You will see a glimpse of Tesla's power on the 22nd.
 
Entertaining

Two things come readily to mind.
First, there seem to be a lot of people discussing (from an authoritative stance) what one guy is marshaling a company to do. I think I'll just continue to watch what happens as, like landing two boosters side by side, it will either happen or it will not. Either way, we will get an answer.

Second, oh boy, another full and complete re-write, which starts yet another whole new bug cycle. I'm still happy with keeping my Gen1 AP order and not canceling and re-ordering to get Gen2 on the last car. When and if Elon pulls this latest rabbit out of his hat, we can get another Tesla (Y perhaps) and play with his latest toy. I'm just happy that someone is shooting for rabbits in this time of glorifying the moron.

It's all good.

Now, back to your regularly scheduled experts debating someone else's efforts.
 
We are lacking a great deal of data on what Tesla are doing with FSD. When it happens it will blow your mind. FSD is even more important to Elon than batteries.

I hope you are right. I sincerely want Tesla to succeed. I was a super fanboy on Autonomy Day. Go back and read my posts in the old Autonomy Day thread, you'll see. I was the biggest "Tesla will solve camera vision" defender back then. Autonomy Day was going to be our moment where Tesla crushes the competition on FSD. And we were going to get "feature complete" by the end of 2019. That never happened. Autonomy Day has turned out to be a big disappointment. So yeah, I am skeptical when fans say "just wait. Tesla's FSD is going to be awesome!". We've heard that since 2016 and so far, no FSD. I will believe it when I see it!

Remember what Tesla promised back in 2016:

"All you will need to do is get in and tell your car where to go. If you don’t say anything, the car will look at your calendar and take you there as the assumed destination or just home if nothing is on the calendar. Your Tesla will figure out the optimal route, navigate urban streets (even without lane markings), manage complex intersections with traffic lights, stop signs and roundabouts, and handle densely packed freeways with cars moving at high speed. When you arrive at your destination, simply step out at the entrance and your car will enter park seek mode, automatically search for a spot and park itself. A tap on your phone summons it back to you."
 
...We are lacking a great deal of data on what Tesla are doing with FSD.
What is your source of data? Are you just a big fan, hopeful, and overly optimistic? I've been inside Tesla and spoken to current and former Tesla autopilot employees. They are very transparent from my perspective. Have you seen Karpathy's presentations?
When it happens it will blow your mind. FSD is even more important to Elon than batteries.
Is this like how Elon said dumb summon would blow our minds? Interesting that you think FSD is more important to Elon than batteries. What makes you say that? Slow going for the past 5 years, still at Level 2. When do you think Level 3 or higher is going to happen?
You will see a glimpse of Tesla's power on the 22nd.
Please tell us more.
 
I hope you are right. I sincerely want Tesla to succeed. I was a super fanboy on Autonomy Day. Go back and read my posts in the old Autonomy Day thread, you'll see. I was the biggest "Tesla will solve camera vision" defender back then. Autonomy Day was going to be our moment where Tesla crushes the competition on FSD. And we were going to get "feature complete" by the end of 2019. That never happened. Autonomy Day has turned out to be a big disappointment. So yeah, I am skeptical when fans say "just wait. Tesla's FSD is going to be awesome!". We've heard that since 2016 and so far, no FSD. I will believe it when I see it!



I think it's important to remember what Elon told us on 60 Minutes:


Elon Musk said:
"People should not ascribe to malice that which can easily be explained by stupidity." (LAUGHTER) So-- so it's, like, just because I'm, like, dumb at-- at predicting dates does not mean I am untruthful. I don't know-- I-- we've-- I never made a mass-produced car. How am I supposed to know with precision when it's gonna get done?

Replace "never made a mass-produced car" with "never accomplished L5 self-driving" or "never solved camera vision" if you prefer- and same principle applies.


He keeps throwing out dates that keep blowing by without the promises for those dates being met - because he has literally no idea how long it'll actually take, since he's never done it (and unlike mass-producing a car, neither has anybody else) - so any date promise you get from him on something he hasn't already done should be considered in that context.


(now- should he STOP making promises he has no idea if/when he can keep? Sure... but Elon gonna Elon)
 
What is your source of data? Are you just a big fan, hopeful, and overly optimistic? I've been inside Tesla and spoken to current and former Tesla autopilot employees. They are very transparent from my perspective. Have you seen Karpathy's presentations?

Is this like how Elon said dumb summon would blow our minds? Interesting that you think FSD is more important to Elon than batteries. What makes you say that? Slow going for the past 5 years, still at Level 2. When do you think Level 3 or higher is going to happen?
Please tell us more.
Missing data - how about a video from Elon's last 10 hours of driving? How many NN 4D mini-projects are they planning for? How long did the last 5 take? We know very little in terms of being able to predict when they will reach FSD.

Elon is focussed on Level 5 not 3.

The 22nd is Battery Day.
 
Missing data - how about a video from Elon's last 10 hours of driving? How many NN 4D mini-projects are they planning for? How long did the last 5 take? We know very little in terms of being able to predict when they will reach FSD.

Elon is focussed on Level 5 not 3.

The 22nd is Battery Day.
I see. So this means you are just another big fan who doesn't know, and, most amusingly, is using a lack of data to justify huge optimism.
 
Is this Tesla's answer to not being able to see stop signs at night that are not illuminated?
[Image: new vs. old Tesla Model 3 headlights]

Tesla introduces newly designed headlights and power trunk on the Model 3 [Photos] - Drive Tesla Canada
 
I will not waste my 6th disagree on you, seeing that you got my fifth. I recommend you buy some TSLA - you will see much more clearly.

This is what it's all about. Long TSLA. Not a fan, just promoting his portfolio. It's what Elon's been doing for years.

Elon is focussed on Level 5 not 3.

No, Waymo is focused on Level 5. Elon is talking "Level 5" to boost sales and TSLA while his cars are firmly stuck at Level 2.
 
No, it is not a great point. Lidar is used a lot outside of autonomous driving. Here is a list of 100 applications that lidar is used for today:
100 Real-World Applications of LiDAR Technology

So yes, lidar has already proven to be very useful in autonomous driving and outside of autonomous driving.

This is a good one... https://levelfivesupplies.com/high-resolution-lidar-ups-the-game-in-localisation-for-autonomous-racing/

"To get an edge in the competition the ARG team opted to pursue a newer approach to localisation: high-resolution LiDAR combined with 3D mapping."
 
I'd done research on Tesla cars, and I remain very unimpressed and bitter about the quality and the service. Hope to be rid of this POS car as soon as my wife is over it (soon enough there will be a 2020 LR+FSD for sale). I don't see the hype... sorry.
So, in order to help yourself through this "tough" time you thought it would be a great idea to go on a Tesla owners forum and $#!t, I mean, let us know your prejudice?
 
So, in order to help yourself through this "tough" time you thought it would be a great idea to go on a Tesla owners forum and $#!t, I mean, let us know your prejudice?

A forum is not a fan club. A forum is for discussing issues: joys, accolades, problems, and complaints. The materials, fit, and finish of Tesla cars are far below other cars in their price class, and it's frustrating. If you're not familiar, go to a BMW dealership and sit in cars in the $40,000-$80,000 price range. The interior materials, fit, and finish on every one of those cars destroy every Tesla up to $140,000. Clearly we pay for the batteries and R&D, but this year we mostly paid for Elon's obscene salary. Materials, fit, finish, service, and customer service should all be better for cars in this price range.

I enjoy driving my Model S, I am happy with how Tesla is pushing flexible energy, and I am long TSLA, but every day I wish I could have either paid less or gotten a higher-quality driving experience. I also wish I had realized that the CEO was lying about FSD before I paid for it. And when I see Elon getting obscene billions while we deal with these shortcomings, it is frustrating.
 
Kona EV Ultimate because of its lifetime battery warranty, build/material quality (based on my experience with Hyundai cars), and simple driver assist... nothing pretentious. Leaf SL Plus because of its price. So that was, and still is, my thinking in order of priority. I'd done research on Tesla cars, and I remain very unimpressed and bitter about the quality and the service. Hope to be rid of this POS car as soon as my wife is over it (soon enough there will be a 2020 LR+FSD for sale). I don't see the hype... sorry.
Would be interesting if you made a video with your thoughts and optionally a comparison.
 
Is this Tesla's answer to not being able to see stop signs at night that are not illuminated?
What are you basing this on?
The HDR cameras can see serious detail even with direct/blinding sunlight. The NNs might not be trained yet to handle the scenario, but I am curious what your opinion that the car cannot "see" is based on.

For instance, green did an experiment where he could not see the traffic light because of the sun, but he took snapshots from all the AP cameras and you could clearly see the traffic light and its state/color... here is the link to that post from green: AP2.0 Cameras: Capabilities and Limitations?
So it is possible the NN is not yet able to handle stop signs in low light, but that has nothing to do with whether or not the cameras can see them.
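
To make that "the camera captured it, the NN just didn't act on it" distinction concrete, here is a rough sketch of the kind of sanity check you could run yourself on a saved snapshot: crop the region around the sign or light and check whether the pixels still carry local contrast, or whether the sensor actually clipped to pure black/white. The file name, crop coordinates, and thresholds are made up for the example; this is not how green (or Tesla) actually analyzes the frames.

Code:
# Rough sanity check on a saved camera frame: is there usable detail in the
# region of interest, or did the sensor clip? (Illustrative only; the file name,
# crop, and thresholds below are invented examples, not anyone's real tooling.)
import cv2
import numpy as np

frame = cv2.imread("ap_camera_snapshot.png", cv2.IMREAD_UNCHANGED)  # hypothetical dump
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY) if frame.ndim == 3 else frame

roi = gray[200:360, 540:700].astype(np.float32)   # hypothetical box around the sign

max_val = float(np.iinfo(gray.dtype).max)         # 255 for 8-bit, 65535 for 16-bit dumps
crushed  = np.mean(roi <= 2)                      # fraction of pixels crushed to black
blown    = np.mean(roi >= max_val - 2)            # fraction of pixels blown out to white
contrast = cv2.Laplacian(roi, cv2.CV_32F).std()   # local edge/texture energy

print(f"crushed: {crushed:.1%}, blown out: {blown:.1%}, contrast: {contrast:.1f}")

# If clipping is low and contrast is non-trivial, the sensor captured the detail,
# so a missed detection is a software/NN problem rather than a "camera can't see it" problem.
if crushed < 0.2 and blown < 0.2 and contrast > 5:
    print("Detail present in the raw frame -> this would be an NN issue, not a sensor issue.")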
 
...somewhere. It was about 8+ months ago.
makes sense....

We have concrete evidence that the HDR cameras on our Teslas are able to capture enough detail in pretty extreme conditions (from direct sun to dark outside).
The NNs that are trained for detection in extreme conditions might not be quite there yet, but at this point there isn't much of a case that the sensors are not capable and thus "need updated headlights to account for <<some FSD case>>".

For a good list of videos, you can peruse green's YouTube channel here: https://www.youtube.com/user/greentheonly/videos
 
Hi all, I've read through everyone's comments and concerns, and unfortunately I feel there's a lot of misleading information on both sides of the fence in this situation. In my opinion, the evidence currently sits somewhere in the middle.

Firstly, there are a lot of things I'm going to cover in this post, so please bear with me.

Let's cover the 'Phantom Braking' issue first. Yes, this is a real thing which unfortunately still occurs in Teslas on rare occasions. However, from my findings, I've noticed it usually occurs in specific locations. That leads me to believe it could be a mismatch between the speed limit stored in the system and the actual speed limit of the road. It occurs so suddenly and passes so quickly that it seems the vehicle is reacting to a sudden speed limit decrease in that area, then correcting itself once the reading proves false. I've also noticed this in a very high-traffic area, where traffic is backed up 50-60% of the time. Additionally, that area was under construction for nearly 2 years, and the speed limit was lowered via signs. My assumption is the vehicle slows down because it still flags the area as a construction zone, even though the signs have been removed and the speed limit is no longer reduced. If either of these is the case, I believe this should be fixed, or be on the path to being fixed, with the coming updates that detect and read speed limit signs, since the current data in the system is giving a false reading for that location.
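
If that theory is right, the fix is basically a data-reconciliation problem: when the map's stored limit disagrees with what the cameras just read off a sign, prefer the fresher sign reading and never let a stale map value cause a hard brake. Here is a tiny hypothetical sketch of that logic, just to illustrate the idea being described; it is not Tesla's actual Autopilot code, and every name and number in it is invented.

Code:
# Toy illustration of the "stale map speed limit" phantom-braking hypothesis.
# Not Tesla code; all names and numbers here are invented for the example.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SpeedLimitSources:
    map_limit_kph: float              # limit stored in the nav/map database (may be stale)
    sign_limit_kph: Optional[float]   # limit most recently read from a sign by vision
    sign_age_s: float                 # how long ago that sign was seen

def choose_target_speed(src: SpeedLimitSources, current_target_kph: float,
                        max_decel_step_kph: float = 5.0) -> float:
    """Prefer a fresh vision reading over a stale map value, and rate-limit any
    slowdown so a bad map entry cannot trigger a sudden hard brake."""
    if src.sign_limit_kph is not None and src.sign_age_s < 60.0:
        limit = src.sign_limit_kph    # trust the recently observed sign
    else:
        limit = src.map_limit_kph     # fall back to the map database
    # Never drop the target by more than a small step per update.
    return max(limit, current_target_kph - max_decel_step_kph)

# Example: the map still thinks this stretch is a 60 km/h construction zone,
# but the cameras just read a 100 km/h sign -> no sudden slowdown.
sources = SpeedLimitSources(map_limit_kph=60, sign_limit_kph=100, sign_age_s=12)
print(choose_target_speed(sources, current_target_kph=100))   # 100.0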

Regarding the main thread discussion, the 'Quantum Leap': in this regard I do not believe it will be a quantum leap in the actual capabilities of the vehicles. Tesla has always put out updates in incremental steps, and tests features gradually to gather better information and improve accuracy. This situation will likely be no different. I'm not saying you won't receive a rough 'feature complete' version of the software in the next update; I just don't believe it will be a literal quantum leap. Yes, it may be a feature-complete version, but there will likely be a lot of interventions required by drivers. That's okay, there's nothing wrong with it, as it incrementally progresses closer to the end goal. I feel a lot of people are anticipating a near-perfect system with this rewrite and new software, but that will likely not be the case. I could be wrong, but based on previous statements, software updates, etc., it's likely there will be some great new features that still require more driver input than most are expecting from this rewrite. A good example was one user pointing out that even Waymo has interventions from time to time, simply because the driver is unsatisfied. That's to be expected, especially with a brand new software rewrite and updated features. So be vigilant out there.
What I believe the Quantum Leap Elon is referring to is the data that will be collected and the information that will be gathered thanks to this new rewrite. Instead of the '2.5D' information, the rewrite will allow Tesla to collect data more efficiently and more accurately before feeding it into the Neural Network. The network will be able to analyze this data at higher rates and recognize errors or flaws quicker. However, this doesn't mean the next update after the rewrite will bring drastic improvements. Of course it might, but not based on the Project Dojo timeline. We are anticipating Project Dojo to ramp up over the next year, giving Tesla one of the most powerful computers on earth (at this current time) to drive its neural network and machine learning. Once that system is fully operational is when we will see the 'March of 9s' start to take place. Until then, we can only expect a rough Level 3 autonomous vehicle from Tesla's vision system. Again, I could be wrong, but we know the vision system requires far more data than the traditional vision + radar + lidar suite. (Yes, I'm aware Tesla uses radar and ultrasonic sensors, but let's just call it a Vision System for ease. And let's call anything with Lidar+ just 'Lidar'.) We will likely see some great new features and updates roll out over the next 3-10 months, but until Dojo is fully operational and analyzing data from the upcoming software rewrite, it's a bit naive to assume Tesla will achieve Level 4 or 5 with the rewrite alone. Thus, I believe the rewrite will allow the quantum leap to occur once Dojo is operational.
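(For anyone wondering what '2.5D' versus '4D' even means here: the usual reading is per-camera, per-frame outputs versus one representation fused across all cameras and across time, i.e. 3D space plus time. Below is a deliberately tiny toy sketch of that fusion idea, stacking per-camera detections of one object across a few timestamps into a single track and estimating its motion. It is not Tesla's architecture; all the numbers and names are invented purely to illustrate the concept.)

Code:
# Toy "4D" fusion sketch: merge per-camera, per-frame detections of one object
# into a single track in vehicle coordinates over time. Purely conceptual;
# not Tesla's rewrite, and all data here is made up.
import numpy as np

# Hypothetical detections: (timestamp_s, camera_name, x_m, y_m) of the same car,
# each camera reporting in the vehicle frame with a little noise.
detections = [
    (0.00, "main",   12.1, 0.4),
    (0.00, "pillar", 12.3, 0.5),
    (0.05, "main",   11.6, 0.4),
    (0.05, "pillar", 11.8, 0.6),
    (0.10, "main",   11.2, 0.5),
    (0.10, "pillar", 11.1, 0.4),
]

# Fuse across cameras: average all detections that share a timestamp (a "3D" snapshot).
times = sorted({t for t, *_ in detections})
fused = np.array([
    np.mean([[x, y] for t2, _, x, y in detections if t2 == t], axis=0)
    for t in times
])

# Add the time axis (the "4D" part) and estimate velocity from consecutive snapshots.
dt = np.diff(times)
velocity = np.diff(fused, axis=0) / dt[:, None]
print("fused positions (x, y) per frame:\n", fused.round(2))
print("estimated velocity (m/s) between frames:\n", velocity.round(2))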
That being said, I do believe the current hardware will handle Level 4 and, based on the NHTSA standard, most Level 5 situations comparable to current human capability. To break it down: the argument that human drivers currently operate on a vision system, so a vehicle should be able to do so too, is a true statement. One can always argue that neural nets are not as accurate as human brains, or that the technology might not be as good as a human, but however you state these arguments, they ignore the progress of technology. We currently have camera technology that sees better than humans, and computers that function at greater capacity than us for dedicated tasks. Self-driving is a dedicated task; we are not also teaching the car to write a book, play jump rope, or grow a garden. A computer with a dedicated task can certainly outperform a human. I'm not stating that these cameras or computers are currently in Teslas, only that it's inaccurate to assume a vision system will never work for self-driving, especially with 360° cameras compared to a human head with 2 eyes on a swivel.
Additionally, we have seen how Autopilot improves the safety of the vehicle and allows it to drive on the highway at arguably Level 3 if you remove the steering wheel nag. This current system suggests the problem is mostly software-based. Highway driving is less demanding and an easier problem to solve, which is why it's available first in Tesla's system compared to city driving. They are working their way up to the more difficult problems, and city driving, with more complex lane situations, cyclists, pedestrians, etc., is far more difficult than highway driving. More difficult, however, is not necessarily impossible with a vision system; it could simply be a matter of better software. At this time, it is my opinion that the current hardware in Teslas will be capable of full self-driving, to the degree of a perfectly accurate and always 99.999...% functioning human being. I cannot expect a vision system to see much beyond what I can, but that is how our current driving world is designed anyway. I said Tesla will be capable of self-driving at Level 5 in most situations (based on the NHTSA standards) as long as those situations are comparable to current human driving conditions: snow, rain, etc. As for 'all conditions', I take that to mean driving in a white-out blizzard or a similar situation where a human is best to stop, and I do not see a Tesla exceeding conditions such as that. It may one day be possible, but I believe different hardware would be required for driving in situations where a human cannot visibly see anything. Again, I cannot expect a vision system to see much more than what a human is capable of seeing.
Solving the self-driving problem is much more difficult when only using a vision system. As one user mentioned, it's like teaching a human to see, and to learn everything about the crazy visual world and driving, at the same time. It takes time, and a lot of data. A Lidar system is more reliable and easier to develop. We've seen this with Waymo. Yes, Waymo does have Level 4 self-driving vehicles; that is a fact which some are trying to argue isn't real. Just because none of us has experienced self-driving doesn't mean Waymo or even Tesla doesn't have a system capable of it, and based on what is taking place on the road, Waymo has Level 4 autonomy available in very limited amounts. Of course you can't buy any vehicle with full self-driving, so let's leave that out of the conversation. Let's also leave prices out of the conversation, because we all know advances in technology and increases in production and availability bring costs down. Now before I get too far ahead, it's important to mention that in 2018 Waymo's vehicles with safety drivers had been involved in dozens of accidents, mostly at low speeds, and most of the autonomous miles collected by Waymo had occurred with a safety driver in place. That doesn't mean the vehicle was at fault in these collisions; it could have been a human taking over and causing an incident as well. The one thing that concerns me is that I hope Waymo has far more than only 10 million miles of collected data. For a company operating in more than 25 cities, with some systems running 24/7, I would hope it's 5-10x that. I could be wrong with my information on Waymo; I'm just basing it off the limited research I've done and what I've read here. However, I would be curious to see updated data for Waymo on collisions, takeover rate, and the percentage of miles driven with and without safety drivers. Nonetheless, Waymo is out there self-driving right now, and the reason for that has to do with Lidar, not the amount of data they have. Lidar makes the system a lot more accurate, and certainly requires a lot less data. Instead of teaching the car how to see visually in the real world and base everything off that, lidar adds an extra sense and allows the vehicle to reach out and touch things. It's easier to train something that has more capabilities, and it allows for more redundancies. Basing everything on a vision system alone is far more complex, which is also why Tesla is not as far along as Waymo when it comes to self-driving. Arguing that Tesla doesn't have extra redundancies is fair, but it's silly to ask what would happen if a camera failed on a Tesla. At the same time, what would happen if a lidar unit failed on a Waymo? I assume the car would slow down and try to pull to the side of the road with its hazards on, if it can do so safely; ultimately it will stop the car in the safest way possible. Redundancies are great, but you also have to anticipate failure in everything. Thus the vehicle should do its best to keep any occupants and nearby drivers as safe as possible.
Basing everything on vision is a different approach from Lidar, and certain scenarios work better with vision. For a quick example, identifying a small tire-track trail to someone's country house would require a vision system, not HD maps or lidar. Of course a Lidar system which also has vision could recognize this, but it's unknown whether the companies using Lidar are working on this problem or are focused on other scenarios. This is a scenario Tesla is/will be focused on, as it's considered a driving path and will need to be drivable, especially if it leads to an owner's house. I believe it's situations like these that explain why we see Waymo mapped to specific areas, because beyond the traditional city the real world can get even weirder. (I actually know someone who has a small dirt path like this leading to their house, with a thin wooden bridge over a set of train tracks so you can always safely cross to get there... talk about weird.) These are unique scenarios in which Tesla hopes to have greater success thanks to its vision system. Both companies are trying to solve the same problem, approaching it in similar yet different ways. One approach has so far led to a quick jump ahead but minimal progress since, whereas the other has been slow and tedious. I believe once Project Dojo is operational, sometime mid to late next year, is when we will really see the fine-grained accuracy gains come to fruition from Tesla's approach. Until then, this rewrite will give us some new features and the capability to get there, but they will still require intervention at a typical rate. Again, I could be wrong, and one could argue a rough version of 'feature complete' is a quantum leap compared to what we have now, but it's a bit presumptuous to assume it will jump from Level 2 to Level 4 soon after this rewrite. We may get a lot of amazing features all at once, but not without staying vigilant and keeping those hands on the wheel for the nag. The quantum leap will likely come when Dojo is able to analyze the rewrite's data and information, and accurately carve out a better picture of our wacky world. Take it with a grain of salt, but I think we'll see the start of the March of 9s sometime later in 2021. And if the Dojo Project is the quantum leap capability, I think we would quickly see it improve from there, in all real-world cases.