Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

What will happen within the next 6 1/2 weeks?

Which new FSD features will be released by end of year and to whom?

  • None - on Jan 1 'later this year' will simply become end of 2020!

    Votes: 106 (55.5%)
  • One or more major features (stop lights and/or turns) to small number of EAP HW 3.0 vehicles.

    Votes: 55 (28.8%)
  • One or more major features (stop lights and/or turns) to small number of EAP HW 2.x/3.0 vehicles.

    Votes: 7 (3.7%)
  • One or more major features (stop lights and/or turns) to all HW 3.0 FSD owners!

    Votes: 8 (4.2%)
  • One or more major features (stop lights and/or turns) to all FSD owners!

    Votes: 15 (7.9%)

  • Total voters: 191
It would have been the easiest thing, back in 2016 and again in 2019, for Elon not to use "Level 5" when describing AP2/HW3 and to use some other wording instead. Both statements were made at major Tesla events, streamed for the world to hear.

It would have been the easiest thing back in January 2017 for him not to tweet that FSD differentiating features were definitely coming in 6 months, maybe even in 3 months, but he did. If he learned otherwise, he could have retracted promptly, but he never did.

Same with sleeping in the car and the coast-to-coast drive promised for 2017. He made these announcements, never retracted them, got massive PR for them, people paid good money for consumer products based on them, and now Tesla needs to deliver what has been sold.
 
Tesla is giving some EA access to the traffic light response feature that actually stops the car:

3rd Row Tesla on Twitter

"So the big surprise this week is this:

Even though only a select few Early Access users will have the code that actually stops the car enabled...

EVERYONE with Hardware 3 gets to see the full self driving visualization."

------------

So EA is getting more than just the visualizations!

I think this is a good indicator that Tesla is moving closer to rolling out "feature complete" on AP3.
 
... Unless he was not truthful of course.
Yes, he has not been truthful in these statements for many years. In fact he has been wrong with every prediction he has made about driverless driving. One should expect the past to repeat itself. He is way overly optimistic. In the end, history won't remember much of the bumps, bruises, and blood. It will just remember that an incredible technology was delivered, one that no one else would dare to attempt, a little late, but that is typical with breakthrough tech.

I agree that those who paid for the failed dream are due compensation.
 
If they retrofit it, that is of course one thing, and that would be a mitigating factor.

But Elon is on record announcing Level 5, no geofence, feature complete for HW3 by the end of this year, and robotaxis next year on HW3. That must mean the hardware can do FSD.

Unless he was not truthful of course.

Or he was telling the truth as he believed it, and he was wrong.

He needs to sit down with his programmers and let them tell him how difficult it really is to build a driverless car. Then he needs to sit down with his legal department and let them tell him the legal consequences of making promises his programmers cannot keep. Then he needs to sit down with his engineers and his programmers and work out a realistic time line for autonomy: Feature A in beta by this date, and city NoA at Level 3 by this other date, and a driverless car by year X, etc.
 
He needs to sit down with his programmers and let them tell him how difficult it really is to build a driverless car. Then he needs to sit down with his legal department and let them tell him the legal consequences of making promises his programmers cannot keep. Then he needs to sit down with his engineers and his programmers and work out a realistic time line for autonomy: Feature A in beta by this date, and city NoA at Level 3 by this other date, and a driverless car by year X, etc.
If he was interested in that he would have done it after the first few failed predictions. But instead he keeps doubling down on it. Why? Because he has learned that the hype sells cars and drives up the share price. This is quintessential Silicon Valley.
 
... In the end, history won't remember much of the bumps, bruises, and blood. They will just remember that an incredible technology was delivered, that no one else would dare to do, a little late, but that is typical with break through tech.

Except that automotive autonomy is a field that many companies are working on, and Tesla is probably not the leader. That would be Waymo.

"... no one else would dare..." is way off the mark. Tesla introduced the Roadster when no one else would dare to make an electric car that was not a golf cart. Today, pretty much everyone is daring to try to develop a driverless car.

Tesla is, however, the only company daring to take people's money under a contractual obligation to deliver certain features that it keeps failing to fulfill. And Tesla is the only company daring to call a set of Level 2 features "FSD," or to use the term "feature complete" in its strictest technical meaning (that unreliable alpha code is ready to be tested) when speaking to a public that does not understand the jargon and can only imagine that he's saying "driverless car" when he says "feature complete."

Tesla is today making the best cars on the road. But it's promising, years too early, that they'll be capable of driverless operation.
 
If he was interested in that he would have done it after the first few failed predictions. But instead he keeps doubling down on it. Why? Because he has learned that the hype sells cars and drives up the share price. This is quintessential Silicon Valley.

It used to be that when you told a lie, the best way to handle it long term was to admit fault and take corrective action.

But in recent years it seems like denying you've lied, and continuing to deny it, has become mainstream, even to the point where new terms like "alternative facts" have become commonplace. It got someone with zero political experience the job of President. So obviously it's not a bad strategy.

The other thing to consider is that large-scale autonomous driving is itself a lie. It's going to require a significant amount of momentum to force it. To this date no one is doing large-scale full self-driving, or even Level 3 driving. They're not because they can't break through all the barriers (regulatory, liability, etc.).

What better way to create momentum than to pretend to do it? Tesla has every intention of doing full self-driving, and there is no denying that. The only question is how far the NHTSA will allow them to go before saying, "Okay, guys. This is getting too dangerous and you need to stop making your owners responsible for city NoA."

I think in 2020 we're going to find out just how far the NHTSA lets them go.

The European regulators have already put restrictions in place that Tesla is trying to get them to undo.

As soon as the regulators crack down then Tesla will blame the regulators for failing to achieve FSD.

This means that Elon will never own up to the biggest lie of them all. The biggest lie wasn't the timetable, but that the sensor suite in HW2/HW2.5/HW3 could even do it.

I wouldn't be the least bit surprised if the greatest lie ever told in the automotive world was also the most necessary lie of them all.
 
Except that automotive autonomy is a field that many companies are working on, and Tesla is probably not the leader. That would be Waymo.
How many driverless cars will you be able to buy? Will you be able to own a Waymo? Tesla is the only one in this field at present.
"... no one else would dare..." is way off the mark.
Same as previous comment. Tesla is the only company bringing driverless to your garage.

Tesla is the only company daring to call a set of Level 2 features "FSD,"
Yep, Tesla has its faults, just like every other company. Tesla would be better off calling it PSD.
or to use the term "feature complete" in its strictest technical meaning, that unreliable alpha code is going to be tested, when speaking to a public who does not understand the jargon and can only imagine that he's saying "driverless car" when he says "feature complete."
I think people are a little brighter than you think. The people I know just don't believe it, and are very skeptical. Tesla can say what it wants, but people don't put faith in it until it is delivered.

Just because there is a Wikipedia page about "feature complete" doesn't make it a strict technical term. I have more years in the software engineering business than I care to admit. "Feature complete," to me, means a rookie coder coding something he hopes will work but almost never does.

But it's promising years early that they'll be capable of driverless operation.
Yes, Elon has consistently made this mistake. The Model X was several years late.
 
Funny you mention it because GM has actually petitioned the NHTSA for permission to deploy 2,500 Bolt robotaxis on public roads:
GM pushes feds to approve Chevy Bolts with no steering wheel - Electrek

Yeah, I saw that, and I'm really curious to see what happens, as hopefully it sets a precedent for other manufacturers.

Now, it is a bit different, as that's for a Level 4 robotaxi, whereas Tesla is trying to turn a Level 2 vehicle into a Level 4 vehicle.

I don't know how far the NHTSA will allow Tesla to go with Level 2 before requiring them to get similar approval for Level 4.

I'm also really curious to see how wide an area, and in what weather conditions, the Bolts are allowed to operate.
 
If he was interested in that he would have done it after the first few failed predictions. But instead he keeps doubling down on it. Why? Because he has learned that the hype sells cars and drives up the share price. This is quintessential Silicon Valley.

I see two possible reasons:

1. He is telling the truth and Tesla is far more advanced than cynics expect when it comes to autonomous driving.

2. He is indeed doing what you say i.e. lying for personal / business gain.

We don’t know which one.

I'm not buying the third option, that he just doesn't know any better by now. That could have explained the first mistakes in 2014 or 2016 with early Autopilot projects, but not the 2019 Autonomy Investor Day.
 
I see many parallels between Waymo and Tesla. Both truly believed a few years ago that driverless would come the next year. The following year, both truly believed the same of the next year again, and both failed. Now both are saying the same thing: next year will be the year. :) You know what you should believe: nope.

I've spoken to people at Waymo. I know people have left. I know they have many engineering openings. This is hearsay and exaggerated: Waymo originally wrote the software for maximum safety. A couple of years ago, they decided that wouldn't work; they had to take risks, so they decided all the software had to be rewritten.

Still today, the culture I perceive at Waymo is one of risk aversion. They won't risk spilling blood. Because I believe that spilling blood is necessary, I don't see how Waymo can succeed. They will continue to have marketing stunts, but since there is danger, there will never be a wide release. That could change quickly if venture capitalists took over the company. I hope I'm very wrong and Waymo widely succeeds soon.
 
Now, it is a bit different, as that's for a Level 4 robotaxi, whereas Tesla is trying to turn a Level 2 vehicle into a Level 4 vehicle.

Tesla is aiming to make FSD self-driving in the entire continental US, with no geofencing, on all roads, day and night and in all weather. By definition, that's actually L5, not L4! So Tesla is aiming to go from L2 to L5 in the continental US. It would be L4 if Tesla decided to restrict FSD to only certain roads or only certain conditions in order to make the FSD work.

I don't know how far the NHTSA will allow Tesla to go with Level 2 before requiring them to get similar approval for Level 4.

As long as Tesla requires the driver to pay attention and keeps the nags as the means to ensure driver attention, I imagine the NHTSA will give Tesla a lot of leeway. Driver attention is a nice crutch: if your system is not good enough, you can always fall back on "the driver should pay attention, and there are nags to ensure they do." However, to be certified as autonomous, L3 and above, I would imagine the NHTSA would require Tesla to prove that FSD meets some standard of safety. If Tesla cannot prove that FSD meets that standard as an autonomous system, then the NHTSA would presumably require driver attention.

The tricky part for Tesla is that they are going for L5, because they are developing FSD to work on thousands of cars spread all over the continental US, on all roads, day and night, etc. This means Tesla has to show that FSD is reliable enough not just in one area, but in the entire US, on all roads and in all conditions. That's a very high bar to meet. Presumably Tesla can use the disengagement rate from AP to show this, but they still need to get the disengagement rate of the fleet in the entire continental US good enough.

That is one advantage that companies like Cruise and Waymo have. By only going for L4, geofenced to a small urban area, they don't need to show that their robotaxis can work safely everywhere; they only need to show that they work in a particular geofenced area. It's a lower bar.

I'm also really curious to see how wide of areas, and weather conditions the Bolts are allowed to operate.

I imagine the Bolts will be tightly geofenced to certain areas where the system has been fully tested. So, if Cruise has disengagement data and safety data to show that their robotaxis are reliable enough in a particular geofenced area, then the NHTSA could give permission to operate only in that geofenced area.
 
Tesla is aiming to make FSD self-driving in the entire continental US, with no geofencing, on all roads, day and night and in all weather. By definition, that's actually L5, not L4! ....
I find the argument silly. You have to walk before you can run. We will be impressed when Tesla can reach level 3. They won't be able to do level 4 before level 3. And they won't be able to do level 5 before level 4. Level 5 is 10 years away, when it can work in all conditions like snow, ice, dust storms, etc... We will be at hardware 6 before we get to level 5.
 
Tesla is aiming to make FSD self-driving in the entire continental US, with no geofencing, on all roads, day and night and in all weather. By definition, that's actually L5, not L4! So Tesla is aiming to go from L2 to L5 in the continental US. It would be L4 if Tesla decided to restrict FSD to only certain roads or only certain conditions in order to make the FSD work.

As you are probably well aware, I throw out Level 5 because not even humans can do that. Everyone has roads and conditions that they limit their driving to.

Plus, Tesla can't even do all weather, or even most weather, right now with Level 2. As an example, just the other day my Tesla told me it couldn't do NoA because of poor weather. I thought it was pretty cool that it knew things were kinda crappy. But they weren't that bad. Just some atmospheric river cloud is all. :p

As long as Tesla requires the driver to pay attention and keeps the nags as the means to ensure driver attention, I imagine the NHTSA will give Tesla a lot of leeway. Driver attention is a nice crutch: if your system is not good enough, you can always fall back on "the driver should pay attention, and there are nags to ensure they do."

I'm not sure that's going to work for City NoA. The issue is how fast situations arise while driving in the city versus the delay in a human's ability to re-engage with the situation at hand. With highway NoA you typically have room to correct for any mishaps or mode uncertainties (the "who's in control" aspect). Even in that highly controlled environment, it still resulted in numerous accidents when the human failed to oversee the machine. But now it's going to be used in a highly uncontrolled environment where pedestrians and drivers are highly unpredictable.

I think we have to think of it as a grand experiment. We don't know how the experiment will go, but there is ample evidence to suggest the experiment can go horribly wrong.
 
If done logically, which is probably a safe assumption, there's a high probability the data is being sampled only when it is relevant to a pending or higher-priority challenge to solve, like coming across the infrequent animals native to my locale, or that one spot where AP always disengages, etc.
Regardless, I would certainly call even 10-50 TB of data per day massive...

I guess what I'm saying is that there is very strong evidence that they are specifically not doing that. Several people inspecting the firmware agree that there is no ongoing task to collect things the computer doesn't properly categorize, nor is the computer apparently flagging instances where you and AP disagree. This may have changed very recently, but that seems to be the consensus as of this summer.
 
As you are probably well aware, I throw out Level 5 because not even humans can do that. Everyone has roads and conditions that they limit their driving to.

You are changing the definitions to make human drivers only L4. That's not what the SAE levels mean; you misunderstand them. L5 is defined as having the same ODD as humans.

From the SAE document:

“Unconditional/not ODD-specific” means that the ADS can operate the vehicle under all driver-manageable road conditions within its region of the world. This means, for example, that there are no design-based weather, time-of-day, or geographical restrictions on where and when the ADS can operate the vehicle. However, there may be conditions not manageable by a driver in which the ADS would also be unable to complete a given trip (e.g., white-out snow storm, flooded roads, glare ice, etc.) until or unless the adverse conditions clear. At the onset of such unmanageable conditions the ADS would perform the DDT fallback to achieve a minimal risk condition (e.g., by pulling over to the side of the road and waiting for the conditions to change)."

Plus, Tesla can't even do all weather, or even most weather, right now with Level 2. As an example, just the other day my Tesla told me it couldn't do NoA because of poor weather. I thought it was pretty cool that it knew things were kinda crappy. But they weren't that bad. Just some atmospheric river cloud is all. :p

FYI, Tesla does not need to handle all weather conditions to be L5. It just needs to handle all weather that humans can drive in. Having said that, if the hardware can't even handle weather conditions that humans can, as in your example, then Tesla would be forced to forget L5 and drop down to L4.
 
Still today, the culture I perceive at Waymo is one of risk aversion. They won't risk spilling blood. Because I believe that spilling blood is necessary, I don't see how Waymo can succeed.
Taking risks and "spilling blood" is easy as long as you don't have to assume liability. That works with driver assistance systems (where you can always blame the driver), but not with full autonomy.
 