
Elon: "Feature complete for full self driving this year"

Ok, I confess. The only FSD improvement I expect this summer is “enhanced summon”. All the other FSD skills, IMO, are likely to require HW3 with its full resolution cameras. I’m not even expecting street sign recognition with the old computer in HW2.5.

So I think we need to await HW3 upgrade schedules. Note that TeslaFi reports only a few score HW3 members to date: TeslaFi.com Firmware Tracker

It seems like there should be more; May Model 3 production should have been around 24,000 vehicles, maybe 30,000 including Model S and Model X. Not everyone uses TeslaFi, but fewer than 100?
 
Ok, I confess. The only FSD improvement I expect this summer is “enhanced summon”. All the other FSD skills, IMO, are likely to require HW3 with its full resolution cameras. I’m not even expecting street sign recognition with the old computer in HW2.5.

I am a bit more optimistic than you. I think we will also get "traffic light recognition" and a basic "automatic driving on city streets" this year. Those features are promised this year. So I think Tesla will try their best to deliver something this year but it may be a basic version and not the full feature right away, similar to how we got NOA with confirmation first and NOA without confirmation later. So Tesla may give us a basic version of "automatic driving on city streets" this year and a more developed version next year. Like with NOA, this would allow Tesla to incrementally validate "automatic driving on city streets" with fleet data and help them improve it. I do think that we need to wait for AP3 in order to get the best version.
 
Tesla expects the driver to be vigilant and take over even if the car doesn't request intervention.
Tesla doesn't define states of autonomous driving. Look, it's an arbitrary number. If you want to call it zero, call it zero. Tesla is NOT advertising it as some number. Tesla IS saying they will, shortly, have a fleet of Level 5 cars in service, generating revenue, and that owners can place their Teslas either in service in the Tesla Network or out of service. The rest is arguing over how many angels can dance on the head of a pin.
 
Tesla so far requires supervision, making them Level 2.
Whatever you say, boss. I'd say they very sporadically require intervention. This is a silly argument, "sound and fury signifying nothing" as the saying goes. When it asks you to take over or you get the 3 loud beeps, it's generally too late. But, again, I am done with this thread. Tesla is not claiming ANY SAE level, so does that make it zero? Who knows, who cares. Argue on, comrades. Just know you're doing it without me.
 
Whatever you say, boss. I'd say they very sporadically require intervention. This is a silly argument, "sound and fury signifying nothing" as the saying goes. When it asks you to take over or you get the 3 loud beeps, it's generally too late. But, again, I am done with this thread. Tesla is not claiming ANY SAE level, so does that make it zero? Who knows, who cares. Argue on, comrades. Just know you're doing it without me.

No, it is not too late, because the car needs to be designed to handle everything; that is what makes Level 3 so hard compared to Level 2. This is very clearly laid out in the J3016 paper that defines these levels. SAE Level 3 requires that the car give a pre-specified amount of time for the request to take over, so that the driver is not required to supervise the drive at all while in Level 3 autonomous mode. Enough that they can, for example, read a book; something like 10 seconds or so.

Level 3 also specifies that the car remains in control until the driver takes over; it cannot just drop control after those 10 or so seconds, it has to stop the car. On Level 3 it can do that in-lane (on Level 4-5 it needs to achieve a minimal risk condition, so an in-lane stop is not okay there).

For Tesla to reach Level 3 they would have to remove the requirement to supervise the drive by watching the road and replace it with a timed request to take over when taking over is needed, while keeping the car responsible for the drive until the driver takes control or the car comes to a controlled halt.
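A minimal sketch of that handoff logic as I read it (my own illustration, not anything from the J3016 text; the 10-second window and the in-lane stop as the fallback are assumptions taken from the description above):

```python
# Hypothetical sketch of a Level 3 takeover handoff, per the description above.
# Assumptions: a 10-second takeover window and an in-lane stop as the fallback
# if the driver never responds. Not an actual Tesla or SAE specification.
import enum

class Mode(enum.Enum):
    L3_DRIVING = "system driving, no supervision required"
    TAKEOVER_REQUESTED = "timed request issued, system still driving"
    DRIVER_DRIVING = "driver has taken over"
    STOPPED_IN_LANE = "system brought the car to a controlled in-lane stop"

TAKEOVER_WINDOW_S = 10.0  # assumed pre-specified takeover time

def step(mode: Mode, seconds_since_request: float,
         driver_took_over: bool, needs_takeover: bool) -> Mode:
    """Advance the handoff state machine by one decision."""
    if mode is Mode.L3_DRIVING and needs_takeover:
        return Mode.TAKEOVER_REQUESTED
    if mode is Mode.TAKEOVER_REQUESTED:
        if driver_took_over:
            return Mode.DRIVER_DRIVING      # responsibility passes to the driver
        if seconds_since_request >= TAKEOVER_WINDOW_S:
            return Mode.STOPPED_IN_LANE     # the system may not simply drop control
    return mode

# Example: the driver never responds, so the car stops itself in lane.
m = step(Mode.L3_DRIVING, 0.0, False, True)   # -> TAKEOVER_REQUESTED
m = step(m, 12.0, False, True)                # -> STOPPED_IN_LANE
print(m)
```

The point of the sketch is just that there is no state in which the system silently hands the wheel back: either the driver takes over or the car stops itself.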
 
@diplomat33 Doesn’t Tesla call it ”Automatic driving on city streets”, where does the ”Automatic City Driving” come from?
What about suburban streets? Yeah, I don't understand that distinction, except perhaps NoA City means it can navigate downtown Manhattan. Sure, if it can do that, I trust it can do rural and suburban areas. It pretty much works fine for me in my neck of the woods, except for reading speed limit signs (which often don't match the geocoded speed limits), stop signs, and left or right turns at intersections, whether controlled by a stop sign, light, or yield sign.
As an aside, it's the SAE autonomy-level conversation I'm bowing out of, not the definition of feature complete, which I'm familiar with given 45+ years in software development. I would have thought NoA City wouldn't exist: once they reach that point, drop NoA completely and just say FSD. But then AP includes NoA but won't include NoA City, so perhaps that is the distinction.
 
For Tesla to reach Level 3 they would have to remove the requirement to supervise the drive by watching the road and replace it with a timed request to take over when taking over is needed, while keeping the car responsible for the drive until the driver takes control or the car comes to a controlled halt.
Yes, I know, I am violating my own statement. I would posit the requirement to supervise is more an anti-litigation device than a "hey, SAE says you must be prepared to take over in the event of...". I would guess every family of someone who died in a Tesla has sued Tesla. Without that stated requirement they'd lose every suit. I don't care which SAE level they are at; I use EAP to the max I can, even on roads with no center line. You want to advocate that it's SAE 2, fine. This thread is about feature complete. OK, with that, @wcorey out.
 
Yes, I know, I am violating my own statement. I would posit the requirement to supervise is more an anti-litigation device than a "hey, SAE says you must be prepared to take over in the event of...". I would guess every family of someone who died in a Tesla has sued Tesla. Without that stated requirement they'd lose every suit. I don't care which SAE level they are at; I use EAP to the max I can, even on roads with no center line. You want to advocate that it's SAE 2, fine. This thread is about feature complete. OK, with that, @wcorey out.

I don’t really need to advocate anything regarding the SAE levels. :) I am familiar with them, having read the document, and I was just doing my bit in providing some info.
 
This is not important, but just to clarify, EAP included NoA but the new AP option does not include it. It is now a part of the ”new FSD”.
And that's weird in and of itself, I think, because didn't they grandfather it at some point, where people ordering AP were grandfathered for NoA? It's more confusing than it should be, and I'd chalk that up to poor management or poor communications, which generally is poor management. I read them too. Levels 0 and 5 are, I believe, unequivocal ... well, except for "under all roadway and environmental conditions that can be managed by a human driver". Would that be a 16-year-old human driver or a 50-year-old professional driver, i.e. a truck driver, taxi driver, or NASCAR driver? I meant that to be humorous, but it goes to the background of my point. Except for Level 0, there is a degree of interpretation required, it seems.
 
No, it is not too late, because the car needs to be designed to handle everything; that is what makes Level 3 so hard compared to Level 2. This is very clearly laid out in the J3016 paper that defines these levels. SAE Level 3 requires that the car give a pre-specified amount of time for the request to take over, so that the driver is not required to supervise the drive at all while in Level 3 autonomous mode. Enough that they can, for example, read a book; something like 10 seconds or so.

Level 3 also specifies that the car remains in control until the driver takes over; it cannot just drop control after those 10 or so seconds, it has to stop the car. On Level 3 it can do that in-lane (on Level 4-5 it needs to achieve a minimal risk condition, so an in-lane stop is not okay there).

For Tesla to reach Level 3 they would have to remove the requirement to supervise the drive by watching the road and replace it with a timed request to take over when taking over is needed, while keeping the car responsible for the drive until the driver takes control or the car comes to a controlled halt.
TL;DR cloud talk: Level 3 is dumb (and/or I'm ignorant).

So is Level 3 the halting problem of self-driving? It needs to know ahead of time that it will be in a situation it can't handle, thus showing it knows what it doesn't know (so why can't it know)?
In other words, a Level 3 system is a Level 4 system in some places and a Level 2 in others, with highly defined differentiators between the two. In that regard, it seems Level 3 is no different from Level 4, since 4 is delineated from 5 by the inability to handle certain locales/situations. If a 3 is a 4 with dynamic limits, it seems impossible to have a 10-second predictor of limit-violating events.

So, without nags, is NoA a Level 4 with the limits of freeways, or is it a 3 since it alerts you that your exit is coming up and it will be dropping to AP/Level 2?
 
What about suburban streets? Yeah, I don't understand that distinction, except perhaps NoA City means it can navigate downtown Manhattan. Sure, if it can do that, I trust it can do rural and suburban areas. It pretty much works fine for me in my neck of the woods, except for reading speed limit signs (which often don't match the geocoded speed limits), stop signs, and left or right turns at intersections, whether controlled by a stop sign, light, or yield sign.

I think suburban streets will be included in "automatic driving on city streets".

As an aside, it's the SAE autonomy-level conversation I'm bowing out of, not the definition of feature complete, which I'm familiar with given 45+ years in software development.

Honestly, defining the level of autonomy for Autopilot may become increasingly blurry and difficult. It is likely, in my opinion, that as Tesla keeps adding new features with driver supervision, we will see Autopilot become more and more like FSD while still technically being L2 because of the driver supervision. In other words, Autopilot may be L2 all the way until it suddenly becomes L5.

Look at the progression so far with AP:

Auto Steer + TACC is textbook L2. The SAE even uses Lane Keeping (Auto Steer) and Adaptive Cruise Control (TACC) as examples of L2.

NOA starts to take AP a little beyond classic L2 in terms of features. NOA can decide to make a lane change based on the driving environment and execute that lane change on its own, as long as the driver has their hands on the wheel. That's a bit more than a classic L2 that just helps with lane keeping and speed control. Yet NOA still has nags and requires the driver to pay attention (watch for that stalled car!). So it has functionality beyond a basic L2 ADAS but is still L2.

Now what about when "automatic driving on city streets" is released? Like NOA, it will presumably be able to decide to make lane changes, respond to traffic lights and stop signs, and make turns at intersections based on nav directions. It will start to look a lot like self-driving. It will definitely be doing more than your basic L2 ADAS. Yet it will require nags and driver attention at first. So it will still technically be L2, even though it is doing all the driving with, hopefully, no driver intervention.

The final stage will be when Tesla has validated the system with millions of miles, perfected the features, and made sure it can handle things like stalled cars safely. At that point, Tesla will remove the nags and deploy robotaxis. So a system that acted like FSD but was labelled L2 because of the nags will suddenly be L5 when the nags are removed.

At least, that is how I see the deployment of FSD features playing out.

I would have thought NoA City wouldn't exist: once they reach that point, drop NoA completely and just say FSD. But then AP includes NoA but won't include NoA City, so perhaps that is the distinction.

Yes, I think it would make sense if Tesla simply replaced the words "navigate on autopilot" with the words "full self-driving" in the blue box in the driver display when FSD has reached L5. From a UI perspective, that would make sense. It would be a convenient way of telling the driver that they are now in FSD mode. The UI would still be the same: you would still have the single blue line like NOA and a blue box in the navigation to toggle it on or off. When I talk about NOA on city streets, I am really talking about the UI. It seems to me that Tesla designed the NOA UI, with the blue center line and the blue box in the navigation that is automatically toggled on, as the bridge between AP and FSD. I think we can see the NOA UI design was intended to pave the way for the FSD UI. Another data point to support this idea is the fact that Tesla moved NOA into the "new FSD" column.
 
TL;DR cloud talk: Level 3 is dumb (and/or I'm ignorant).

So is Level 3 the halting problem of self-driving? It needs to know ahead of time that it will be in a situation it can't handle, thus showing it knows what it doesn't know (so why can't it know)?
In other words, a Level 3 system is a Level 4 system in some places and a Level 2 in others, with highly defined differentiators between the two. In that regard, it seems Level 3 is no different from Level 4, since 4 is delineated from 5 by the inability to handle certain locales/situations. If a 3 is a 4 with dynamic limits, it seems impossible to have a 10-second predictor of limit-violating events.

I think Level 3 is dumb too. It's basically a weird hybrid of not self-driving and self-driving. On one hand, it is self-driving because the car is doing 100% of the driving. But on the other hand, it may ask the human driver to take over in some instances. That's why pretty much everybody doing self-driving intends to skip L3 and just go straight to L4/5. If you are going to bother doing self-driving, it makes sense just to go all the way and make a system that is full self-driving where the driver never needs to take over.

And you are right that L3 adds an extra layer of difficulty because the system not only has to self-drive, it also needs to know in advance when it won't be able to self-drive anymore. In some instances it is simply not possible to know that. For example, if a car suddenly stalls in the middle of the highway, there is no time to warn the driver. In that instance, the self-driving car needs to be able to respond on its own without bothering the driver. So you might as well just go straight to L4/5.

So, without nags, is NoA a Level 4 with the limits of freeways, or is it a 3 since it alerts you that your exit is coming up and it will be dropping to AP/Level 2?

I would say that if NOA can truly handle all highway driving without the driver (no nags), but it notifies the driver in advance to take over when it reaches the end, like it does now with off-ramps, then I would call that L3. SAE defines L3 as true self-driving but where the system can request that the driver take over. So I think that would fit with L3.

If Tesla expands NOA to include non-highways so that NOA does not need to disengage at an off ramp, but is able to keep going on the local road, then that would be L4/5. That is what Tesla is aiming for.
 
So is Level 3 the halting problem of self-driving? It needs to know ahead of time that it will be in a situation it can't handle, thus showing it knows what it doesn't know (so why can't it know)?
In other words, a Level 3 system is a Level 4 system in some places and a Level 2 in others, with highly defined differentiators between the two. In that regard, it seems Level 3 is no different from Level 4, since 4 is delineated from 5 by the inability to handle certain locales/situations. If a 3 is a 4 with dynamic limits, it seems impossible to have a 10-second predictor of limit-violating events.
I'm not convinced that a level 3 system is impossible. Audi was (is?) claiming to be about to release a level 3 system capable of driving on divided highways below 37mph (Traffic Jam Pilot). In such a system the car could simply limit the speed to 37mph or stop in the lane until the human driver is ready to take over.
 
I'm not convinced that a level 3 system is impossible. Audi was (is?) claiming to be about to release a level 3 system capable of driving on divided highways below 37mph (Traffic Jam Pilot). In such a system the car could simply limit the speed to 37mph or stop in the lane until the human driver is ready to take over.

Oh, I'm not saying it's impossible, only that the category seems redundant within the space where it is possible.
On a 70 mph road, a Level 3 system with a 10-second warning needs to know conditions 1,000 feet down the road. That's where it loses me, unless it is Level 4 on the highway and alerts you to your exit.
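Just to put numbers on that (a back-of-the-envelope check only, not anything system-specific):

```python
# How far ahead a 10-second takeover warning implies the system must
# "see" at 70 mph. Rough arithmetic, nothing Tesla- or Audi-specific.
speed_mph = 70
warning_s = 10

feet_per_second = speed_mph * 5280 / 3600   # ~102.7 ft/s
lookahead_ft = feet_per_second * warning_s
print(f"{lookahead_ft:.0f} ft")              # ~1027 ft, i.e. roughly 1,000 feet
```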

Traffic Jam Pilot sounds like it could also be called a sub-37 mph highway Level 4 system.
 
Since we are debating the levels of autonomy, a little refresher might be in order:

[Image: SAE J3016 levels of driving automation chart]
 
Oh, I'm not saying it's impossible, only that the category seems redundant within the space where it is possible.
On a 70 mph road, a Level 3 system with a 10-second warning needs to know conditions 1,000 feet down the road. That's where it loses me, unless it is Level 4 on the highway and alerts you to your exit.

Traffic Jam Pilot sounds like it could also be called a sub-37 mph highway Level 4 system.
That's why Audi is limiting the system to 37 mph. A 70 mph system could simply stop the car in a couple hundred feet, though that might not be safe since it could result in getting rear-ended.
Traffic Jam Pilot is Level 3 because it requires the driver to take over when requested.
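For what it's worth, a rough braking-distance estimate backs up the "couple hundred feet" figure (assuming roughly 0.7 g of deceleration; that number is my assumption, not anything from Audi):

```python
# Rough stopping distance from 70 mph at an assumed 0.7 g of deceleration.
speed_mps = 70 * 1609.344 / 3600     # 70 mph is about 31.3 m/s
decel_mps2 = 0.7 * 9.81              # assumed firm braking, ~6.9 m/s^2

stop_m = speed_mps ** 2 / (2 * decel_mps2)
stop_ft = stop_m * 3.28084
print(f"{stop_ft:.0f} ft")           # ~234 ft: a couple hundred feet, as noted above
```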
 
I think Level 3 is dumb too. It's basically a weird hybrid of not self-driving and self-driving. On one hand, it is self-driving because the car is doing 100% of the driving. But on the other hand, it may ask the human driver to take over in some instances. That's why pretty much everybody doing self-driving intends to skip L3 and just go straight to L4/5. If you are going to bother doing self-driving, it makes sense just to go all the way and make a system that is full self-driving where the driver never needs to take over.

And you are right that L3 adds an extra layer of difficulty because the system not only has to self-drive, it also needs to know in advance when it won't be able to self-drive anymore. In some instances it is simply not possible to know that. For example, if a car suddenly stalls in the middle of the highway, there is no time to warn the driver. In that instance, the self-driving car needs to be able to respond on its own without bothering the driver. So you might as well just go straight to L4/5.



I would say that if NOA can truly handle all highway driving without the driver (no nags), but it notifies the driver in advance to take over when it reaches the end, like it does now with off-ramps, then I would call that L3. SAE defines L3 as true self-driving but where the system can request that the driver take over. So I think that would fit with L3.

If Tesla expands NOA to include non-highways so that NOA does not need to disengage at an off ramp, but is able to keep going on the local road, then that would be L4/5. That is what Tesla is aiming for.
I would say both of you, you and @mongo, did a very good job, likely way better than I, in articulating what I was driving at (no pun intended) with my prior conversations with @electronblue and @EVNow, etc. Within the Level 3 definition there is enough wiggle room for one person to say Tesla is Level 2 and another Level 4, and both be equally right while being equally wrong. Likely why Tesla does not advertise as any of them. Let PC Magazine, Consumer Reports, and Car and Driver hash it out.

As I tried to describe to others, with each release I have to reassess every road to determine where I can be comfortable with relaxed attentiveness and where I require alert (or aggressive) attentiveness. For the people I know who have done the "Sound of Silence" rally to Custer, SD, or Montreal to Ft. Lauderdale, 99% of their trip was, in my terminology, relaxed attentiveness. That to me is really close to FSD, but not FSD.

In the FSD-or-not category, it's still not clear to me whether a Model 3 or S on HW2.5 would successfully detect an 18-wheeler pulling out in front of you. I think doing 68 in a 55 zone disables auto braking in Autosteer. There's a warning, but it hasn't stayed up long enough for me to get close enough to read it, which requires more time with eyes off the road than I want to spend. Let me put it this way: I would not elect to be the test dummy to find out. Does anyone know for sure? If it still cannot, then, yeah, Elon needs to really delete his Twitter account and concentrate on that.