
Elon: "Feature complete for full self driving this year"

Thanks. That's helpful. It sounds like Audi did come up with a good solution for its L3 in cases when the driver is unresponsive.

SAE says:

"Thus, a level 3 ADS, which is capable of performing the entire DDT within its ODD, may not be capable of performing the DDT fallback in all situations that require it and thus will issue a request to intervene to the DDT fallback-ready user when necessary

At levels 4 and 5, the ADS must be capable of performing the DDT fallback and achieving a minimal risk condition. Level 4 and 5 ADS-equipped vehicles that are designed to also accommodate operation by a driver (whether conventional or remote) may allow a user to perform the DDT fallback if s/he chooses to do so. However, a level 4 or 5 ADS need not be designed to allow a user to perform DDT fallback and, indeed, may be designed to disallow it in order to reduce crash risk (see 8.9)."

So it seems the big difference between L3 and L4 is that L3 will prompt the driver to take over and only pull over if the driver is unable to, whereas an L4 car needs to be able to perform the entire fallback itself and can leave the driver out of the process entirely.

Basically, Level 3 can provide DDT fallback to a minimal risk condition (if needed), while Levels 4-5 must provide it.

DDT fallback is separate from failure mitigation, which is what happens at Level 3 when the driver is unresponsive to a takeover request and the car takes a simple action to stop. This can actually apply to Levels 4-5 as well in some catastrophic situations where the entire ADS and DDT fallback are failing. I guess in theory a manufacturer could make a Level 3 system whose failure mitigation doesn't actually stop the car, but it would have to be preceded by appropriate warnings and time all the same. The way failure mitigation is done at Level 3 has more leeway than at Levels 4-5.
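To make that concrete, here's a rough sketch of the fallback logic as I read it (my own simplification in Python, purely for illustration; none of this comes from the SAE document verbatim):

```python
# Rough sketch of SAE J3016 fallback behavior by level.
# My own simplification for illustration -- not from the standard itself.

def handle_ads_limit(level: int, driver_responsive: bool) -> str:
    """Who performs the DDT fallback when the ADS reaches its limits?"""
    if level == 3:
        if driver_responsive:
            # L3: issue a request to intervene; the fallback-ready
            # user performs the DDT fallback.
            return "driver takes over after request to intervene"
        # Unresponsive driver: failure mitigation, not DDT fallback.
        # A simple action, e.g., stopping the car, preceded by warnings.
        return "failure mitigation: simple stop with warnings"
    if level in (4, 5):
        # L4/5: the ADS itself must perform the fallback and achieve
        # a minimal risk condition; the driver can be left out entirely.
        return "ADS performs fallback to a minimal risk condition"
    return "driver is the fallback at all times (L0-L2)"

print(handle_ads_limit(3, driver_responsive=False))
print(handle_ads_limit(4, driver_responsive=False))
```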

The thing to understand about the Levels is that it is the design intent that counts. A car doesn't stop being Level 5 just because someone encounters one road it can't handle, and a poorly working Level 4 prototype is not the same as Level 2 even if it requires a safety driver. By the same token, a Level 2 system does not become Level 3-5 just because "it looks like it"; it must be designated as such by the manufacturer.
 

Thanks. I know this now thanks to the SAE document you recommended. :)


I agree. Which is why Waymo cars are not L2 just because they have a safety driver, as some on this forum think. Waymo cars are L4 because Waymo has designed them to be L4. The safety drivers are there only because the cars still require some testing and validation.
 
How do you explain the problems that companies that are actually testing much more capable autonomous vehicles are having in keeping their test drivers attentive?
There are already a huge number of posts on why your interpretation of Lex Fridman's research is flawed. Here's a selection of quotes from the paper contradicting your claim: What the chances Tesla cars will be self driving in 3 years? Why do you think that way?
I think the problem is with Level 3/4 systems, i.e., where you can go days before a problem occurs. That is where the system should be capable of warning the driver and of safely parking itself without intervention.
 
This isn't really about SAE levels; most consumers have no idea what SAE levels are. The issue is that users of Level 2 systems are already starting to consciously exhibit "relaxed attentiveness" because they observe that the system is capable of controlling the vehicle in most common situations.
As you were directly referencing my explanation, I'll fill in a misconception you may have. My "relaxed" attentiveness isn't simply a warm fuzzy. It's that I know, from multiple traversals of a given segment of a trip, that AS or NoA has a 100% success rate navigating that segment. This is in contrast to the heightened attentiveness where AS or NoA has demonstrated it cannot navigate that segment 100% of the time. It's all empirically based. By relaxed, I don't mean taking a nap; I mean I don't have a tight grip on the steering wheel and I'm not in a 'danger close' state.

On the larger issue: who cares what SAE level some car achieves? I mean, at some level, sure. In practical terms, though, I believe Musk is not aiming for a six-nines guarantee of proper operation (five nines being carrier grade). He has referenced, multiple times, a target of twice as good as the NTSB stats for fatalities/accidents per million miles. During his Ark interview he was unequivocal that FSD would be done this year. Tesla is a public company, as opposed to SpaceX; what he says matters legally. Specifically, he said it, hesitated, then doubled down on the claim. Tesla is on round 2 of employee-only FSD testing. They have to know where they are on the trajectory to having something available as a 'beta'. In other words, the company is not assuming liability, and the driver is still responsible for the actions of the vehicle, but that level of software is available to owners, as AS and NoA are now.
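For scale, here's the back-of-the-envelope arithmetic on what those nines mean per mile (a quick Python sketch; the ~90-million-miles-per-fatality figure is my own round-number assumption, not an official stat):

```python
# What "N nines" of proper operation means per million miles.
# All figures here are illustrative assumptions, not published numbers.

for nines in (5, 6, 7):
    reliability = 1 - 10 ** -nines   # e.g., 0.99999 for five nines
    failures_per_million_miles = 10 ** -nines * 1_000_000
    print(f"{nines} nines = {reliability} -> "
          f"{failures_per_million_miles:g} failures per million miles")

# "Twice as good as the human average": assuming roughly one fatality
# per ~90 million miles driven (a round-number ballpark), the target
# would be on the order of one per ~180 million miles.
print(f"target: ~{2 * 90_000_000:,} miles per fatality")
```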

Different subject: there is way too much anonymity on here. Who on here is an automotive engineer? Who on here is actually working on self-driving technology at Tesla? Being a little more blunt, who here knows what they are talking about? :) For myself, I'd like to know who I am talking to. To be sure, there are some who clearly know what they are talking about, although I haven't the vaguest idea who they are. Asking for a friend.
 
This isn't really about SAE levels; most consumers have no idea what SAE levels are. The issue is that users of Level 2 systems are already starting to consciously exhibit "relaxed attentiveness" because they observe that the system is capable of controlling the vehicle in most common situations. This isn't an attack on @wcorey; it's simply human nature. Imagine if NoA gets so good that it requires user intervention only once a year for the average driver: will the average driver still be fully attentive? I'm skeptical.

I think what you really mean is complacency. If a driver gets too complacent when using an ADAS, that's definitely bad and could lead to an unfortunate accident. But I would argue that if you still pay attention but are more relaxed because you know what the ADAS is capable of, that is not a bad thing. For example, there is a long, straight, well-marked state road that I use AP on. I drive it every day. AP has handled it perfectly hundreds of times. I am still attentive, eyes on the road, but I am less stressed because I know AP can handle the lane-keeping part, so I just watch for other cars.
 
I wish I had thought up "functional vigilance"! Relaxed attentiveness: few non-trajectory eye diversions (e.g., roof, passenger); probably 95% focused on the road in front of me; diversions of less than a second or two each; response time 95% < 1 sec, 99%+ under 2 secs. Not white-knuckle driving. Alert attentiveness is closer to white-knuckle driving. OK?
 
BINGO - relaxed attentiveness, by my definition. On that segment of road have you ever looked up through the sunroof?
 

I am a 41 year old white male with a masters degree in physics. My pic is in my profile. :)

No, I don't work in self-driving. But I am fascinated with the idea of self-driving cars and I try to learn as much as I can. I am not an expert but I like to think I am not a total ignoramus either. :)
 
I think Elon Musk explains the problem quite well:
"One of the common misimpressions is that when there is, say, a serious accident on Autopilot, people – or some of the articles – for some reason think that it’s because the driver thought the car was fully autonomous and it wasn’t, and we somehow misled them into thinking it was fully autonomous. It is the opposite.

When there is a serious accident, it’s in fact almost always, maybe always, the case that it is an experienced user and the issue is more one of complacency. They get too used to it. That tends to be more of an issue. It is not a lack of understanding of what Autopilot can do. It’s actually thinking they know more about Autopilot than they do, like quite a significant understanding of it."
Elon Musk, 2018 Q1 conference call.

I'm an electrical engineer with an addiction to arguing with people on the internet. :(
SAE Level 3-5 means a HUGE amount to me since it would allow me to be a passenger in my car instead of a driver. Tesla would be responsible for any accidents that occur instead of me.
 
I loved physics in school and undergrad. I went into software, where I stayed until very recently; I still do development. I, too, am fascinated with the idea of self-driving and also try to learn as much as I can. As evidenced, I am not an expert on it by any stretch. Oh, I'm also a commercial pilot, not to be confused with ATP.

I didn't mean to lash out, if people took me as doing that. I don't much like my words being used derogatorily against me. And Twitter is the worst for anonymity. I think that's because people can get away with saying things there that they wouldn't dare say to someone in person or with their identity associated with it.

Oh, and I just turned 68, still a white male. My last 3 cars were Priuses, the last being a 2012 Plugin Adv with adaptive CC. When I was working I drove ~35k miles/yr. I want to see a launch from KSC, a 23-hr trip according to Tesla/trips. If for the bulk of that trip I can relax and watch the road, precisely what you described, I figure maybe I could do 12 hrs/day. If I were driving the Prius, 6-8 would likely be my limit of holding attentiveness. Another cross-country trip is to SE Ohio, my alma mater ... just because it's there. That is a 12-hr trip. It's beautiful country out there.
And thanks. And yes, it's very obvious you're quite knowledgeable on the subject, way more than I. I do, however, believe Musk when he told Ark that FSD would be finished by the end of this year, with robotaxis next. I am, however, distressed that there are people who bought a Model S in 2015 and said Musk told them 12 months also. Employees are testing FSD, for Pete's sake.
 
with an addiction to arguing with people on the internet. :(

I have the same addiction.


Thanks for sharing. Nice to find someone else who likes Physics.

I am optimistic that Tesla will get to FSD "soon" because I think Tesla is on the right track now. But sadly for veteran Model S owners, I think Elon overestimated FSD back in 2015.
 
Thanks, Daniel. I kinda got a sense you love a good, intense conversation. I generally do too, unless I'm not holding my own, at least. :)

I don't believe I'm complacent. Prior to the Tesla, I found I had to pay more attention to my driving than I did, say, 30 years ago. But I've been accident-free since the early '80s.
As I was saying to @diplomat33, I find conversations, or even just reading other posts, unnerving on Twitter, as most everyone except perhaps the well-known names is completely anonymous. I don't understand that except as a position from which to safely attack others with impunity. It's still unnerving. I don't expect high etiquette, but...
Anyway, thanks!
 
I think Musk set everyone else back with his retort on lidar versus cameras. People were saying Musk simply didn't get it; I thought he got it perfectly well. We use only our binocular vision to drive; we can tell distance by using our two eyes. I actually do not have binocular vision. Both eyes work, just not together. I was always last or next to last for jr. high baseball in gym class. However, I have adopted other ways to judge distances. For instance, I can pull a car right up to the edge of the parking space or to within a foot or two of another car. I just don't have fused vision like most do. I am distressed by the apparent lack of clarity on whether a Tesla can detect the side of a barn door (the side of an 18-wheeler) and not ram it. Hopefully they no longer have that issue. If that's the case, they should announce it.
 
IMO, TBD. I think they can get to City NOA this year or next. Level 3/4 starts getting difficult for Tesla, given the huge liability implications; they really have to be at 7 9s to take on the liability. I don't think anyone thinks Tesla will get to robotaxi (Level 5) next year.

Of course, robotaxi is always 5 years away.
 
They have to do away with nags. How can you keep your hands on the wheel when it's violently turning the wheel through tight turns in city driving? You have to take your hands off, then on, then off again, etc. It will be 10 times more work to drive under those conditions than doing it yourself.

They need to read signs. Speed limits are wrong half the time, and then what about stop signs, etc.? I expect that, at the very minimum, sign reading will come.
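Something like this toy logic is all it would take conceptually (hypothetical Python, obviously not Tesla's actual pipeline): prefer a freshly read sign and fall back to the often-stale map data only when no sign has been seen.

```python
# Toy "read the sign, fall back to map data" logic. Hypothetical --
# purely to illustrate why sign reading fixes the stale-speed-limit problem.
from typing import Optional

def effective_speed_limit(sign_mph: Optional[int], map_mph: int) -> int:
    """Prefer a freshly detected speed-limit sign over map data."""
    return sign_mph if sign_mph is not None else map_mph

print(effective_speed_limit(45, 55))    # sign detected: use 45
print(effective_speed_limit(None, 55))  # no sign seen: fall back to map
```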
 
Level 3/4 starts getting difficult for Tesla, given the huge liability implications; they really have to be at 7 9s to take on the liability.

If we are talking about L3/L4 in all cases, I agree. But I think Tesla could do L3 now in limited situations if they really wanted to. For example, Tesla could make Autopilot L3 for traffic jams. And NOA could be L3 now in cases like open highways with no traffic. NOA is actually pretty close to self-driving now; there are just a few cases, like heavy traffic, where it is not reliable enough yet.

I think the big reason Tesla does not do this is that they want their "general solution". They want FSD to work broadly on all roads, rather than solve FSD in just a few narrow cases.
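For illustration, ODD gating for a limited L3 could look as simple as this (a hypothetical Python sketch; the conditions and thresholds are mine, not Tesla's):

```python
# Hypothetical ODD gating for a limited Level 3 mode, as described above.
# Conditions and thresholds are invented for illustration.

def l3_allowed(road_type: str, traffic: str, speed_mph: float) -> bool:
    """Allow eyes-off L3 only inside a narrow operational design domain."""
    traffic_jam_pilot = (road_type == "highway" and traffic == "heavy"
                         and speed_mph < 40)
    open_highway = road_type == "highway" and traffic == "light"
    return traffic_jam_pilot or open_highway

print(l3_allowed("highway", "heavy", 25))  # True: traffic-jam case
print(l3_allowed("city", "light", 30))     # False: outside the ODD
```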

As for the nags: the hands-on-wheel requirement is not what people think. You don't need both hands tightly gripping the steering wheel at all times. With NOA, you just need one hand loosely gripping the bottom of the steering wheel to satisfy the nag. So I don't think it would be a big problem.

But I do agree that ultimately, Tesla will need to do away with nags before Autopilot can be true FSD.
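For anyone unfamiliar with how the nag works: hands are detected via slight resistance torque on the steering wheel, not a touch sensor. A hypothetical sketch (the threshold and interval here are made-up numbers, not Tesla's):

```python
# Hypothetical torque-based "hands on wheel" check. Tesla detects hands
# via slight steering-wheel torque; the numbers below are invented.

TORQUE_THRESHOLD_NM = 0.3   # light one-handed counter-torque is enough
NAG_INTERVAL_S = 30.0       # how long without torque before nagging

def hands_detected(measured_torque_nm: float) -> bool:
    return abs(measured_torque_nm) >= TORQUE_THRESHOLD_NM

def should_nag(seconds_since_last_torque: float) -> bool:
    return seconds_since_last_torque > NAG_INTERVAL_S

print(hands_detected(0.4))  # True: loose grip at the bottom of the wheel
print(should_nag(45.0))     # True: time to flash the "apply torque" nag
```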
 

Indeed, but the problem is that when it turns violently and fast, it will fling your arm off the wheel. :) You can't possibly have a loose grip and be relaxed in situations with sharp turns. If you watch the Autonomy Day demo, you'll see.