
Elon: "Feature complete for full self driving this year"

Yep, if the car doesn't detect your hand on the steering wheel, the "apply light force..." prompt and the blue flashing border come up and the car doesn't actually change lanes until it detects you holding the wheel again.

EDIT: the flashing blue border also animates in the direction of the lane change, you can see it in this video in a few places:
(Video from Firmware 8.5)
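If it helps, here's roughly how I picture that gate working. This is just my own sketch in Python, not anything from Tesla's firmware; every name, callback, and the timeout value are made up for illustration:

```python
import time

HANDS_ON_TIMEOUT_S = 5.0  # made-up value, purely illustrative


def attempt_lane_change(wheel_torque_detected, execute_lane_change, show_prompt):
    """Hold a pending NoA lane change until light torque is sensed on the wheel."""
    deadline = time.monotonic() + HANDS_ON_TIMEOUT_S
    while not wheel_torque_detected():
        show_prompt()                # "Apply light force..." prompt + flashing blue border toward the target lane
        if time.monotonic() > deadline:
            return False             # give up: stay in the current lane
        time.sleep(0.1)
    execute_lane_change()            # hands detected: lane change proceeds with no stalk confirmation
    return True


# Example usage with stand-in callbacks:
if __name__ == "__main__":
    attempt_lane_change(
        wheel_torque_detected=lambda: True,          # pretend torque is already present
        execute_lane_change=lambda: print("changing lanes"),
        show_prompt=lambda: print("apply light force to the steering wheel"),
    )
```

The point is just that the maneuver is held until torque is sensed; no separate control has to be operated.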

What happened to "automatically change lanes WITHOUT requiring driver input"?

@S4WRXTTCS are you seeing what I'm seeing?

[Screenshots attached: Pq7pZis.png, kfymDIc.png]
 
It can be argued that detecting whether a driver has their hand on the wheel isn't really "user input." It's not like you're telling the car which way to go.

Yep, it's input in the strict engineering sense but not from the user's perspective, since you're not prompting changes with a control (button, stalk, touchscreen, etc.) but simply doing something you should already be doing (holding the wheel).
 
It can be argued that detecting whether a driver has their hand on the wheel isn't really "user input." It's not like you're telling the car which way to go.

But you weren't telling the car which way to go before this update.
Before it was: you have to engage the small stalk.
Now it's: you have to exert pressure, plus we all know how bad the steering wheel detection system is, which will lead to you having to jigger it.

That's definitely still 'driver input'.
 
But you weren't telling the car which way to go before this update.
Before it was: you have to engage the small stalk.
Now it's: you have to exert pressure, plus we all know how bad the steering wheel detection system is, which will lead to you having to jigger it.

That's definitely still 'driver input'.

As long as Tesla requires the driver to keep a hand on the wheel, by your definition it will always require driver input. I think that's a silly bar to set. GM's system requires the driver to keep their eyes on the road, which is also a form of driver input, as eye location is input for the camera system.

I'd say pulling down on the stalk as a confirmation is a heck of a lot more driver input than keeping your hand on the wheel, essentially as a weight. Given that you can substitute an orange for "driver input" and still get an automatic lane change, by your definition an orange is a driver.
 
What happened to "automatically change lanes WITHOUT requiring driver input"?

@S4WRXTTCS are you seeing what I'm seeing?

[Screenshots attached: Pq7pZis.png, kfymDIc.png]

Tesla did get rid of the driver input TO confirm the lane change (stalk confirmation). The driver input now is the regular AP nag to show driver attention, which happens regardless of lane changes. So yes, it is still driver input, but the purpose of the input is different. Before, there were two inputs: one to show attentiveness and another to directly authorize the lane change. Now there is only the input to show attentiveness. The car is responsible for the lane change without any further input. So it is a big step in the right direction.

But I find it interesting that you skipped over the part of the review that is very impressed with NOA. Zero false positives and more intuitive behavior are a very positive development.
 
It's worth noting that we can see a clear progression towards more autonomy in just the evolution of "auto lane change".

"Auto lane change" pre-NOA
- Decision when to do lane change: driver
- Confirmation to do lane change: driver
- Automation of actual lane change: car

NOA version with confirmation
- Decision when to do lane change: car
- Confirmation to do lane change: driver
- Automation of actual lane change: car

NOA version without confirmation
- Decision when to do lane change: car
- Confirmation to do lane change: car but driver can cancel and needs to show attentiveness.
- Automation of actual lane change: car

The car is taking over more and more of the driving.
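To put that same progression in one place, here's a throwaway sketch; the structure and names are mine, purely to show how the confirmation responsibility migrates from driver to car:

```python
from dataclasses import dataclass

@dataclass
class LaneChangeResponsibility:
    decision: str      # who decides a lane change is needed
    confirmation: str  # who approves it before it starts
    execution: str     # who physically performs the maneuver

STAGES = {
    "auto lane change (pre-NoA)": LaneChangeResponsibility("driver", "driver", "car"),
    "NoA with confirmation": LaneChangeResponsibility("car", "driver (stalk)", "car"),
    "NoA without confirmation": LaneChangeResponsibility(
        "car", "car (driver can cancel, must show attentiveness)", "car"
    ),
}

for stage, who in STAGES.items():
    print(f"{stage}: decision={who.decision}, confirmation={who.confirmation}, execution={who.execution}")
```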
 
My prediction of features which will be demonstrated at the event:

[Attachment 394056]

I think the issue is that a lot of features on the list are already done or at least known. Just doing test drives of features that are mostly already available to the public would not be that significant. So I think there needs to be more. Really, I think Tesla needs to "bring everything together". Tesla needs to show that all the features work together to form a meaningful FSD experience. For example, this could be done with a Waymo style ride sharing drive where a Tesla (with a safety driver) takes some passengers across town with minimal disengagements. That would show the public how close Tesla's FSD is to real autonomous ride-sharing. Remember too that the Tesla Network and autonomous ride-sharing is what investors will really be interested in because that's where the potential profit is. Investors aren't interested in the car stopping at a red light or making lane changes without confirmation, they are interested in ride sharing that will make Tesla money and therefore make them money.
 
I think the issue is that a lot of features on the list are already done or at least known. Just doing test drives of features that are mostly already available to the public would not be that significant. So I think there needs to be more.

Pretty sure he was being facetious in a "fool me once, shame on you....fool me twice (or three times or four times) shame on me" kind of way...
 
As long as Tesla requires the driver to keep a hand on the wheel, by your definition it will always require driver input. I think that's a silly bar to set. GM's system requires the driver to keep their eyes on the road, which is also a form of driver input, as eye location is input for the camera system.

I'd say pulling down on the stalk as a confirmation is a heck of a lot more driver input than keeping your hand on the wheel, essentially as a weight. Given that you can substitute an orange for "driver input" and still get an automatic lane change, by your definition an orange is a driver.
I learned a new way to show driver involvement from this source! All you do is give either scroll wheel a slight turn. For me, that's easier than jerking the wheel back and forth.
 
What I don't get is how Tesla could possibly show anything that hasn't been done by a dozen other companies who have done FSD "demos". Achieving FSD isn't about going 10 or 100 or 1000 miles between disengagements, it's about making a system that is safer than human drivers. Obviously that progression is necessary but other companies are going thousands of miles between disengagements and they're still not sure if and when their systems can be deployed.
 
But you weren't telling the car which way to go before this update.
Before it was: you have to engage the small stalk.
Now it's: you have to exert pressure, plus we all know how bad the steering wheel detection system is, which will lead to you having to jigger it.

That's definitely still 'driver input'.

I guess I haven't tried the new NoA yet, but I find it completely trivial to maintain constant gentle torque on the wheel so that I NEVER get "hands on the wheel" warnings. I mean, it took like 5 minutes of work to figure out the "feel" necessary when I first started using the car...but after that, no issues.

It really isn't that hard; one would have to be borderline incompetent to not be able to do it successfully 100% of the time after the initial learning phase. Unless the new Seamless NoA requires something more than that, I don't see how this can be construed as "requiring driver input". It does require a driver to have hands (both preferably) on the wheel with 100% attention, as is required for any level 2 system, and for any person who values their life.

There are a lot of legitimate things you can troll about here with regard to the prospects of Tesla actually delivering FSD (I have serious doubts about level 3 from any company, because even Waymo gave up on it and went straight to level 4/5, since level 3 was too dangerous), but this "hands-on-the-wheel is driver input" thing is a pretty weak point.
 
What I don't get is how Tesla could possibly show anything that hasn't been done by a dozen other companies who have done FSD "demos". Achieving FSD isn't about going 10 or 100 or 1000 miles between disengagements, it's about making a system that is safer than human drivers. Obviously that progression is necessary but other companies are going thousands of miles between disengagements and they're still not sure if and when their systems can be deployed.

I think it is more that Tesla is doing it than what it does (driving). Showing the system operating on HW3 with the current camera + radar setup will help quiet the 'must have lidar' camp.
Maybe they have interesting real world footage also...
 
Tesla did get rid of the driver input TO confirm the lane change (stalk confirmation). The driver input now is the regular AP nag to show driver attention, which happens regardless of lane changes. So yes, it is still driver input, but the purpose of the input is different. Before, there were two inputs: one to show attentiveness and another to directly authorize the lane change. Now there is only the input to show attentiveness. The car is responsible for the lane change without any further input. So it is a big step in the right direction.

But I find it interesting that you skipped over the part of the review that is very impressed with NOA. Zero false positives and more intuitive behavior are a very positive development.

As long as Tesla requires the driver to keep a hand on the wheel, by your definition it will always require driver input. I think that's a silly bar to set. GM's system requires the driver to keep their eyes on the road, which is also a form of driver input, as eye location is input for the camera system.

I'd say pulling down on the stalk as a confirmation is a heck of a lot more driver input than keeping your hand on the wheel, essentially as a weight. Given that you can substitute an orange for "driver input" and still get an automatic lane change, by your definition an orange is a driver.

There's absolutely no way in hell you guys can convince me or anyone that this is what Tesla owners expected in 2016 when they read that EAP description. There were huge debates in this forum and on /r/teslamotors whether EAP was level 3 or not. Although they were fundamentally wrong, they made compelling arguments. That was back when the nag interval was several minutes, even after the several nag increases (and you were part of those debates, @diplomat33). The thesis was: if the car is merging, lane changing and handling interchanges, and I only have to touch the wheel every few minutes, then how is that not level 3?

Although they were wrong, the argument was very compelling because it rested on a very good system. This iteration of NOA? Not a chance; no one would ever try to pass this off as a kind of Level 3, even in bad faith.

Even the Sept/October 2018 early access firmware, which shipped with unconfirmed lane change hidden and had its UI enabled by hackers, DID NOT have this nag confirmation before a lane change.

Again, the whole debate over whether EAP was level 3 revolved around the fact that the car would handle all of highway driving without you confirming anything, with only the usual nag requiring you to touch the wheel at random intervals to make sure you are still there. This is what showed up in that October 2018 early access firmware. This, however, feels like something Elon pushed the engineers to release, so they swapped the stalk confirmation for a wheel nag confirmation.

 
I think it is more that Tesla is doing it than what it does (driving). Showing the system operating on HW3 with the current camera + radar setup will help quiet the 'must have lidar' camp.
Maybe they have interesting real world footage also...
I don't think anyone is saying that lidar is necessary to do cool FSD demos even on unplanned routes. Look at all the EAP accidents that could have been avoided with lidar though. I think that lidar would make achieving FSD much easier and that's the reason everyone else is using it. I would imagine once a company has achieved level 4-5 autonomy then the next step will be to remove lidar since it doesn't work in all weather conditions. The vision system is only one piece of the puzzle and probably not even the most difficult piece.
I just don't see the point of doing yet another FSD demo.
 
There's absolutely no way in hell you guys can convince me or anyone that this is what Tesla owners expected in 2016 when they read that EAP description. There were huge debates in this forum and on /r/teslamotors whether EAP was level 3 or not....

You seem to be changing what we were discussing. You made the claim that the light force on the steering wheel, used to prove to the car that the wheel is being held, constitutes "driver input." Nobody is currently arguing about SAE Level 3.

"The sustained and ODD(operational design domain)-specific performance by an ADS(automated driving system) of the entire DDT(dynamic driving task), with the expectation that the human driver will be ready to respond to a request to intervene when issued by the ADS." SAE Level 3 definition. There's nothing in that definition about what constitutes driver input, or if it's required.
 
I think it is more that Tesla is doing it than what it does (driving). Showing the system operating on HW3 with the current camera + radar setup will help quiet the 'must have lidar' camp.
Maybe they have interesting real world footage also...

I don't see any lidar here...
What actually happens in the demo ride is what matters, because it shows what capability they have now compared to others, and as @diplomat33 said, they need the entire system functioning as a whole, not individual ADAS features.

That would show the public how close Tesla's FSD is to real autonomous ride-sharing. Remember too that the Tesla Network and autonomous ride-sharing is what investors will really be interested in because that's where the potential profit is. Investors aren't interested in the car stopping at a red light or making lane changes without confirmation, they are interested in ride sharing that will make Tesla money and therefore make them money.

Funny how you said this wasn't about the stock, yet now you are talking about how investors can invest or increase their investment based on this demo day, while still claiming this event has nothing to do with improving the stock.
 
You seem to be changing what we were discussing. You made the claim that the light force on the steering wheel, used to prove to the car that the wheel is being held, constitutes "driver input." Nobody is currently arguing about SAE Level 3.

"The sustained and ODD(operational design domain)-specific performance by an ADS(automated driving system) of the entire DDT(dynamic driving task), with the expectation that the human driver will be ready to respond to a request to intervene when issued by the ADS." SAE Level 3 definition. There's nothing in that definition about what constitutes driver input, or if it's required.

You didn't seem to understand my post. My point was that the EAP description in 2016 gave the impression of something far superior to what was delivered this week. So much so that Tesla fans mistakenly believed it constituted Level 3, regardless of the need to pay attention or what the SAE defined. But because what was actually delivered is far less than what was expected, no one in their right mind would mistake it for, or try to pass it off as, Level 3.

Therefore this isn't what people expected, so saying that the nag confirmation is not driver input is revising history.

In addition, the initial hidden release of NOA (Sept 2018) didn't have the nag confirmation before a lane change.