
What will happen within the next 6 1/2 weeks?

Which new FSD features will be released by end of year and to whom?

  • None - on Jan 1 'later this year' will simply become end of 2020! (Votes: 106, 55.5%)
  • One or more major features (stop lights and/or turns) to a small number of EAP HW 3.0 vehicles. (Votes: 55, 28.8%)
  • One or more major features (stop lights and/or turns) to a small number of EAP HW 2.x/3.0 vehicles. (Votes: 7, 3.7%)
  • One or more major features (stop lights and/or turns) to all HW 3.0 FSD owners! (Votes: 8, 4.2%)
  • One or more major features (stop lights and/or turns) to all FSD owners! (Votes: 15, 7.9%)
  • Total voters: 191
In Post #87 above, we see that the first appearance of stop-sign and stoplight recognition will be recognition and warning only: the car will alert the driver but will not stop itself. So the driver is still 100% responsible for stopping the car. Presumably Tesla will use data from this warning-only feature to refine its accuracy. Then, based on everything up to now, it will progress to a beta Level 2 feature, where it stops the car but the driver is required to be alert with hands on the wheel to override AP for false positives and false negatives.

We are (in my opinion) years away from Level 3, where the driver is permitted to move their attention elsewhere. And as long as this feature is Level 2, much of the above speculation is moot. There is a small but real risk that your car will kill you if you fail to maintain your attention on the road. Nothing changes with stoplight/stop-sign recognition. The people who are concerned that the car will plow through stoplights are assuming that the feature will operate at Level 3 where the driver is no longer expected to react immediately.

I believe that achieving Level 3 in the city, where stoplights and other concerns are significant issues, will be much harder than achieving Level 3 on the highway, and that seems far away still. So I think it's premature to worry about AP or NoAP in the city as they will be Level 2 for a long time.

Personally, while stop-sign and stoplight recognition is critical for eventual FSD, I don't find it to be a particularly useful feature by itself. I already have to be alert. Stopping for lights and signs just doesn't significantly alter my experience of EAP, where the lane-keeping and speed control are the really great benefits.
 
  • Like
Reactions: diplomat33
The people who are concerned that the car will plow through stoplights are assuming that the feature will operate at Level 3 where the driver is no longer expected to react immediately.
I am concerned that users will expect the car to stop at a light that it stops at 99.9% of the time and that will cause them to accidentally run the light.
I am skeptical that Tesla will release anything other than emergency stopping at red lights and stop signs in the near future.
 
I am concerned that users will expect the car to stop at a light that it stops at 99.9% of the time and that will cause them to accidentally run the light.

This is a real issue, but not limited to stoplights: AP on the highway works so well that a small number of reckless people will assume they don't need to pay attention. I don't think this is a good reason not to move forward, and the fact is that we will not get to FSD without moving through near-flawless driver assist. And we won't get to Level 3 without moving through an implementation of Level 2 that is so good you virtually never need to intervene.
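
To put that 99.9% figure in perspective, here is a back-of-envelope sketch. The lights-per-year and driver-catch-rate numbers are assumptions for illustration, not data from Tesla or anyone else:

```python
# Back-of-envelope: how often would a driver relying on a 99.9%-reliable
# stoplight response still need to intervene? All inputs except the 99.9%
# figure (taken from the post above) are assumptions for illustration.

lights_per_year = 2000      # assumed signalized stops a commuter encounters per year
system_success = 0.999      # per-light probability the car stops on its own
driver_catch_rate = 0.90    # assumed chance a complacent driver catches a miss in time

system_misses = lights_per_year * (1 - system_success)
uncaught = system_misses * (1 - driver_catch_rate)

print(f"System misses per year: {system_misses:.1f}")  # ~2
print(f"Lights run per year:    {uncaught:.2f}")        # ~0.2 at this level of attention
```

The only point of the arithmetic is that "works 99.9% of the time" still means a couple of failures per driver per year, so the net safety effect hinges on whether supervision degrades as reliability improves.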

Do we stick with thirty-thousand deaths per year (human drivers) just because FSD will never get us to zero deaths?

I am skeptical that Tesla will release anything other than emergency stopping at red lights and stop signs in the near future.

There's no hint as yet of any stopping at stoplights and stop signs, emergency or otherwise, any time soon. Maybe some time in 2020 or 2021, HW3/FSD will get to stopping at lights and stop signs as a Level 2 (usually works, but keep-your-damn-eyes-on-the-road) feature.

I think that within ten years I'll be able to buy a Tesla that will allow me to let my attention wander when I'm on the highway, and will navigate turns and stop signs and stop lights in the city at Level 2 (constant driver attention needed). A year ago I thought that robotaxi-level FSD was a decade away. Now I fear it may be two.
 
  • Like
Reactions: pilotSteve
I think that within ten years I'll be able to buy a Tesla that will allow me to let my attention wander when I'm on the highway, and will navigate turns and stop signs and stop lights in the city at Level 2 (constant driver attention needed). A year ago I thought that robotaxi-level FSD was a decade away. Now I fear it may be two.

I get that people are skeptical of Tesla achieving true FSD but 20 years, really?!? I seriously doubt that it will take Tesla that long. 5-10 years, but not 20 years.
 
Do we stick with thirty-thousand deaths per year (human drivers) just because FSD will never get us to zero deaths?
I don't think it's necessary to have an increase in deaths before we have a decrease. If the feature saves lives when used by real people that's great. I'm just skeptical that it will, especially when compared to a system that only does emergency stops (i.e. very uncomfortable with lots of beeping).
 
  • Like
Reactions: APotatoGod
I think that quote is misattributed...

Yep. He attributed my words to Daniel in SD. Probably an editing error since I quoted Daniel in SD in the same post where I said what Diplomat33 quoted.

I get that people are skeptical of Tesla achieving true FSD but 20 years, really?!? I seriously doubt that it will take Tesla that long. 5-10 years, but not 20 years.

Autonomous driving is proving to be a much more difficult nut to crack than its proponents (including me) imagined a couple of years ago. We'll get gradual improvements and new features, but they will continue to require constant driver attention for a very long time to come.

I don't think it's necessary to have an increase in deaths before we have a decrease. If the feature saves lives when used by real people that's great. I'm just skeptical that it will, especially when compared to a system that only does emergency stops (i.e. very uncomfortable with lots of beeping).

I do not believe that there will be an increase in deaths with autopilot. I feel EAP makes me a safer driver already. However, there will be deaths that would not have happened before. That's the dilemma: You'll save ten lives while killing one person who would not have died otherwise. On another forum I frequent there is an occasional visitor who insists that this is unacceptable. If a machine will kill one person, he thinks it should not be permitted, even if it saves twenty thousand. But it's unavoidable: Computers do not make the kind of mistakes that people make, but they make other mistakes that people don't make.
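
As a rough illustration of that dilemma, using the thirty-thousand baseline and the hypothetical ten-to-one ratio from the discussion above (the fraction of deaths prevented is an assumption purely for the sake of argument):

```python
# Rough illustration of the "save ten, kill one" dilemma. The 30,000 baseline
# and the 10:1 ratio come from the discussion above; the prevented fraction is
# an assumption for illustration, not a real statistic.

baseline_deaths = 30_000          # approximate annual US road deaths, human drivers
lives_saved_per_new_death = 10    # hypothetical ratio from the post

prevented = 10_000                # assumed deaths prevented by automation
newly_caused = prevented / lives_saved_per_new_death

net = baseline_deaths - prevented + newly_caused
print(f"Prevented: {prevented}, newly caused: {newly_caused:.0f}, net: {net:.0f}")
# ~21,000 deaths vs. 30,000 today: fewer overall, but 1,000 of them are new.
```

Fewer people die in total, yet a thousand of those deaths would not have happened without the machine, which is exactly the trade-off some people find unacceptable.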

Responsible automakers (which I think Tesla is) will release features only when they make the car safer. But they can never be 100% safe. And responsible regulators will authorize systems when and only when they are safer than human drivers. But you never get to 100%.

When my father was in his 80s he was driving on a freeway in L.A. with me and my sister in the car, and he started to drift into the adjacent lane where there was another car. I told him, as calmly as I could, "You're drifting into the next lane," and he corrected. That was a very close call. This was 15 years ago. If he'd been driving my Model 3 (which was more than a decade in the future) that would not have happened. AP/EAP has already made the roads a little safer, and that will continue. But robotaxi-level FSD is ten to twenty years away and I fear I will not live to see it.
 
Autonomous driving is proving to be a much more difficult nut to crack than its proponents (including me) imagined a couple of years ago. We'll get gradual improvements and new features, but they will continue to require constant driver attention for a very long time to come.

Do you mean for Tesla or in general? If you are speaking in general, I am not sure I would agree because Waymo has L4 autonomy now and has robotaxis operating now with no safety driver at all in geofenced areas. So I would say that Waymo at least has certainly cracked autonomous driving, albeit only in geofenced areas.

If you are speaking for Tesla, I might agree more, since they are trying to achieve autonomous driving with less hardware than Waymo. Certainly, with Tesla's hardware, FSD is more difficult. And I would definitely agree that we will see gradual improvements and features from Tesla that will require driver attention for quite some time, although I remain more optimistic about how long. I don't think Tesla will need to keep requiring driver attention for 20 years; I think Tesla will reach the point where FSD can drop the driver-attention requirement sooner than that.
 
  • Like
Reactions: Jb1280
Do you mean for Tesla or in general? If you are speaking in general, I am not sure I would agree because Waymo has L4 autonomy now and has robotaxis operating now with no safety driver at all in geofenced areas. So I would say that Waymo at least has certainly cracked autonomous driving, albeit only in geofenced areas.

If you are speaking for Tesla, I might agree more, since they are trying to achieve autonomous driving with less hardware than Waymo. Certainly, with Tesla's hardware, FSD is more difficult. And I would definitely agree that we will see gradual improvements and features from Tesla that will require driver attention for quite some time, although I remain more optimistic about how long. I don't think Tesla will need to keep requiring driver attention for 20 years; I think Tesla will reach the point where FSD can drop the driver-attention requirement sooner than that.

I think that Waymo's geofenced system knows all the streets in its area. It knows where all the stop signs and stoplights are, all the speed limits, all the turn lanes, etc. It doesn't have to read signs because it knows what they all say. (I don't know this for a fact, but I presume this is the case.) Tesla's goal is to operate anywhere. Since Maui is never likely to be hard-wired into anybody's autonomous car, I'm really only interested in a general-purpose non-geofenced system.
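
For what it's worth, the architectural difference being described can be sketched conceptually like this. Everything in the sketch is invented for illustration; it is not Waymo's or Tesla's actual design, and the assumption about Waymo's map is the same presumption made above:

```python
# Conceptual contrast: prior-map lookup vs. live perception. All names and
# structures are invented for illustration, not any vendor's real architecture.

# Geofenced, map-prior style: the controls at each intersection are already
# known, so nothing has to be read from the cameras.
hd_map = {
    ("main_st", "1st_ave"): {"control": "traffic_light", "speed_limit_mph": 35},
    ("main_st", "2nd_ave"): {"control": "stop_sign", "speed_limit_mph": 25},
}

def control_from_map(intersection):
    """Answer comes from prior knowledge of this exact place."""
    return hd_map[intersection]["control"]

def control_from_perception(camera_frame):
    """General-purpose style: the answer must be inferred live from whatever
    the cameras see, anywhere in the world."""
    raise NotImplementedError("stand-in for an onboard vision stack")

print(control_from_map(("main_st", "2nd_ave")))  # -> stop_sign, with no sign reading at all
```

The map-prior approach only works where someone has built and maintains the map, which is the geofencing trade-off; the perception-only approach works anywhere its detector works, including Maui, but has to solve a much harder problem.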

But you raise another point I've been saying for a long time: Tesla is trying to do the job with inadequate sensor hardware. I predict that before Tesla achieves robotaxi-level FSD it will have to buckle under, refund the FSD price people paid before they moved the goalposts, and install considerably more-sophisticated sensors.
 
  • Informative
Reactions: pilotSteve
... Waymo has L4 autonomy now and has robotaxis operating now with no safety driver at all in geofenced areas. So I would say that Waymo at least has certainly cracked autonomous driving, albeit only in geofenced areas. ...
I'm not so bullish. I hope it is true, but Waymo has been saying this for several years and it hasn't been true. Waymo has shown they are good at marketing stunts. I'll wait until independent videos show up readily on YouTube before I believe it.
 
The right thing to do right now would be for Tesla to refund the money with interest to all those who paid for robotaxi-level FSD, and let them keep the highest-level software features that their hardware can run, as compensation for the misleading promises. Then Tesla should admit that even Level 3 driver-assist features are an unknown time in the future and that what they are selling is a slowly expanding suite of useful features that for the foreseeable future will require constant driver attention.

Such features are immensely useful and when used properly make the car safer than it would be without them. EAP is worth every penny I paid for it. But Tesla over-promised on the FSD package, due to Musk's overly optimistic assessment of what was possible, and then instead of admitting that and offering refunds, he just moved the goalposts and promised a computer upgrade which now seems so difficult to accomplish that some cars will reach the end of their useful life without ever getting it or the promised features.

Musk did say that FSD would be dependent on development and approval of the software, but IMO there was an implied promise that it would come within the time that the typical new-car buyer keeps the car. At some reasonable time point Tesla owes those people a refund, or, at the owner's discretion, the right to transfer the FSD package to their next car.
 
I think that Waymo's geofenced system knows all the streets in its area. It knows where all the stop signs and stoplights are, all the speed limits, all the turn lanes, etc. It doesn't have to read signs because it knows what they all say. (I don't know this for a fact, but I presume this is the case.) Tesla's goal is to operate anywhere. Since Maui is never likely to be hard-wired into anybody's autonomous car, I'm really only interested in a general-purpose non-geofenced system.

But you raise another point I've been saying for a long time: Tesla is trying to do the job with inadequate sensor hardware. I predict that before Tesla achieves robotaxi-level FSD it will have to buckle under, refund the FSD price people paid before they moved the goalposts, and install considerably more-sophisticated sensors.

Yup because it was clearly the HD Map that stopped the car from plowing through 15+ kids crossing the street.
Shawn on Instagram: “Driving through the neighborhood and saw a #waymo with no driver. It was waiting for kids to clear the crosswalk and had the windows rolled…”
 
  • Funny
Reactions: AlanSubie4Life
Tesla is often quick to release updates with features that are not perfected yet. With NoA on highways, it is a major news event every time a Tesla driver hurts or kills himself while using the beta feature.
Automatic city driving has the potential to backfire severely, because it is no longer primarily the Tesla drivers themselves who are at risk; everybody else is put in jeopardy by Tesla drivers who rely too much on the system.
How many accidents with pedestrians, bicycles, and kids will it take before NoA is restricted by authorities? In the EU the tolerance will be very small, and senators in the US are already preparing their case.

For “automatic city driving” I hope Tesla will take the time it needs to truly perfect it before release - even if “end of year” turns into “end of 2020”.
 
... For “automatic city driving” I hope Tesla will take the time it needs to truly perfect it before release - even if “end of year” turns into “end of 2020”.
The worse it is, the better for all: that way people will learn to use it as intended, under supervision. The opposite is also true: the better it is, the worse for everyone, because people will trust it. It will take 3+ years to perfect it, that is, to make it truly driverless in city environments.
 
  • Disagree
Reactions: SandiaGrunt
Tesla is often quick to release updates with features that are not perfected yet. With NoA on highways, it is a major news event every time a Tesla driver hurts or kills himself while using the beta feature.
Automatic city driving has the potential to backfire severely, because it is no longer primarily the Tesla drivers themselves who are at risk; everybody else is put in jeopardy by Tesla drivers who rely too much on the system.
How many accidents with pedestrians, bicycles, and kids will it take before NoA is restricted by authorities? In the EU the tolerance will be very small, and senators in the US are already preparing their case.

For “automatic city driving” I hope Tesla will take the time it needs to truly perfect it before release - even if “end of year” turns into “end of 2020”.

The other side of that question is: how many accidents with pedestrians, bicycles, and kids will be prevented by NoA in the city, as the car stops for things the driver didn't see, or didn't see in time? Of course, the PR side of this is that an avoided accident is never reported, but an accident with a new technology is reported in every news outlet in the world and talked about for weeks.

However, NoA in the city is very far away from being introduced. First we'll see years where there are warnings but the car does not stop, so the driver knows s/he has to.
 
The other side of that question is: how many accidents with pedestrians, bicycles, and kids will be prevented by NoA in the city, as the car stops for things the driver didn't see, or didn't see in time? Of course, the PR side of this is that an avoided accident is never reported, but an accident with a new technology is reported in every news outlet in the world and talked about for weeks.

However, NoA in the city is very far away from being introduced. First we'll see years where there are warnings but the car does not stop, so the driver knows s/he has to.
I don't see how it would prevent more accidents than the same technology applied to automatic emergency braking.
It seems like proving that city NoA improves safety will be very tricky. When the inevitable accidents happen, Tesla is going to need solid statistical evidence that it improves safety overall. So far misuse of Autopilot has only injured or killed the driver; it will be very different when a third party is injured or killed.
 
I don't see how it would prevent more accidents than the same technology applied to automatic emergency braking. ...
AEB (automatic emergency braking) doesn't work well in general; Tesla scores the highest in independent tests. It is true that AEB is key: if nothing is moving, there won't be any accidents. Where FSD takes it up a notch is, for example, left turns. Is AEB supposed to stop if the left turn isn't safe? That technology isn't AEB, it is FSD.
AEB won't stop people from running stop signs and traffic lights; FSD will.
FSD will have a calming effect: people driving on FSD won't go into rage mode or impatient mode. FSD will keep traffic running more smoothly on freeways, rather than speeding up and then panic stopping.
FSD will prevent many accidents because it won't have the limitations of AEB, such as trouble detecting stopped objects.
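
To make the distinction concrete, here is a conceptual sketch of the difference between the two behaviors. The thresholds and function names are invented for illustration and are not any vendor's actual logic:

```python
# Conceptual contrast between reactive AEB and a proactive FSD-style planner.
# Thresholds and structure are invented for illustration only.

def aeb_should_brake(time_to_collision_s: float) -> bool:
    """AEB style: a reactive last resort that fires only when a crash is imminent."""
    return time_to_collision_s < 1.5  # assumed trigger threshold

def planner_should_turn_left(light_state: str, oncoming_gap_s: float) -> bool:
    """FSD style: a proactive decision, e.g. only take an unprotected left
    on a green light with a comfortable gap in oncoming traffic."""
    return light_state == "green" and oncoming_gap_s > 6.0  # assumed comfort margin

# AEB says nothing about whether a left turn is a good idea; it only reacts
# once a collision is already likely. The planning decision happens earlier.
print(aeb_should_brake(time_to_collision_s=4.0))                          # False: no emergency yet
print(planner_should_turn_left(light_state="green", oncoming_gap_s=3.0))  # False: gap too small
```

The same distinction applies to stopped objects: a planner that tracks the whole scene can slow down well before any emergency-braking threshold is reached.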
 
  • Funny
Reactions: AlanSubie4Life