
What will happen within the next 6 1/2 weeks?

Which new FSD features will be released by end of year and to whom?

  • None - on Jan 1 'later this year' will simply become end of 2020!

    Votes: 106 55.5%
  • One or more major features (stop lights and/or turns) to small number of EAP HW 3.0 vehicles.

    Votes: 55 28.8%
  • One or more major features (stop lights and/or turns) to small number of EAP HW 2.x/3.0 vehicles.

    Votes: 7 3.7%
  • One or more major features (stop lights and/or turns) to all HW 3.0 FSD owners!

    Votes: 8 4.2%
  • One or more major features (stop lights and/or turns) to all FSD owners!

    Votes: 15 7.9%

  • Total voters: 191
If “automatic city driving” is equivalent to NOA then it’s worse than borderline useless

We don't know that. Yes, Tesla will use the NOA system (the single blue line on the screen) for "automatic city driving", but that does not necessarily mean it will be just like the current NOA on the highway. In fact, we know that "automatic city driving" will require a lot of extra capabilities that the current highway NOA does not have. So "city NOA", if we want to call it that, will necessarily have to be better than "highway NOA".
 
  • Like
Reactions: APotatoGod
In a sense, we are all beta testers for FSD right now, since NOA, Smart Summon, etc. are labelled as FSD features and are still considered beta after they are released to the public.

But if you mean, a beta tester for the full FSD that is still in development and not released yet, that is only to "Early Access" members. "Early Access" is by invitation only, meaning Tesla has to pick you. And as far as we know, Tesla is not inviting any new members into the early access program at this time.
Thank you, I did not know.
 
@verygreen tweeted this:

"as I keep wading through the firmware, I just found "CityStreetsBehavior" code that's all new, so there does appear to be some rushing in that area." green on Twitter

This is concrete proof that Tesla is moving closer to releasing "automatic city driving".

Keep reading his tweet stream, because this is a cherry-pick, to put it lightly. He's been pretty clear about the problems he's seeing, and the shortcomings as well, especially with stop signs/lights that are physically present but not listed in map data. This is a dangerous way to approach FSD. If the car will gladly sail right through an intersection because a map doesn't list a stop sign or light, someone's going to get T-boned.

As I've said before, if Tesla's idea here is to release NoA on surface streets, this is a big misstep. We don't yet know what citystreetsbehavior is or does, but we've got 4 weeks before FSD is "feature complete" and I'm still predicting end of year will be a miss.
 
Keep reading his tweet stream, because this is a cherry-pick, to put it lightly. He's been pretty clear about the problems he's seeing, and the shortcomings as well, especially with stop signs/lights that are physically present but not listed in map data. This is a dangerous way to approach FSD. If the car will gladly sail right through an intersection because a map doesn't list a stop sign or light, someone's going to get T-boned.

As I've said before, if Tesla's idea here is to release NoA on surface streets, this is a big misstep. We don't yet know what citystreetsbehavior is or does, but we've got 4 weeks before FSD is "feature complete" and I'm still predicting end of year will be a miss.

Tesla will definitely miss their end-of-year deadline for "feature complete".

I am also well aware of the rest of verygreen's tweet about the problems that AP currently has. I think Tesla is also well aware of the unacceptable dangers of releasing software where the car could run a red light. That is why they have not released traffic light response yet. Tesla is waiting until the software is good enough where it won't run a red light before releasing it.

I am sure Tesla will update the maps for the final release version of traffic light response. Tesla is not going to release traffic light response based on incomplete maps; that would indeed be dangerous and ludicrous. Tesla will update the maps and also perfect camera vision. When Tesla releases traffic light response, it will be based on reliable camera vision and correct maps.

But I think the name "CityStreetsBehavior" does hint that Tesla is testing at least a piece of "automatic city driving". My guess is that the code probably relates to some unspecified AP behavior on city streets. Tesla is testing that part of the feature. And we know Tesla is also testing traffic light response but it is not good enough yet. When everything is good enough and safe enough, then Tesla will release it.
 
My main point is that this fear that some of you have that Tesla will release traffic light response that relies on bad map data so that the car will plow through red lights, is completely unfounded IMO. Tesla is not going to release traffic light response with such an obviously dangerous flaw. If traffic light response does rely on map data, Tesla would need to make sure the map data was accurate. But we know that Tesla is working to get camera vision perfect where the car will correctly see red lights without needing map data.
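To make the point concrete, here is a minimal sketch (Python, purely illustrative; none of these names come from Tesla's firmware and this is not a claim about how Autopilot is actually structured) of the kind of conservative fusion rule being argued for here: a confident camera detection is enough to trigger a stop even when the map has no record of a signal, and a map entry is enough even when the camera is unsure.

```python
# Hypothetical illustration only -- not Tesla code. The point of the argument
# above: neither a map gap nor an uncertain camera should be able to veto a stop.
from dataclasses import dataclass
from typing import Optional

@dataclass
class IntersectionObservation:
    map_says_control: Optional[bool]      # True/False from map data, None = no coverage
    vision_says_control: Optional[bool]   # True/False from the camera NN, None = uncertain
    vision_confidence: float = 0.0        # camera confidence in [0, 1]

def should_prepare_to_stop(obs: IntersectionObservation,
                           vision_threshold: float = 0.9) -> bool:
    """Begin slowing if EITHER source reports a stop sign/light ahead."""
    if obs.map_says_control:
        return True
    if obs.vision_says_control and obs.vision_confidence >= vision_threshold:
        return True
    # If neither source can rule a control in or out, err on the side of caution.
    if obs.map_says_control is None and obs.vision_says_control is None:
        return True
    return False
```

The disagreement in this thread is essentially over which of those two inputs Tesla will allow to act as the sole trigger, and how good each has to be before release.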
 
  • Like
Reactions: APotatoGod
My main point is that this fear that some of you have.....is completely unfounded IMO.

...we know that Tesla is working to get camera vision perfect where the car will correctly see red lights without needing map data.
I believe a more accurate way to write this is that you are contrasting your hope with their fear. You are most certainly not contrasting your knowledge with their fear. Indeed, if anything, you are contrasting your hopes with their experiences.
 
I believe a more accurate way to write this is that you are contrasting your hope with their fear. You are most certainly not contrasting your knowledge with their fear. Indeed, if anything, you are contrasting your hopes with their experiences.

They seem to be basing their argument on the fact that the current AP is using bad map data, and on reports that the alpha traffic light response is not reliable yet. And they are assuming Tesla will release traffic light response in basically that same state. Why would Tesla release traffic light response in a state where it could run red lights and be so obviously dangerous? That makes no sense.
 
Tesla is extremely safety-conscious. That's why they design all their cars to get 5 stars in every safety category. They won't release stoplight recognition until it's more accurate than a very good driver. This means they won't release it for a very long time yet. Maybe a warning when it thinks you're about to run a red light. But not a feature where the car is expected to respond correctly.

EAP is still a beta feature all these years later. Even HW3 is not going to make EAP reliable enough to remove the beta label and allow hands off the wheel & eyes off the road. Elon underestimated the difficulty of the task. But Tesla will eat the bad publicity for missed deadlines and unfulfilled promises before they release an unsafe version of autopilot or self-driving.

I love my Model 3 with EAP. It does everything they said it would do when I bought it: It stays in its lane and adjusts its speed to traffic. Unless I want to go more than 5 mph over the limit. Then I have to steer. I sympathize with the people who thought, three years ago, that for $6,000 (?) they'd have true full-self-driving in a few months. All I can offer in consolation is that Tesla will not release unsafe software. We're driving the safest cars on the road, and that's my top priority. I'll pay for self driving when it's available. It will cost me more, but I won't get angry because I paid for something that finally arrives when my car is ready to be recycled.
 
My main point is that this fear that some of you have that Tesla will release traffic light response that relies on bad map data so that the car will plow through red lights, is completely unfounded IMO. Tesla is not going to release traffic light response with such an obviously dangerous flaw. If traffic light response does rely on map data, Tesla would need to make sure the map data was accurate. But we know that Tesla is working to get camera vision perfect where the car will correctly see red lights without needing map data.

They released "smart" summon that relied on exactly this, and it has had everything from hilarious results to destructive and dangerous results. Including someone's car driving off the edge of a lot and beaching itself. That kind of ignoring visual data is, IMO, dangerous and shows Tesla is playing fast and loose with these cars.

Deadline pressures force unnecessary errors that cause pilots to crash planes. There's no reason to think accidents won't happen with EAP/NoA/FSD.
 
  • Like
Reactions: am_dmd and emmz0r
I'm curious to see some Tesla owners track the compute load on their FSD Computers. In April, Elon tweeted:

“The Tesla Full Self-Driving Computer now in production is at about 5% compute load for these tasks [i.e. Navigate on Autopilot] or 10% with full fail-over redundancy”
The same day, Elon also tweeted that the compute load on HW2.5 was “~80%”.

Presumably Tesla wants to use most if not all of the FSD Computer's compute. Before the city driving features go to wide release, presumably Tesla will want to run them passively, i.e. in shadow mode. So, if Tesla owners with HW3 can track the compute load in their cars, we should be able to tell when the city driving features are released in shadow mode.
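There is no official interface that exposes the FSD Computer's load, so this would take the kind of root access verygreen has. But if someone could log load samples over time, spotting a shadow-mode rollout would just be a matter of looking for a sustained step change. A rough Python sketch, with the log format entirely assumed:

```python
# Assumes a hypothetical log of "timestamp,load_pct" lines gathered from a
# rooted car -- no such public interface exists. Illustrative only.
import csv
from statistics import mean

def detect_load_step(path: str, window: int = 50, jump_pct: float = 15.0):
    """Return (timestamp, avg_before, avg_after) at the first point where the
    average compute load rises by more than jump_pct percentage points versus
    the preceding window -- the kind of sustained jump you'd expect if much
    larger networks started running in shadow mode. Returns None if no jump."""
    samples = []
    with open(path) as f:
        for timestamp, load_pct in csv.reader(f):
            samples.append((timestamp, float(load_pct)))
    for i in range(window, len(samples) - window):
        before = mean(v for _, v in samples[i - window:i])
        after = mean(v for _, v in samples[i:i + window])
        if after - before > jump_pct:
            return samples[i][0], before, after
    return None
```

A sustained move up from the ~5-10% Elon quoted would be consistent with bigger networks quietly running in the background, even before any new features show up in the UI.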

I suppose it's possible the city driving features will be like Summon and will be able to run on HW2. But my guess is that these features will accompany much larger neural networks that use a lot more compute.

In October 2018, Karpathy said:

“...my team trains all of the neural networks that analyze the images streaming in from all the cameras for Autopilot. For example, these neural networks identify cars, lane lines, traffic signs, and so on. The team is incredibly excited about the upcoming upgrade for the Autopilot computer which Pete [Bannon] briefly talked about.

This upgrade allows us to not just run the current neural networks faster, but more importantly, it will allow us to deploy much larger, computationally more expensive networks to the fleet. The reason this is important is that, it is a common finding in the industry and that we see this as well, is that as you make the networks bigger by adding more neurons, the accuracy of all their predictions increases with the added capacity.

So, in other words, we are currently at a place where we trained large neural networks that work very well, but we are not able to deploy them to the fleet due to computational constraints. So, all of this will change with the next iteration of the hardware. And it's a massive step improvement in the compute capability. And the team is incredibly excited to get these networks out there.”
It's been over a year and, as far as I know, we haven't seen new neural networks running on HW3 that are too computationally intensive to run on HW2. I would guess that's what's supposed to be coming with the city driving features.
 
  • Informative
Reactions: APotatoGod
They released "smart" summon that relied on exactly this, and it has had everything from hilarious results to destructive and dangerous results. Including someone's car driving off the edge of a lot and beaching itself. That kind of ignoring visual data is, IMO, dangerous and shows Tesla is playing fast and loose with these cars.

Deadline pressures force unnecessary errors that cause pilots to crash planes. There's no reason to think accidents won't happen with EAP/NoA/FSD.

I figured someone would bring up Smart Summon. But I don't think it is the same thing at all. The incidents with Smart Summon were minor, and none of them were as serious as "running a red light".

Sure, accidents will happen with City FSD. That is inevitable with any new technology. But Tesla will certainly try to minimize them as much as possible by making it as reliable as possible and also encouraging the driver to pay attention at all times.
 
  • Like
Reactions: APotatoGod
AFAIK they don't even read speed limit signs, because they cheaped out on paying Mobileye for the patent. So what they do now is try to rely on map data. Tesla just needs to bite the bullet, pay Mobileye, and get it done.
 
Question: Is NoA/city worth having if it's a beta feature like Autopilot today, and the driver is still responsible for over-riding it when it cannot handle a situation? Autopilot on the highway works as a beta/Level-2 feature because on the highway I can see situations that may be hard for the car well before they arrive. But if the car is supposed to stop for red lights and stop signs, with less than human-level errors (both false positives and false negatives) I don't see how a human driver could intervene quickly enough if the car makes a mistake. Right now, if a car approaching from the other direction makes a left turn in front of me, my car will brake even if it's so far away that there's absolutely no need for it. I can imagine NoA/city sitting in a left-turn lane (or worse yet, the left lane where there's no turn lane) waiting to turn and refusing to do so until there's no oncoming car within a hundred yards.

I just don't see them having the software for the promised features at above Level 2 any time soon, and I think that NoA/city has to be at least Level 3 to be of much use.

If HW3 gives the car the ability to go hands-free and eyes-off-the-road for highway autopilot, I will trade in mine for the new one. I don't see how a useful, functioning NoA/city can come before Level 3 autopilot on the highway.

The biggest issue I have with EAP now is that we have highways where the flow of traffic is always 15 mph over the posted limit, and autosteer won't work at more than 5 mph over. Our posted limits are far below the safe speed for the roads, and are not enforced. Except by my car when I engage autosteer.
 
Question: Is NoA/city worth having if it's a beta feature like Autopilot today, and the driver is still responsible for over-riding it when it cannot handle a situation? Autopilot on the highway works as a beta/Level-2 feature because on the highway I can see situations that may be hard for the car well before they arrive.

I expect that we'll first see Level 2 "feature complete full self driving" and later Level 3. The first phase will include the risk of not spotting a stop sign or negotiating a turn well, so it's up to you to keep tabs on it, but should do pretty well. The second phase will be Elon's definition of "It's perfect, I'm just waiting for the regulators" but in reality will still feature the car getting overwhelmed or uncertain of the situation and demanding you take over as it occasionally does today. During this phase, I expect the nags to go away and you'd reasonably be able to expect to not have to pay attention until the car tells you to, but with some risk since you'd have to ascertain your surroundings and react quickly. I expect it to live in that state for years before it's actually capable of driving on its own in most scenarios, much less approved by regulators to do so. But I'd be happy with that level and think it's fair to call it "full self driving" - it will drive itself, except when it can't. The leap from Level 2 to 3 seems imminent to me. The leap from 3 to 4 is massive.
 
  • Informative
Reactions: APotatoGod
Question: Is NoA/city worth having if it's a beta feature like Autopilot today, and the driver is still responsible for over-riding it when it cannot handle a situation?

I think that will largely depend on how good the feature is and the type of driving you currently do, just like there are folks now who find NOA very useful while others do not. Obviously, if you need to intervene a lot, it probably won't be very useful to you. But if "automatic city driving" is able to handle your type of daily driving with little to no intervention, then it will be more useful.

Autopilot on the highway works as a beta/Level-2 feature because on the highway I can see situations that may be hard for the car well before they arrive. But if the car is supposed to stop for red lights and stop signs, with less than human-level errors (both false positives and false negatives) I don't see how a human driver could intervene quickly enough if the car makes a mistake.

It's definitely doable to intervene in time if you are paying attention and thinking ahead. For example, when approaching a stopped car at a red light, AP used to be bad at braking in time. That problem has since been fixed, but when it was a problem, I did not wait to see if AP would brake in time. While I was still well back from the stopped car, if AP did not start braking when I would normally start braking in that situation, I disengaged AP and took control. The same will be true for a red light: if you are approaching a red light and AP does not start to brake when a normal driver would, you immediately disengage. You don't wait to see whether AP will really stop or not. In other words, you intervene as soon as AP deviates from what you would normally have done.

And there might be some instances where you disengage early because you are not sure that AP can handle them. For example, if you see that you are coming up to an intersection with a busted traffic light, disengage before you even get close to the intersection. Don't wait until the last minute to see if "automatic city driving" can handle that situation. If you are paying attention, you will have plenty of time to spot a busted traffic light before it is time to disengage.

Right now, if a car approaching from the other direction makes a left turn in front of me, my car will brake even if it's so far away that there's absolutely no need for it. I can imagine NoA/city sitting in a left-turn lane (or worse yet, the left lane where there's no turn lane) waiting to turn and refusing to do so until there's no oncoming car within a hundred yards.

This problem has largely been solved in my driving experience. I find that AP does not brake anymore in those instances when it does not need to brake.

I just don't see them having the software for the promised features at above Level 2 any time soon, and I think that NoA/city has to be at least Level 3 to be of much use.

If HW3 gives the car the ability to go hands-free and eyes-off-the-road for highway autopilot, I will trade in mine for the new one. I don't see how a useful, functioning NoA/city can come before Level 3 autopilot on the highway.

I remain optimistic that HW3 will be a quantum leap in capability.

The sad thing is that they could already do this, but the patent is stopping them because Mobileye described it so well that it's hard to weasel around.

I do think part of the problem is that Tesla insists on reinventing the wheel by building their own in-house NN that will do everything. I am pretty sure there are commercially available, off-the-shelf NNs for reading speed limit signs, traffic lights, etc. that Tesla could have simply purchased and plugged in; a rough sketch of what such a network looks like is below. But Tesla wants to do everything in house. It's like the issue with the auto wipers: Tesla could have simply purchased a rain sensor, but instead they insisted on reinventing the wheel with their own in-house "deep rain" NN.
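For a sense of how commoditized this is: sign classification is a standard small-CNN problem on public datasets like GTSRB (43 sign classes). Here is a generic PyTorch sketch of the kind of network involved; it is illustrative only, untrained, and obviously not what Tesla (or Mobileye) actually ships.

```python
# Generic traffic-sign classifier sketch -- not Tesla code.
# The 43 classes are just the public GTSRB benchmark's sign categories.
import torch
import torch.nn as nn

NUM_SIGN_CLASSES = 43

class SignClassifier(nn.Module):
    def __init__(self, num_classes: int = NUM_SIGN_CLASSES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                  # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                  # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# Forward pass on a dummy 32x32 sign crop, just to show the interface.
model = SignClassifier()
logits = model(torch.randn(1, 3, 32, 32))
probs = logits.softmax(dim=1)   # per-class probabilities (meaningful only after training)
print(probs.shape)              # torch.Size([1, 43])
```

A network of this size can be trained quickly on a public dataset, which is the "purchase it and plug it in" point being made above.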
 
  • Like
Reactions: APotatoGod