Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

FSD on city streets coming later this year with reliability far in excess of human drivers!

Really? :)

  • Yes

    Votes: 37 (18.7%)
  • No

    Votes: 161 (81.3%)

  • Total voters: 198
GM and Chrysler were bailed out in the form of government financing, to the combined tune of $84 billion, to help them sail smoothly through bankruptcy. I don't expect Tesla would get any government support, so it may not be so smooth for them.

Source: Effects of the 2008–10 automotive industry crisis on the United States - Wikipedia
True but we're not in the middle of the biggest recession of all time (yet). There was no private capital to fund the GM and Chrysler bankruptcies. Right now I believe there would be. Companies go bankrupt all the time without going out of business.
It's clear what they mean is that they expect to enable those features later this year but with the requirement that the driver is paying attention with hands on the wheel at all times enforced by nag. Then only once they have improved the reliability and are able to show the data to regulators to prove that it is safer than a human will it have a chance of being approved for use without hands on the wheel at all times.

This is, for once, a realistic strategy for Tesla. I don't expect we'll see hands-free city driving for a few more years at the earliest.
I don't think their wording is clear at all but I agree that that is what they're planning to do. I also don't think what they're planning to do is legal in California but we'll see!
 
It’s gonna be kind of weird to have Nav on Autopilot working except for the stop lights and such...

I hope you are wrong but it’s obviously wishful thinking because I really don’t want to spend another $3k on my car, especially considering that they slashed the price across the board by $3k or something... and I already spent a fortune on my car.
Not really. IIRC, EAP was supposed to do on-ramp to off-ramp driving, which doesn’t really need stop sign or traffic light recognition. At least in my area, I can’t recall any freeways with traffic lights. There are a few that end by dumping out onto a traffic light, but EAP pulls to a stop at the end of off ramps right now; no reason it can’t stop at those traffic lights based on navigation data. AP was never really meant to be used on local roads.
 
I’m fine with Tesla’s stretch goals and constantly missed deadlines. They continue to make progress, faster than others, and they eventually work out the kinks of their constant beta releases. The poll results show most expect them to miss the end-of-year date, but so what if drivers still have to pay attention for a couple more years? It’s great they are even in a position to start talking about stopping for lights and other local-road features in the next couple of years, and that our used cars will get those features. Fun to watch the process play out.
 
Something to keep in mind: human drivers are truly awful in extremely basic situations. Even if FSD's only feature was forcing the car to stop at a red light, it would be instantly safer than the average driver. So, while I don't think FSD is going to immediately be revolutionary, the bar being set is very low.

It should be trivial to be safer than a human on surface streets. Stop for stop signs, stop for red lights, stop for pedestrians. Those three things alone account for an astonishing number of collisions every year.
 
Yeah, if FSD is implemented as a safety feature it would probably make the car safer. That's not what Tesla is proposing, though. On its own autopilot is way less safe than a human driver. Luckily the consequences for using autopilot improperly are usually crashing into a firetruck, a jersey barrier, or a gore point. As far as I know, Autopilot has never hurt any bystanders. Using "Automatically driving on city streets" mode improperly will result in blowing through a red light and killing someone. Also, it's much more difficult for the "driver" to determine whether "Automatically driving on city streets" mode is going to stop for any given red light. I think there are many drivers who won't be able to realize in time that the system is not going to stop, and correct it. I'm sure there are a million scenarios that we're not even thinking of. That's why regulations require that autonomous vehicle test drivers be trained.
 
Driving through Brooklyn & Manhattan today I was reminded of my going theory - FSD will work here about 5yrs after “city street FSD” claims to work elsewhere.

As it is, EAP + NoA on city highways (FDR, BQE, etc.) is about a 50/50 proposition, whereas once I get about 15 miles outside the city it's more like 95/5.
 
On its own autopilot is way less safe than a human driver. Luckily the consequences for using autopilot improperly are usually crashing into a firetruck, a jersey barrier, or a gore point.

These were all directly related to how radar imaging works, though. That's not a concern when using cameras to see places you need to stop.

Using "Automatically driving on city streets" mode improperly will result in blowing through a red light and killing someone.

That's the thing though. If it sees a red light with the camera, it sees a red light. There isn't ambiguity with the camera system generally speaking, and it doesn't rely on the light moving to be able to see it. This is one of the reasons that Doppler radar for driving is good as a backup but not a primary sensor. Though, I think Tesla's "cameras only" idea is a truly terrible one.

Also, it's much more difficult for the "driver" to determine whether "Automatically driving on city streets" mode is going to stop for any given red light.

At a guess, it would behave like approaching slow traffic on the highway at the minimum. In reality we don't know how it behaves yet. Elon and the Tesla reps on earnings calls all say there have been major improvements to EAP alone, so we'll have to wait and see how much those improvements have improved the system overall.

I think there are many drivers who won't be able to realize in time that the system is not going to stop, and correct it. I'm sure there are a million scenarios that we're not even thinking of. That's why regulations require that autonomous vehicle test drivers be trained.

Yeah, at the end of the day this system will be judged by its worst user, and we all know there are some truly terrible users out there. I have plenty of concern about semi-autonomous modes on surface streets, where reaction time and takeover time need to be significantly quicker. But we haven't seen the system behaving yet, so we can't really say whether it will work or not.
 
That's the thing though. If it sees a red light with the camera, it sees a red light.
Is it really that simple though?

[attached image: 0B1tBd6.jpg]
 
...what?

I mean, having a professional driver drive the car for you is "safer" too but they're not gonna include one of those free either.


EAP is explicitly not intended for use in places where stop signs and stop lights exist... it's only intended for divided highways without things like intersections.

So thinking that handling those things in EAP should be included in the feature is nonsensical.




Then don't buy it.



Uh, except most buyers did buy it... roughly 70-80% of them.

Tesla didn't "slash" autosteer- they removed a bunch of features from EAP and moved them to FSD, then repackaged the 2 oldest/most basic EAP features (TACC and autosteer) as a cheaper offering.

If you still want 3 of the 5 original EAP features, that move means they'll cost you more.
Great... let’s see though. So far, FSD has brought nothing... and some people bought it over 2 years ago, some on leases... some people got ripped off.

FSD is just a bet that so far hasn’t paid off at all...
 
Yeah, at the end of the day this system will be judged by its worst user, and we all know there are some truly terrible users out there. I have plenty of concern about semi-autonomous modes on surface streets, where reaction time and takeover time need to be significantly quicker. But we haven't seen the system behaving yet, so we can't really say whether it will work or not.
Uber's self-driving permits were suspended in Arizona and California after the fatal accident in Arizona. That accident was entirely avoidable if the trained test driver had been paying attention. I don't have all that much confidence that similar accidents won't occur with untrained Tesla drivers.
 
Is it really that simple though?

Don't get me wrong, I've worked in the computer industry for a long, long time. I'm skeptical actual full autonomous driving will ever happen at a level where it'll be acceptably safe to expose the public to. But generally speaking, identifying the traffic control device nearest to you (that picture had no depth element) and nearest to the direction you plan to travel (left-most light for left turn) is good enough for 90% of use cases.

Complex situations will demand takeover just like EAP does now. Again, fully autonomous? No. Driver assistance on surface streets, maybe.
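The "pick the nearest light, left-most for a left turn" heuristic described above can be sketched in a few lines. Everything here is a hypothetical illustration — the types, fields, and the 10 m same-intersection threshold are invented for the example, and this is not how Tesla's system actually works:

```python
from dataclasses import dataclass

@dataclass
class TrafficLight:
    distance_m: float        # estimated distance from the vehicle
    lateral_offset_m: float  # negative = left of lane center, positive = right
    state: str               # "red", "yellow", or "green"

def relevant_light(lights, maneuver="straight"):
    """Pick the detected light that most plausibly governs our maneuver."""
    if not lights:
        return None
    if maneuver == "left":
        # Prefer the left-most head at roughly the nearest intersection.
        nearest = min(l.distance_m for l in lights)
        same_intersection = [l for l in lights if l.distance_m < nearest + 10.0]
        return min(same_intersection, key=lambda l: l.lateral_offset_m)
    # Default: the nearest light ahead.
    return min(lights, key=lambda l: l.distance_m)

lights = [
    TrafficLight(41.0, -3.0, "red"),   # left-turn arrow at this intersection
    TrafficLight(40.0, 0.5, "green"),  # through light at this intersection
    TrafficLight(120.0, 0.0, "red"),   # light at the next intersection
]
print(relevant_light(lights, "left").state)      # "red"  (the left-turn arrow)
print(relevant_light(lights, "straight").state)  # "green" (the nearest through light)
```

Even this toy version hints at where it gets hard: the real problem needs depth estimation, lane association, and occlusion handling, which is exactly why the picture above is not "simple."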

Great... let’s see though. So far, FSD has brought nothing... and some people bought it over 2 years ago, some on leases... some people got ripped off.

FSD is just a bet that so far hasn’t paid off at all...

There haven't been FSD features yet, so there was nothing to release. Buyers knew that going into the purchase. It was a gamble to lock in pricing, which we all knew. Anybody buying FSD on a lease before this year was a fool, because at best they could get 2 years of FSD for the full expense, which is very low value.

Uber's self-driving permits were suspended in Arizona and California after the fatal accident in Arizona. That accident was entirely avoidable if the trained test driver had been paying attention. I don't have all that much confidence that similar accidents won't occur with untrained Tesla drivers.

Uber's fatal accident was caused by Uber intentionally disabling their safety systems that prevented this exact type of collision. They severely F'ed up, and not nearly enough people are aware of what they did. I totally agree the operator should have been paying much more attention, no question, and that's really what I'm getting at when I say takeover times need to be much faster. And humans are absolutely terrible at taking over when they don't have full context of the emergency, which is why airplanes full of people crash due to pilot error.
 
Uber's fatal accident was caused by Uber intentionally disabling their safety systems that prevented this exact type of collision. They severely F'ed up, and not nearly enough people are aware of what they did. I totally agree the operator should have been paying much more attention, no question, and that's really what I'm getting at when I say takeover times need to be much faster. And humans are absolutely terrible at taking over when they don't have full context of the emergency, which is why airplanes full of people crash due to pilot error.
I'm having trouble understanding how a similar accident caused by a Tesla driver using "Automatically driving on city streets" mode would be perceived any differently by the public and regulators.
 
I’d buy the discounted FSD right now just for Hardware 3 alone. We’ve all seen how many software-update features are limited on Hardware 2.5 or require newer hardware to fully function. I’d hate for most updates a year down the line to require Hardware 3, leaving me with half-assed updates.
 
I'm having trouble understanding how a similar accident caused by a Tesla driver using "Automatically driving on city streets" mode would be perceived any differently by the public and regulators.

It wouldn't be. I'm suggesting it wouldn't be likely to happen, since the only way it happened in that Uber was for them to intentionally disable the safety systems.
 
The only difference between EAP and FSD is the maturity of the software and maybe HW3. Musk also says FSD is possible on HW 2.x, but the software is more difficult. All the functions of FSD can be used anywhere now, without any regulatory approval, so long as there is an engaged driver. Driving with an unengaged or absent driver depends on many verification miles and regulatory approval, just like Musk says. Tesla will get there quicker because of the hundreds of thousands of cars validating it. All FSD features will be here soon, as Musk says. Use of FSD for Level 3 and above will come when it comes. Why is this so hard to believe? The present quirks of EAP are no indication of how far the system has advanced inside the company. Whether FSD or full autonomy can be achieved without lidar, that's the big question. Musk is a software guy who also lands rockets on "Of Course I Still Love You" and "Just Read The Instructions" drone ships. If you want to bet against him, OK, not me.
 
The only difference between EAP and FSD is the maturity of the software and maybe HW3. Musk also says FSD is possible on HW 2.x, but the software is more difficult. All the functions of FSD can be used anywhere now, without any regulatory approval, so long as there is an engaged driver. Driving with an unengaged or absent driver depends on many verification miles and regulatory approval, just like Musk says. Tesla will get there quicker because of the hundreds of thousands of cars validating it. All FSD features will be here soon, as Musk says. Use of FSD for Level 3 and above will come when it comes. Why is this so hard to believe? The present quirks of EAP are no indication of how far the system has advanced inside the company. Whether FSD or full autonomy can be achieved without lidar, that's the big question. Musk is a software guy who also lands rockets on "Of Course I Still Love You" and "Just Read The Instructions" drone ships. If you want to bet against him, OK, not me.
What's hard to believe is that regulators will not quickly ban "Automatically driving on city streets" mode after an accident like the Uber one.
Arguably an accident like the Uber one will become more likely as the technology improves. The initial version will probably be so terrifyingly bad that the driver will be in a constant state of abject terror unable to avert their eyes from the road. Haha.
 
Uber intentionally disabling their safety systems that prevented this exact type of collision

I thought they just disabled Volvo’s Mobileye system? Admittedly I haven’t been following the story. But anyway, that would be a non-factor in the crash since presumably they were replacing it with their own system? Maybe you were referring to another safety system (for driver attention?) that they disabled.

The initial version will probably be so terrifyingly bad that the driver will be in a constant state of abject terror unable to avert their eyes from the road. Haha.

This state of affairs did not seem to be stopping some users of autopilot from averting their eyes from the road. That’s my concern (and I think yours).

Elon says it is all clear:

“I think where we're very clear with the, you know, when you buy the car, what it's meant by full self driving: it means it's feature complete, but feature complete requiring supervision...”

“So we're just very quickly... there's really three steps: this being feature complete [] self driving but requiring supervision, feature complete but not requiring supervision, and feature complete not requiring supervision and regulators agree."

So it seems like it will be just fine to have a feature-complete system requiring supervision, since they’re perfectly clear about the rules and limitations. What could go wrong? It’s better than some human drivers, so that means it will be awesome. More reliable than some human drivers, just make sure you supervise it until we get our billion miles!

Just thinking about traffic light detection and how that could possibly be safer than humans... I can’t wrap my head around it. Admittedly I have accidentally sailed through a red light late at night when mentally exhausted, once in my life, but even with that incident I can’t imagine a system (with today’s state of the art) that would have done a better job than me over my lifetime. It’ll be interesting to see how it manages traffic lights that are off (and not blinking). I’ve probably encountered that situation 50-100 times in my lifetime, with a 100% success rate. FSD will have the GPS data, of course. Should be a fun test! I certainly agree the system will do better than human drivers some of the time, but no one cares about that statistic! It’s only when the system does worse than humans that it matters in the real world (even if using that metric in isolation means a worse overall result).
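That last point — a system can beat humans on average while still failing cases humans never fail — is easy to show with arithmetic. Every rate below is a made-up assumption purely for illustration, not a real statistic:

```python
# Purely illustrative arithmetic: a system with a lower *overall* crash rate
# than humans can still crash more often in "easy" scenarios that humans
# almost never get wrong -- and those are the crashes people notice.

human_crashes_per_million_miles = 2.0    # assumed human baseline (made up)
system_crashes_per_million_miles = 1.0   # assumed system overall rate (made up)

# Suppose 10% of the system's crashes happen in easy scenarios
# (e.g. a plainly visible red light) that humans fail only 0.1% of the time.
easy_scenario_share = 0.10
system_easy_crashes = system_crashes_per_million_miles * easy_scenario_share
human_easy_crashes = human_crashes_per_million_miles * 0.001

print(system_crashes_per_million_miles < human_crashes_per_million_miles)  # True: better overall
print(system_easy_crashes > human_easy_crashes)  # True: far worse exactly where humans are reliable
```

Under these assumed numbers the system halves total crashes yet is fifty times worse at the "never run an obvious red light" task — which is the metric the public and regulators will actually judge it on.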
 