Welcome to Tesla Motors Club

FSD on city streets coming later this year with reliability far in excess of human drivers!

Really? :)

  • Yes: 37 votes (18.7%)
  • No: 161 votes (81.3%)
  • Total voters: 198
But anyway, that would be a non-factor in the crash, since presumably they were replacing it with their own system? Maybe you were referring to another safety system (for driver attention?) they disabled.

You'd hope I was, right? But no. They disabled their own software's collision avoidance because it was too timid in real-world situations and kept just stopping on the road. So, they shut it off and hit a pedestrian. The onboard Uber system detected the object in the road with more than enough room to stop, it confidently identified the object as a person with something like 10-15 seconds of lead time, and the operator saw the person a couple seconds before impact.

You can read the whole ridiculous situation in the NTSB preliminary report: Preliminary Report Released for Crash Involving Pedestrian, Uber Technologies, Inc., Test Vehicle

According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.

I can’t imagine a system (with today’s current state of the art) that would have done a better job than me over my lifetime.

Yeah, but it's possible, and please don't let this go to your head :D, that you're not the average driver. The NHTSA crashstats report about intersection collisions is somewhat interesting, but I think most of us already know the results. https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/811366

People misjudge the closing speed of other cars, they don't see other cars, they don't notice cars in front of them are stopped while they're looking away, etc. So traffic control device identification, combined with the object detection and ranging that the car already does fairly well, could likely significantly reduce several categories of these collisions. We don't know how it will work yet, so that's all speculation, but this is the Internet. We're here to give the hottest of takes.
 
Approximately 2.98 trillion miles were driven in 2008 and there were 2.3 million accidents at intersections according to your link. So that's an intersection accident every 1.3 million miles. I'm sure Tesla will be at least that good by the end of the year, right?
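A quick sanity check on that arithmetic (numbers taken straight from the post, not re-verified against the NHTSA report):

```python
# Intersection-accident rate implied by the figures quoted above (2008 data as cited)
miles_driven = 2.98e12           # total US vehicle miles driven
intersection_accidents = 2.3e6   # intersection accidents

miles_per_accident = miles_driven / intersection_accidents
print(round(miles_per_accident / 1e6, 2))  # → 1.3 (million miles per intersection accident)
```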
 

Only one way to find out. Put a bunch of inattentive yahoos behind the wheel and tell them the car drives itself. See what happens.
 
Well, I hope they do it in another state, haha.
Live free or die?

I hope they do too, I think.

I just feel like Tesla is marching towards some inevitable and awful “bad press” incidents (and worse, probably someone will die). I think it is great that they are trying to develop these features, and there must be some way to implement them in a way that actually makes driving safer and easier. Just not sure whether the way they are approaching it is the best. I don’t have any suggestions.

Also I am not completely sure that true full self-driving is possible, at least within the next 5-10 years. I just feel like 70% of risky situations which can be mitigated on the road involve accurate prediction of aberrant human behavior before anything actually happens (aka anticipatory defensive driving), and not sure computers will be up to the task on that front for a while.

Personally, I just want something that reliably stops me from running into something I didn’t notice (and maybe attempts to stop me from running red lights (and similar actions) though not sure how that would work - and it’s really not a problem I have), but generally operates in the background. I’m cool with generally doing all the driving. Just make sure I don’t drift off the road or run into someone in front of me, or sideswipe someone, and prepare me for an imminent rear end collision. And take decisive evasive action in response to things I don’t notice. Maybe?

For the freeway stuff the NoA needs some minor work, but generally is pretty good for my purposes already, as long as I don’t mind piling into road debris and animals and pedestrians on the freeway and such. I just have to be vigilant and drive defensively. So that’s good, I guess. Seems like with some minor tweaks to not make it drive like an idiot in Autopilot, all would be mostly well on that front.
 
So, they shut it off and hit a pedestrian.

I see.
Shocking. The system noticed 6 seconds before. The most amazing thing is that, in addition to the emergency maneuvers being deactivated, the system was not designed to alert the driver about detected obstacles. Maybe also related to it being too timid and “crying wolf” too much?

I think Uber’s approach in this incident may be worse than Tesla’s approach. :) I guess we will see.
 
The Uber fatality was almost a suicide: a pedestrian walking a bike across an unlit thoroughfare at night. There are over 100 people killed in automobiles in the US every day. Musk's view is that any technology that reduces that should be implemented even if it occasionally causes a fatality. I agree. Auto deaths are not old people in hospice, nursing homes, hospitals, etc.; they are younger people, many in their prime, whose lives are instantly snuffed out with catastrophic consequences for anybody connected with them. The view that driving assists should not be used/permitted until they are 100% safe is immoral.

People are occasionally killed in Tesla accidents. We hear about every one. The NTSB is prodded to investigate each one like a 150-fatality airliner crash.

Always remember, Tesla has the most powerful enemies any company has ever faced: the fossil fuel industry, the legacy car manufacturers, and the dealers. There are dozens of lesser ones like brake pad manufacturers, quick oil change shops, the collision repair industry, and all the myriad businesses that flourish in the shadow of our road carnage. How have we ever come to accept 100 violent deaths a day from our pride and joy transportation machines?
 
The Uber fatality was almost a suicide: a pedestrian walking a bike across an unlit thoroughfare at night.
I suggest you read the NTSB report or Wikipedia page. Or watch a video of the area from a decent camera unlike the extremely misleading one that Uber released.
The view that driving assists should not be used/permitted until they are 100% safe is immoral.
Then Tesla should focus on driving assists that prevent collisions. Every other company is focusing on collision avoidance for their production cars. There are plenty of ways that Tesla hardware could be used to avoid accidents. Right now it won't even stop you from running into the side of a semi truck. I don't believe that implementing a "fake" self driving system will make us safer.
 
What's hard to believe is that regulators would not quickly ban "Automatically driving on city streets" mode after an accident like the Uber one.

Regulators are not going to ban FSD because there is a fatal accident!

There are going to be fatal accidents whether humans or computers are driving the cars (or a mix of both).

The relevant statistic that regulators will be looking at is which kills fewer people, which maims fewer people, and which causes less property damage. And as @tomc603 already pointed out, the bar is set really low. :cool:
 
The relevant statistic that regulators will be looking at is which kills fewer people

I mostly agree that should be the metric, but not sure whether that's what the regulators will be doing. Politically and emotionally, no one will really care about situations where the car does better than humans (and saves lives). They're only going to care about when the car fails in some obvious way where a human never would. Regulators might have their hand forced.

I'm also fairly sure that in its current and proposed (for the next year or two) form, left unattended, the system will be substantially more dangerous than human drivers. When tended to by an average driver, it's hard to say whether it will be more or less dangerous. When diligently attended by a driver expecting the system to fail, it *might* be safer than a human alone. But not sure.
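One toy way to frame the "diligently attended" case: if (and this is a big, purely illustrative if) the system and the supervising driver failed independently, supervision would multiply the odds. The failure rates below are invented for the example, not taken from any data:

```python
# Illustrative independence model -- these are NOT real failure rates.
p_system_misses = 0.01   # hypothetical: system fails to handle a situation
p_driver_misses = 0.10   # hypothetical: supervising driver fails to catch it in time

p_both_miss = p_system_misses * p_driver_misses
print(f"{p_both_miss:.4f}")  # 0.0010 -- an order of magnitude better than the driver alone
```

In practice the two failures are correlated — the driver stops paying attention precisely because the system usually works — so the real number sits somewhere between the two individual rates, which is why "hard to say" is the honest answer.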

But as suggested above, it would probably be safer still if the focus was on designing a system that prevents accidents rather than driving for you. Might be better to market it that way too. Tesla's got plenty going for them; I'm not even sure they need to push their self-driving stuff so hard. It's high risk, high reward, I guess. I can see the attraction of Autopilot in current form, but do a lot of people really want their car to drive itself around surface streets? Anyway, it might be higher reward (and sooner!) to be shipping that Model Y.
 
The relevant statistic that regulators will be looking at is which kills fewer people, which maims fewer people, and which causes less property damage. And as @tomc603 already pointed out, the bar is set really low. :cool:

I agree, but I also don't. I think the reality we need to accept is that robots are inherently scary to people, even if they're significantly better. Regulators tend to overreact given even a minor rumble with their constituency, and in the case of "AI" controlled robot cars I'm not sure I would blame them.

I've been in the computer industry way too long to trust autopilot. But there are a shocking number of people that seem to think EAP is magic and it'll just work no matter how many examples we have of it misbehaving. The saving grace of EAP is that it's human augmentation instead of replacement. That simple fact means Tesla can, though they might choose not to, train their system silently in the background even on cars that don't have EAP enabled for use. Letting the computer predict what its action should be, as it normally would, but the human is completely in control and blindly giving positive or negative reinforcement to the computer's choices. We do know that in cars with EAP enabled, any take-over action flags the event and sends it to Tesla for training, so this isn't too far of a stretch IMO.
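A rough sketch of what that shadow-mode loop could look like — all names and thresholds here are invented for illustration, and nothing below reflects Tesla's actual pipeline:

```python
# Hypothetical "shadow mode": the model predicts, the human actually drives,
# and large disagreements get flagged as training events. Invented example.

def shadow_mode_step(model_steering, human_steering, threshold=0.1):
    """Compare the model's proposed steering angle to what the human did.

    A disagreement above `threshold` is implicit negative feedback worth
    uploading; agreement is implicit positive feedback and is discarded.
    """
    error = abs(model_steering - human_steering)
    if error > threshold:
        return {"predicted": model_steering, "actual": human_steering, "error": error}
    return None

# The human steers much harder than the model would have -> flagged for training
print(shadow_mode_step(model_steering=0.25, human_steering=0.5))
# The two agree -> nothing uploaded
print(shadow_mode_step(model_steering=0.5, human_steering=0.5))  # None
```

The design point is that the human stays completely in control: the model's output is never actuated, only compared, which is what makes running it on every car essentially free of risk.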

At the end of the day, as Elon noted in the ARK Invest interview, people are going to die. The point is to march along and lower the probability of it happening. I don't see full autonomy (level 4 or 5) generally allowed by legislators for 10 years, if I'm being honest. And level 3 autonomy, I'd be legitimately shocked if it's allowed within 5 years. There's just too much development that still needs to happen, testing frameworks need to be created, validation systems need development, legal liability needs to be decided, and so on. And that's assuming we could just flip a switch tomorrow and cars had a system capable of full autonomy.
 
A lot of the automation nay-sayers are going to be shocked at how capable these systems become and how quickly they become more competent than your average driver, because they don't understand how neural nets work and learn. Even after approval for FSD, it will still be safer to actively monitor them rather than go to sleep in the back seat, and I plan to monitor (I don't even like it when someone else is driving the car I'm in). However, I will be relieved when they are in widespread use in other cars, because that will mean they are better than the current status quo of human drivers. That's all I have to say about that right now. Let's revisit this in three years, because I think you will be amazed at how much progress has been made.
 
I don't see full autonomy (level 4 or 5) generally allowed by legislators for 10 years, if I'm being honest. And level 3 autonomy, I'd be legitimately shocked if it's allowed within 5 years.
Level 5 autonomy is already legal in California. Here's the form to register your level 5 vehicle: https://www.dmv.ca.gov/portal/wcm/c...89c5-b2bc7de3fd2c/ol321.pdf?MOD=AJPERES&CVID=
Many companies have permits to test autonomous vehicles in California, including Tesla, though they claim they haven't done any testing here since 2016.
Waymo has a permit to test autonomous vehicles without a human test driver (a remote operator can take over when the system finds a fault) but have not yet begun to do so as far as I have heard.
Audi is deploying a Level 3 system in Europe. Tesla should really be working on a similar system to Audi's traffic jam assist as it would actually be useful to many drivers here.
Regulators are not going to ban FSD because there is a fatal accident!
They pulled Uber's permit to test in Arizona and California after a single fatal accident. I think a similar accident caused by an inattentive driver using "Automatically driving on city streets" mode would get a similar response.
 
This thread title is misleading, and I may be repeating things, but let's look at the verbiage in the feature text next to purchasing FSD; I've added bolding & underlining for emphasis (my emphasis). So we are MANY, MANY years away and part of Tesla's testing group. While maybe worthwhile, we are paying heavy sums to build/prove this out for Tesla and other manufacturers and regulators. Do not expect this stuff is going to be prime-time ready this year or even next:

"The current features require active driver supervision and do not make the vehicle autonomous. The future use of these features without supervision is dependent on achieving reliability far in excess of human drivers as demonstrated by billions of miles of experience, as well as regulatory approval, which may take longer in some jurisdictions. As these self-driving features evolve, your car will be continuously upgraded through over-the-air software updates."

Regulatory approval is huge, and so is billions and billions of miles. There is no metric or stated time for "achieving reliability far in excess of human drivers"; it's a major uphill climb, and this is a good start. But when you spend your money, be aware of the larger process and how you're sort of investing in this build-out and providing free beta testing at your expense. Some people are into that, but as Tesla appeals to a larger consumer base, some of this you may not be used to, as Toyota, BMW, and Audi don't do anything like this (charge customers for early access to pre-regulation features).
 
Audi is deploying a Level 3 system in Europe. Tesla should really be working on a similar system to Audi's traffic jam assist as it would actually be useful to many drivers here.


Remove the "supervision" warnings/checks from EAP and you've got a better one than Audi offers already, let alone with the HW3 improvements.



Audis system-

On divided highways, under 37.3 mph, with busy traffic, you can engage it. (so many fewer places than EAP can be used)

The car will drive for you, hands-off. (which EAP will too if they removed the steering wheel checks)

A camera will ensure you're still awake and responsive though (the Model 3 could also do this if they turned on the interior camera; they just use the steering wheel check instead, so this task is already covered in a different way)

If speeds go above 37.3 mph or the line of vehicles in traffic breaks up, the car will prompt you to take back over (EAP, on the other hand, will keep working fine on said divided highway at higher speeds, and regardless of the amount of traffic).

If you ignore it the car will eventually brake to a stop. (just like EAP does)
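Taking the poster's summary at face value (the 37.3 mph cutoff and the other conditions are as described above, not verified against Audi's documentation), the engagement logic boils down to a simple check:

```python
# Engagement conditions for Audi's system as summarized in the post above.
SPEED_CUTOFF_MPH = 37.3  # ~60 km/h, as quoted

def traffic_jam_system_available(on_divided_highway, speed_mph, traffic_is_dense):
    """True when all three quoted conditions for engagement hold."""
    return on_divided_highway and speed_mph < SPEED_CUTOFF_MPH and traffic_is_dense

print(traffic_jam_system_available(True, 25.0, True))   # True: stop-and-go on a divided highway
print(traffic_jam_system_available(True, 55.0, True))   # False: above the speed cutoff
```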
 
Most Tesla drivers find the AP features very useful already, even in their current state. Major improvements are in the pipeline.

Like I previously stated, I'm the kind of person that will still watch over the automation even after it's approved for FSD. Even when overseeing the AP, the benefits are substantial in terms of less driver fatigue and increased safety. Arriving at your destination feeling fresh and unstrained is priceless after a long drive.
 
This thread title is misleading
Only because Tesla's wording is so vague. Your wording is much more clear! You should be Tesla's copy editor :p I would make one more change: "The current features and those coming later this year require active driver supervision and do not make the vehicle autonomous. The vehicle will not be autonomous for the foreseeable future."
Regulatory approval is huge and so is billions and billions of miles. There is no metric or stated time of "achieving reliability far in excess of human drivers", it's a major uphill and this is a good start, but when you spend your money, be aware of the larger process and how you're sort of investing in this build out and providing free beta testing at your expense, some people are into that, but as Tesla appeals to a larger consumer base, some of this you may not be used to as Toyota, BMW, Audi don't do anything like this (charge customers for early access, pre-regulation features).
It annoys me how much Tesla talks about regulatory approval without saying specifically what they're talking about. It's like saying they aren't releasing the Model Y because of regulatory approval. While it is true that the Model Y is not currently legal, that is not the reason they haven't started selling them. Waymo has regulatory approval to test autonomous vehicles without a test driver; why doesn't Tesla? Because their technology is not yet advanced enough.
 
Audi is deploying a Level 3 system in Europe. Tesla should really be working on a similar system to Audi's traffic jam assist as it would actually be useful to many drivers here.

Haha! LOL! Audi could learn a thing or two from Tesla's EAP, which has been helping Tesla drivers through traffic jams for years! The clue that it's not as advanced as Tesla's AP is that it automatically turns itself off at 38 mph! Tesla's works up to 90 mph!

They pulled Uber's permit to test in Arizona and California after a single fatal accident.

That was the proper response to what I consider manslaughter. There was a problem with their emergency braking system so their engineers turned it off so they could show upper management how good their development work was! Someone should go to jail for that. You don't run an autonomous car with the emergency braking disabled!
 
Remove the "supervision" warnings/checks from EAP and you've got a better one than Audi offers already, let alone with the HW3 improvements.
I guess it's actually "traffic jam pilot" that I'm talking about. They claim it's a Level 3 system and you can read a book, browse the internet, etc. while it's on. So maybe you have to be awake, but you don't have to pay attention. Also, Audi is liable for accidents that occur while the system is operating. That sounds like a feature that many Tesla owners would like! Unfortunately, plenty of people have commutes that are below 37 mph.
 