
Does Autopilot present a looming apocalypse for Tesla?

Data about 90 mph stopping distances vary wildly. The link below puts braking alone on a dry road at 123 m, and with 1 second of thinking distance it calculates 150 m exactly. Stopping in time on a wet road would not happen in those conditions, I think.

Tires make a huge difference (they are close to the only factor). The 125 ft MXM4 stopping distance (60–0 mph; braking distance scales to about 2.25x that from 90 mph) has been measured here.

A wet road would definitely be worse.
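
For anyone checking the arithmetic, braking distance scales with the square of speed, which is where the 2.25x factor above comes from. A minimal sketch in Python, using only the 125 ft MXM4 figure quoted above (nothing here is Tesla data beyond that number):

```python
# Scale a measured 60-0 mph braking distance to another speed.
# Braking distance goes as v^2, so 90 mph needs (90/60)^2 = 2.25x the 60 mph distance.
FT_TO_M = 0.3048

def braking_distance_ft(speed_mph, measured_60_0_ft=125.0):
    return measured_60_0_ft * (speed_mph / 60.0) ** 2

d90 = braking_distance_ft(90)
print(f"90-0 mph braking distance: {d90:.0f} ft ({d90 * FT_TO_M:.0f} m)")
```

That comes out to roughly 281 ft (~86 m) of braking alone, noticeably shorter than the 123 m from the linked calculator, which presumably assumes a lower friction coefficient than the measured MXM4 stop.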
 
Tesla is already facing several lawsuits regarding those killed by AP running into stationary objects at highway speeds, and yes, as the fleet size increases these incidents will in all likelihood rise proportionately.



We don't know this at all and the first one to succeed will kick loose either a recall or an avalanche of further cases.
If Tesla loses a couple big cases then it's over for FSD or driver assist for EVERYONE. If Tesla is held liable because an idiot was sleeping and hit a tractor trailer, then there is no way that Cadillac or Waymo will keep offering anything similar.

In Germany 90 would be considered an annoyingly understated speed for the 3rd lane on an Autobahn, where 125 mph (200 km/h) is more like the going rate.

These days owners are testing NoA there with v.2019.8.3 and I gather it is a pretty hairy experience, as I expected.
Why the heck would you want to be on autopilot at 125?
 
so designing a test that a human could just barely pass on a dry road and would fail on a wet road, then expecting the computer to beat it, is a bit of a stretch.

The test would obviously have to be one that the given AV *can* pass in the conditions pertaining, not a Kobayashi Maru. Though if APE "thinking time" is 0.1 s or less, it should be expected to perform better than the average human, who needs about 1 s for that.
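
To put numbers on that thinking-time difference: at 90 mph the car covers roughly 40 m every second, so trimming reaction time from 1 s to 0.1 s is worth about 36 m of stopping distance before the brakes even bite. A quick sketch (the 0.1 s figure is the hypothetical APE latency from above, not a measured value):

```python
# Distance covered during the "thinking" (reaction) phase at a given speed.
MPH_TO_MPS = 0.44704  # mph -> metres per second

def thinking_distance_m(speed_mph, reaction_s):
    return speed_mph * MPH_TO_MPS * reaction_s

human = thinking_distance_m(90, 1.0)      # ~40 m for an average human
computer = thinking_distance_m(90, 0.1)   # ~4 m, if the 0.1 s latency holds
print(f"Margin gained at 90 mph: {human - computer:.0f} m")
```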

The presumption that Tesla will allow FSD up to the same 90 mph limit as AP is questionable.

I think it is more likely than not, given NoA is already on general release in Germany at that speed, but we shall see.
 
Tesla is already facing several lawsuits regarding those killed by AP running into stationary objects at highway speeds, and yes, as the fleet size increases these incidents will in all likelihood rise proportionately.



We don't know this at all and the first one to succeed will kick loose either a recall or an avalanche of further cases.

For example, take the case of Walter Huang, whose AP2.5 Model X went into the gore point at Mountain View in March last year. Imagine a municipal worker had been repairing the already-collapsed crash attenuator at that moment and had also been killed; his widow would IMHO have a fairly robust case against Tesla for contributory negligence leading to wrongful death, because:

1. The completely blameless 3rd party victim did not agree to participate in any Beta AP testing.
2. Tesla cannot say what level of attention was paid by the deceased driver, as nothing actually measures that, just net torque on the steering wheel. Huang could have had both hands balanced on the wheel with nothing detected while being temporarily blinded by the low morning sun, i.e. he may have been paying reasonable attention but simply failed to brake due to over-trusting AP to make correct decisions.
3. AP followed the wrong line while accelerating across an area where it is not normally permissible to drive, straight into the GP, with no warning tones.
4. Tesla had 28 months since the first AP fatality to fix this known hazard [running into stopped objects at highway speed] yet did nothing effective to resolve or safely mitigate it.
5. Musk over the same time greatly equivocated in public about the superior abilities of AP/FSD, tending to mislead the technically naïve as to the current capabilities in their vehicles.
6. Lawyers will preferentially pursue the defendant with deeper pockets, in this case Tesla/Musk.
7. If the plaintiff refuses to settle out of court under NDA, then there is a ~60% chance of convincing a jury that in this case Tesla's negligence amounted to a >=30% contribution to the wrongful death.
8. Given the current state of HW2.5 & v9 software, it is only a question of time until an analogous case presents in reality.
About two weeks ago a friend who is not a Tesla owner told me that Autopilot doesn't work. I calmly asked him why he had that idea. He said a friend of his who is partially blind couldn't trust his Tesla Autopilot (Model S) because it couldn't handle work zones! I mean, duh! A partially blind driver? I was speechless and figured it wasn't worth arguing. There will be people out there, mostly new Standard Model 3 owners, who really don't understand or appreciate that you, the driver, should always be in control with (or without) Autopilot.
 
If Tesla loses a couple big cases then it's over for FSD or driver assist for EVERYONE. If Tesla is held liable because an idiot was sleeping and hit a tractor trailer, then there is no way that Cadillac or Waymo will keep offering anything similar.

I don't know that it should affect other OEMs terribly much, as those aiming for >=L3 are in general much more careful with their rollout/advertising and tend to go for a comprehensive sensor package including LiDAR.

Why the heck would you want to be on autopilot at 125?

I certainly wouldn't: as clarified above, AP/NoA only works up to 93 mph in Germany. It is the other drivers, into whose lane it might be changing, who can be closing at fierce speeds.
 
There will be people out there, mostly new Standard Model 3 owners, who really don't understand or appreciate that you, the driver, should always be in control with (or without) Autopilot.

As we move closer to FSD over the next few years, the distinction between it and AP will become quite fluid for many optimists, I should imagine, leading to plentiful smashes.

Tesla could and should do a much better job of actively educating its customers on how not to unnecessarily martyr themselves on the altar of tech, but that might cut into sales, so *cough* t'won't happen under the current CEO *cough*.

Failing that, fora such as this one can be of some help in curing new users of the dangerous illusions they may be labouring under.
 
These are the type of people [along with Mobileye and the German OEM Auto-cartel] who will surely end up setting the AV Driving Test Tesla will struggle to pass in order to gain >=L3 approval:
Ford, GM, and Toyota team up to develop self-driving safety standards

To avoid getting caught on the hop [i.e. regulated out of business], Tesla should therefore join all such initiatives ASAP.
I wouldn't trust any test to evaluate a self-driving system. How could you possibly make a test to prove that a system is safer than a human driver when human drivers get into accidents once every 150k miles? Will the test be a million miles long? How can you possibly get enough different situations in the test to mimic a million miles of driving?
 
I wouldn't trust any test to evaluate a self-driving system. How could you possibly make a test to prove that a system is safer than a human driver when human drivers get into accidents once every 150k miles? Will the test be a million miles long? How can you possibly get enough different situations in the test to mimic a million miles of driving?

Let's say the test is a month long and in 5,000 miles puts the AV through all the worst-case scenarios, under various weather conditions, which billions of miles of driving have identified as the minimum set of capabilities necessary to ensure it performs 10x better than the average human driver.

It would be like the type-approval process for current vehicles: once one car passes, all models with the same sensor suite/software are approved for use up to that SAE level.

This does not mean FSD cannot be validated by billions of on-road miles in customers' hands, as Tesla will do anyhow in order to get it into some shape worth putting forward for the official test.

What is clear is that:
1. Mere statistics under the control of the OEM concerned will never suffice to gain >=L3 approval for its vehicles, as it can by pure coincidence happen that several hazard scenarios did not occur in the fleet during the period covered.
2. Tesla should be inside the tent collaborating in setting such safety standards, rather than outside fixating on an heroic solo effort much more likely to ultimately fail to deliver what we all want to see, a safe and reliable >=L3 AV.
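
To put rough numbers on point 1: using the "one accident per 150k miles" human baseline quoted earlier in the thread, a simple Poisson estimate shows how many accident-free miles a fleet would need before a better-than-human claim holds at 95% confidence. A minimal sketch, assuming only that baseline figure and textbook statistics (this is not any regulator's actual formula):

```python
import math

# With zero observed accidents, a Poisson model gives the classic "rule of
# three"-style bound: miles_needed = -ln(1 - confidence) / target_rate.
HUMAN_RATE = 1 / 150_000  # accidents per mile, figure quoted earlier in the thread

def miles_needed(target_rate, confidence=0.95):
    return -math.log(1.0 - confidence) / target_rate

for factor in (1, 2, 10):  # "as safe as", 2x safer, 10x safer than human
    print(f"{factor:>2}x safer than human: "
          f"{miles_needed(HUMAN_RATE / factor):,.0f} accident-free miles needed")
```

That lands around 4.5 million accident-free miles just for the 10x claim, which is exactly why a 5,000-mile scenario test can only probe known worst cases, and why on-road statistics and a standardised test complement rather than replace each other.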
 
Regarding that 90 mph cutout test: I tend to think the current issue is that the visual system cannot react fast enough to determine that there's a stationary car. We know that the radar defers to the visual system in situations like this. The problem is that there's a lot of imagery to process from all the cameras, and the computing power just isn't good enough for the reaction times needed when traveling at such high speeds.

HW3 essentially makes the visual system far more alert and responsive. So while I can't say whether it can handle the cutout test at 90 mph, it should theoretically be able to handle higher speeds than the current computer can.

I do think the limitation is in the processing of the incoming data, not the AI's decision-making itself.
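
One way to see the latency argument concretely: at 90 mph the car covers about 13 ft per 100 ms, so every extra frame of vision-pipeline delay eats directly into the braking margin. A small sketch, where the latency values are illustrative guesses and not published HW2.5 or HW3 figures:

```python
# Distance travelled while the perception stack is still "thinking".
MPH_TO_FPS = 1.46667  # mph -> feet per second

def distance_during_latency_ft(speed_mph, latency_ms):
    return speed_mph * MPH_TO_FPS * (latency_ms / 1000.0)

for latency_ms in (300, 100, 30):  # illustrative pipeline latencies only
    d = distance_during_latency_ft(90, latency_ms)
    print(f"{latency_ms:>3} ms of processing delay at 90 mph = {d:5.1f} ft travelled")
```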
 
Let's say the test is a month long and in 5,000 miles puts the AV through all the worst-case scenarios, under various weather conditions, which billions of miles of driving have identified as the minimum set of capabilities necessary to ensure it performs 10x better than the average human driver.

It would be like the type-approval process for current vehicles: once one car passes, all models with the same sensor suite/software are approved for use up to that SAE level.

This does not mean FSD cannot be validated by billions of on-road miles in customers' hands, as Tesla will do anyhow in order to get it into some shape worth putting forward for the official test.

What is clear is that:
1. Mere statistics under the control of the OEM concerned will never suffice to gain >=L3 approval for its vehicles, as it can by pure coincidence happen that several hazard scenarios did not occur in the fleet during the period covered.
2. Tesla should be inside the tent collaborating in setting such safety standards, rather than outside fixating on an heroic solo effort much more likely to ultimately fail to deliver what we all want to see, a safe and reliable >=L3 AV.
The problem I see is that much of the difficulty of achieving greater-than-human safety is that the system must predict the behavior of other road users. A complete test would require a huge number of other vehicles, pedestrians, and cyclists.
I still think it makes much more sense to test on public streets as many companies are already doing. I don't think the risk of them cheating is very high at all, as it would quickly become apparent when the system was released to the public. Model 3s drive enough miles in one day to show that the system is safer than a human driver.
I do agree that Tesla should be more involved in setting standards and lobbying governments. Right now it costs $3275 to register an autonomous vehicle in California! I hope they can convince them to drop that before they release next year (haha).
 
The problem I see is that much of the difficulty of achieving greater-than-human safety is that the system must predict the behavior of other road users. A complete test would require a huge number of other vehicles, pedestrians, and cyclists.
I still think it makes much more sense to test on public streets as many companies are already doing. I don't think the risk of them cheating is very high at all, as it would quickly become apparent when the system was released to the public. Model 3s drive enough miles in one day to show that the system is safer than a human driver.
I do agree that Tesla should be more involved in setting standards and lobbying governments. Right now it costs $3275 to register an autonomous vehicle in California! I hope they can convince them to drop that before they release next year (haha).
$3275 to register an autonomous vehicle in California, wow! I think the issue, however, isn't really about true autonomous vehicles (FSD in Tesla's case), or even when or if that will ever happen and how to test for it. The issue at hand is: does Tesla in particular (assuming the less expensive Model 3 makes Tesla the EV volume leader) have a liability because of the use of the term "Autopilot" and new drivers who might take that term more literally than they should?
 
$3275 to register an autonomous vehicle in California, wow! I think the issue, however, isn't really about true autonomous vehicles (FSD in Tesla's case), or even when or if that will ever happen and how to test for it. The issue at hand is: does Tesla in particular (assuming the less expensive Model 3 makes Tesla the EV volume leader) have a liability because of the use of the term "Autopilot" and new drivers who might take that term more literally than they should?
I wouldn't be surprised if there were a successful suit against Tesla. I'm not convinced that new users will be any more negligent than the old users. As I recall, the first person killed in an Autopilot crash was a huge Tesla fan with a YouTube channel and everything. I feel like early adopters are actually more likely to trust technology than the general public.
 
In Germany 90 would be considered an annoyingly understated speed for the 3rd lane on an Autobahn, where 125 mph (200 km/h) is more like the going rate.

These days owners are testing NoA there with v.2019.8.3 and I gather it is a pretty hairy experience, as I expected.
I don't know that it should affect other OEMs terribly much, as those aiming for >=L3 are in general much more careful with their rollout/advertising and tend to go for a comprehensive sensor package including LiDAR.
It's about liability. If the precedent is set that a user who misuses L3/L2 systems is not at fault and the automaker is, then it becomes super risky for anyone to offer those options.
 
It's about liability. If the precedent is set that a user who misuses L3/L2 systems is not at fault and the automaker is, then it becomes super risky for anyone to offer those options.

Waymo is going straight to L4, so drivers will not be a factor [after the testing phase with safety drivers ends], and those moving from L2 to L3, e.g. the Cadillac CT6, use a highly foolproof Driver Attentiveness Monitoring System based on IR facial recognition in their SuperCruise system, so they should be in a position to prove the driver's negligence in any L2 crash. Hence the chances of their being held liable for contributory negligence would IMHO be greatly reduced compared to the Tesla system, based as it is on a bargain-basement sensor suite.
 
Waymo is going straight to L4, so drivers will not be a factor [after the testing phase with safety drivers ends], and those moving from L2 to L3, e.g. the Cadillac CT6, use a highly foolproof Driver Attentiveness Monitoring System based on IR facial recognition in their SuperCruise system, so they should be in a position to prove the driver's negligence in any L2 crash. Hence the chances of their being held liable for contributory negligence would IMHO be greatly reduced compared to the Tesla system, based as it is on a bargain-basement sensor suite.
I think you are putting way too much faith in those systems being superior to Tesla's.
 
The more I use AP, the less I like it.
AP feels like someone newly arrived from abroad driving on a US highway for the first time.
Not smooth at all. It is interesting just to watch, but I would rather drive the car myself.
I don't know about suing Tesla for the name.
Automatic transmission still requires that I shift from P to D or R or N when needed.
I am not going to sue a car company after I crash into a carwash tunnel because the "AUTOMATIC TRANSMISSION" didn't shift to Neutral for me.