
MX crash on US-101 2018/03/23

Why did he continue to use Autopilot at the exact location where he knew it was malfunctioning? I just don't understand. What am I missing? If you know Autopilot isn't working in a particular location, why use it there? Shouldn't you at least be extra vigilant?
To top it off, why set the shortest follow distance and not pay attention???

On an unrelated note, something I've noticed: if you followed the other two deaths, both were situations where a leading car changed out of the lane and neither Tesla AP nor the driver noticed the obstruction in the road before the crash. This seems somewhat close to that scenario; hard to tell at the moment.
 
Tesla's weasel wording does not say this, but they REALLY REALLY want everyone to interpret it this way.

No torque input was detected for six seconds. That remains entirely consistent with the driver's hands being on the wheel at all times and with the driver paying full attention, no matter how much Tesla wants it not to be the case.

AP2 drove the car into this barrier. That's what Tesla has been forced to admit.

If the driver's hands were on the wheel with sufficient grip, AP couldn't torque it hard enough to force the car into the barrier.

Oops.

How do you want to play this, Sherlock?
 
The crash sequence could easily have been that AP2 steered the car into the crash barrier with one huge correction, leaving the driver no time whatsoever to react. Teslas DO NOT detect hands-on-wheel. They detect torque-on-wheel.
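Tesla hasn't published its actual detection logic, so treat this as a guess, but a minimal torque-threshold sketch shows how a light, attentive grip can still log six seconds of "no torque input" (the threshold and timing values below are assumptions):

    # Illustrative only: the threshold and window are invented, since Tesla's
    # real logic is not public. The point is that a torque-based detector
    # infers "hands on" from steering torque, not from grip.
    HANDS_OFF_WINDOW_S = 6.0     # hypothetical window matching the logged 6 s
    TORQUE_THRESHOLD_NM = 0.5    # hypothetical minimum detectable torque

    def hands_detected(torque_samples_nm, dt_s=0.1):
        """True if steering torque was seen within the window at trace end."""
        seconds_without_torque = 0.0
        for torque_nm in torque_samples_nm:
            if abs(torque_nm) >= TORQUE_THRESHOLD_NM:
                seconds_without_torque = 0.0
            else:
                seconds_without_torque += dt_s
        return seconds_without_torque < HANDS_OFF_WINDOW_S

    # A relaxed grip that tracks the wheel applies almost no counter-torque,
    # so 7 s of 0.1 Nm reads as "no hands detected":
    print(hands_detected([0.1] * 70))  # False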

Huge corrections are not possible with sufficient hand placement.

I pass a semi on my right side with AP.

If my hand is on the steering wheel at 3 o'clock, I'm not going to correct into the semi.

Try again.
 
I think it's very important for people to remember that this death happened because the driver was not using Autopilot appropriately. Autopilot improves safety when used in conjunction with an engaged driver; it cannot replace the driver. Perhaps Tesla should rename it to clarify expectations. Something like Autoassist or Wingman.

Let's contrast that with GM's ignition-switch fiasco, where GM clearly had a defective product and hid it from the public for many years. 124 PEOPLE DIED while using the product as it was intended to be used.

Think of the thousands of people who have collectively died because of exposure to VW diesel emissions.

Statistically speaking, an engaged driver is much safer in a Tesla. A disengaged driver is not using the product correctly.

This will pass...
Umm, ok. How do you reconcile your statements with this video from Tesla? Remember, this was back in Oct 2016 when AP was only 0.2.

BTW, GM and VW both paid hefty fines, like billions, IIRC. Are you saying it is time for Tesla to pay up?

[Attachment: screenshot of Tesla's self-driving video claim]


I'm surprised this incident wasn't caught on anyone's dashcam, or that no footage has been uploaded to the internet yet. Hopefully some video recording will surface soon.
 

Hear, hear to someone's dashcam capturing this and getting it posted. Also, did the phone survive? There could be something on the phone.
 
The angle looks to be fairly shallow, so it's hard to say whether it would have been noticeable.

RIP Walter Huang
How the heck does a person hit this barrier on Autopilot if you're paying even a little bit of attention? The lane lines aren't bad. I still don't get it, unless someone does an experiment that gets AP to choose the wrong lane and posts video.
 
If a convenience feature permits you to drive without watching the road, people will use it to drive without watching the road.

They shouldn't, but they will. We all know that driving while watching our phones and goofing off with them is highly dangerous. How often do you see people doing it anyway? It's illegal in many areas and well known to be potentially lethal, and that doesn't stop anyone who wants to do it.

Does anybody WANT to drive to work without watching out the windshield at all? You betcha. When you get high enough up the food chain, you sit in the back of an executive sedan, not the front. It's one reason there will always be pressure for AV tech: rich folk can fire their drivers and save big.
 
As someone who has actually sat there from on-ramp to off-ramp for about 45 miles on Autopilot, albeit AP1, I've observed its curious behaviors firsthand. Every time an off-ramp comes along, I know the car is going to drift to the right as if getting off the freeway, then suddenly detect the split because of the white line that marks the lane split and exit, and then jerk left to stay on the freeway. It does this every time, at every off-ramp. No human driver would do this.
As someone who actually owns an AP2.5 MX and previously rented an AP1 MX for a road trip, I can tell you that AP1 did behave exactly as you describe on my trip, but AP2.5 did not do so until version 2018.10.4, which I received after over 7,000 miles of driving with auto-steer engaged most of the time it was available. Given that, the behavior you complain about is presumably caused by the system looking wider, combined with lane centering. More specifically, prior to 2018.10.4 the right lane marking would disappear in the IC and auto-steer would follow the left marking; now auto-steer centers between the two markings even when one of them is no longer marking the actual lane, and it corrects when the actual lane marking reappears.
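To make that concrete, here is a toy model of the two steering targets; Autopilot's internals aren't public, so the geometry and numbers below are assumptions for illustration only:

    # Toy model of the two behaviors described above; not Autopilot's code.
    # All positions are lateral offsets in meters from the left lane marking.
    LANE_WIDTH_M = 3.7  # typical US freeway lane

    def target_follow_left(left_m):
        """Pre-2018.10.4 style: hold a fixed offset from the left marking."""
        return left_m + LANE_WIDTH_M / 2

    def target_center_between(left_m, right_m):
        """Centering style: split the difference between visible markings."""
        return (left_m + right_m) / 2

    left = 0.0
    # At an exit, the rightmost visible line diverges toward the off-ramp:
    for right in (3.7, 4.5, 5.5):
        print(target_follow_left(left), target_center_between(left, right))
    # 1.85 vs 1.85, 2.25, 2.75: the centering target drifts right with the
    # widening split, then snaps back left when the true lane line reappears.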
 
I think there are two things that Tesla can do and probably should have done by now:

1. Make sure that auto-emergency braking works when an obstacle is in front of the car, regardless of the speed of the obstacle or who is in control of the car. You can't caveat it by saying it won't work if you do x, y, or z. This is table stakes for any software that allows cars to auto-steer. It isn't good enough to say "we told you so in the fine print," or "the system is in beta," or "it's a hard problem with false positives." Tesla claimed in 2016 that the cars have the hardware for full self-driving; detection of stationary obstacles should have been a high priority for the software.

2. If Tesla really believes that Autopilot is in beta and isn't saying that just to limit liability, it should prevent people from taking their hands off the wheel while Autopilot is on. If the actual functionality is lane-assist, have the system behave like lane-assist when setting the human driver's expectations. It practically doesn't matter that it can do 100 things better than standard lane-assist, since the expectation placed on the human driver is the same. So figure out a way to enforce that expectation; one possible escalation scheme is sketched below.
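As a sketch of what "enforce that expectation" might look like, here's one possible escalation scheme. The states and thresholds are invented for illustration; this is not Tesla's actual implementation:

    # One possible enforcement scheme for suggestion 2. The alert states and
    # the escalation thresholds are invented; this is not Tesla's code.
    from enum import Enum

    class Alert(Enum):
        NONE = 0
        VISUAL = 1      # "hold the steering wheel" message in the IC
        AUDIBLE = 2     # escalating chime
        DISENGAGE = 3   # slow the car, lock out auto-steer for the drive

    # Hypothetical thresholds, in seconds without detected hands:
    ESCALATION = [(5.0, Alert.VISUAL), (10.0, Alert.AUDIBLE), (15.0, Alert.DISENGAGE)]

    def alert_level(seconds_hands_off):
        level = Alert.NONE
        for threshold_s, alert in ESCALATION:
            if seconds_hands_off >= threshold_s:
                level = alert
        return level

    print(alert_level(6.0))   # Alert.VISUAL
    print(alert_level(16.0))  # Alert.DISENGAGE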
 
Other parts of the blog post confirm that Tesla cars with AP are saving many lives:

“Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40%. Internal data confirms that recent updates to Autopilot have improved system reliability.

In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.

Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.

No one knows about the accidents that didn’t happen, only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe. There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year. We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars.”

An Update on Last Week’s Accident
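For what it's worth, the arithmetic in the quoted passage is internally consistent; a quick check using only the figures it states:

    # Sanity-checking the blog post's arithmetic from its own figures.
    us_miles_per_fatality = 86e6       # all vehicles, all manufacturers
    tesla_miles_per_fatality = 320e6   # Teslas with Autopilot hardware

    print(tesla_miles_per_fatality / us_miles_per_fatality)  # ~3.72, the "3.7 times"

    worldwide_deaths_per_year = 1.25e6
    deaths_at_tesla_rate = worldwide_deaths_per_year * (
        us_miles_per_fatality / tesla_miles_per_fatality)
    print(worldwide_deaths_per_year - deaths_at_tesla_rate)  # ~914,000, the "about 900,000"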

The problem with this comparison is that the overall US number includes teenagers, the elderly, and drunk drivers. Those people aren't driving $100k Teslas. Tesla drivers tend to be 35-50-year-old men who make $290k per year. To put it in perspective, the Volvo XC90 is a vehicle that often has zero accident deaths in a year. It is all about the demographic. Tesla with 'autopilot' is safe, but it isn't 'several times safer' as it is often portrayed.
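To make the demographic point concrete, here's a toy calculation with made-up numbers (not real crash data) showing how fleet mix alone can produce a large apparent safety gap:

    # Purely illustrative, made-up numbers showing how driver demographics
    # can confound a fleet-wide fatality-rate comparison.

    # Hypothetical fatality rates per 100M miles for each driver group:
    rate = {"teen": 2.5, "elderly": 1.8, "impaired": 6.0, "affluent_middle_aged": 0.4}

    # Hypothetical share of miles driven by each group in each fleet:
    all_cars = {"teen": 0.15, "elderly": 0.15, "impaired": 0.05, "affluent_middle_aged": 0.65}
    teslas   = {"teen": 0.01, "elderly": 0.04, "impaired": 0.01, "affluent_middle_aged": 0.94}

    def fleet_rate(mix):
        return sum(rate[group] * share for group, share in mix.items())

    print(fleet_rate(all_cars) / fleet_rate(teslas))  # ~2.3x "safer"
    # Every group has identical per-mile risk in either fleet; the gap comes
    # entirely from who is driving, not from the cars themselves.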
 