
MX crash on I-101 2018/03/23 (out of General)

There is 0% chance there will be any regulatory action against Tesla. I'm sure NHTSA is extremely impressed that it's been nearly two years since the last autopilot death, and they will continue working with Tesla on how to make autopilot as safe as possible.

If the gore lane had been marked with yellow lines or chevrons as recommended by regulators, this accident wouldn't have happened. Bad lane markings and the fact that the crash attenuator was completely crushed will be found to be the cause of the accident and death.
 
This is why Tesla made it a point to remind its drivers to always have their hands on the wheel when AP is engaged. They have gone out of their way to make the car give warnings and, in some instances, stop the vehicle if it doesn't detect engagement from the driver. This is also why Level 3 autonomy will monitor your eyes; the Tesla is not at Level 5 self-driving yet, so hands on the wheel are imperative.

Agreed. I found the 2017 MS manual here, and on page 82 the warning explicitly states that Autosteer requires drivers to keep their hands on the steering wheel at all times. I believe this would be applicable to Wei Huang's MX as well.

[Attached screenshot: the Autosteer hands-on-wheel warning from page 82 of the 2017 Model S manual]
 
Perhaps AP was following a car rather than the lane lines; it was set to the minimum follow distance, so it may have locked onto the car ahead instead of the lines. That car may have made an unsafe lane change between the HOV lanes, and completed it in a manner the Tesla was unable to follow, leaving the Tesla stranded in the middle of a non-existent lane. Since the driver was apparently not paying attention, the car simply slammed into the barrier ahead, treating the gore area as a (perhaps too-narrow) lane between the lanes.
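
To picture that failure mode, here's a toy sketch in Python. It's pure speculation on my part and not Tesla's actual Autopilot logic: just a controller that steers toward the lane-line centre while line confidence is high, and falls back to tracking the lead car's lateral position when it isn't. If the lead car cuts across the gore point, the follower gets dragged along with it.

```python
# Hypothetical sketch only -- NOT Tesla's Autopilot implementation.
# A lane-keeping controller that falls back to tracking the lead car's
# lateral offset when lane-line confidence drops (e.g. at a gore point).

def steering_target(line_confidence: float,
                    lane_center: float,
                    lead_car_offset: float,
                    threshold: float = 0.5) -> float:
    """Return the lateral position (metres from road centre) to steer toward."""
    if line_confidence >= threshold:
        return lane_center          # normal case: follow the painted lines
    return lead_car_offset          # fallback: follow whatever the lead car does


# Simulate approaching a gore point where the lines diverge and fade.
lane_center = 0.0                   # our lane keeps going straight
for t in range(6):                  # six one-second steps
    line_confidence = max(0.0, 0.9 - 0.2 * t)   # lines fade as the gore widens
    lead_car_offset = 0.8 * t       # lead car drifts right across the gore
    target = steering_target(line_confidence, lane_center, lead_car_offset)
    print(f"t={t}s  confidence={line_confidence:.1f}  target={target:+.1f} m")

# Once confidence drops below the threshold, the target jumps to the lead
# car's path -- straight into the area between the diverging lanes.
```

The only point of the toy is that a fallback which trusts the lead vehicle is only as safe as that vehicle's driving.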
 
An Update on Last Week’s Accident

[snip]

Autopilot had navigated this stretch of road tens of thousands of times, so the failure this time must have been due to some unusual circumstance, and I wonder what that might have been.

The stretch of road had changed when the attenuator was crushed by a Prius a week or more before. I'm sure other Teslas passed that spot in the ten days or so that this was the case, but perhaps this was the first time both AP and the driver failed to realize what was ahead.
 
I agree. As much as I appreciate Tesla, I feel this is a play on words. I would have been a lot more reassured if the blog post had said 'AP was showing warnings moments prior to the collision,' but instead it says AP warnings were shown 'earlier in the drive.'

If AP was not actually showing visual or audible warnings moments prior to the collision, then I'm not sure why providing this 'earlier in the drive' information is relevant to the accident. AP could have shown a warning 5 km before the accident spot, but that doesn't help Tesla's case. Likewise, saying the driver had five seconds with no hands on the wheel may not help if AP never prompted the driver to put their hands on the wheel during those five seconds. Just my thoughts after reading that update.

Edit: However, if the intent of the Tesla post was to convey that AP warnings were shown and that the driver didn't respond to them for the full five seconds before the accident, then this is a non-issue for the company. If that is the case, the post should be rewritten to make it clear.

I agree that Tesla's word choice here is strange. I see two distinct ways of reading it. Were those six warnings immediately before the crash? Or minutes earlier? I'm admittedly biased toward the latter, but I can see the former too. I hope they clarify, because we shouldn't have to wonder what "earlier in the drive" means.

On the whole "the driver should have been driving" bull case, I get it. I acknowledge that Tesla has been hyper-communicative in stressing to drivers that they need drivers to still be alert with autopilot on. I feel bad for the victim, but I also think Tesla does a pretty good job at setting expectations for users.

The problem is that these events do impact public perception (push notification from the Wall Street Journal). In the long run, that's damaging to the industry. In the short run, it serves as a reminder to shareholders that "autopilot" does not mean "autonomous."

The optics are extremely bad. The victim had complained about autopilot malfunctions at that exact barrier. A big autopilot update had been pushed a few days prior. A guy called in to the official Tesla podcast and complained about a very similar issue with autopilot. The pictures of the crash aftermath are horrifying. All of this in the shadow of Elaine's passing.

Meanwhile, Google and GM have been aggressively pushing LIDAR as a standard. They've both had coordinated media pushes that are trying to brand LIDAR as a safe option. Both have been making moves to get level 4 autonomous vehicles on the road before 2020. They are going to fight hard for the perception that those spinning things on the top of a car are for safety.

I don't think there need to be legal, civil, or regulatory consequences for this to be damaging to TSLA.
 
Agree with you on this. But I think Tesla knew quite quickly the data they are reporting now. They seem to have reams of info on all the "safe" miles logged on this stretch. Friday, after close of business, is their preferred time for bad news.

Maybe Tesla was waiting to analyze the control module that was shown being recovered by CHP investigators yesterday.

The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

Not clear what Tesla is trying to say. Are they suggesting the driver should have taken control because there was a barrier ahead and visibility was clear for 150 m? Last week, Tesla said thousands of cars traveled the same route on Autopilot. Now it's trying to paint the driver negatively for getting nag warnings in the past? Six seconds is well within the nag limit, so what's the issue? How would the driver know that AP would not handle that situation? If drivers have to be 200% alert to take control at every instant, then what's the point?
 
I agree that Tesla's word choice here is strange. I see two distinct ways of reading it. Were those six warnings immediately before the crash? Or minutes earlier? I'm admittedly biased toward the latter, but I can see the former too. I hope they clarify, because we shouldn't have to wonder what "earlier in the drive" means.

Tesla made it pretty clear in their report:

The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider.

My belief is that drivers become negligent as Tesla's AP software gets better. Every millisecond counts when you drive; you can't react fast enough when your hands are off the wheel.
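
For scale, Tesla's own "150 meters and five seconds" figure implies roughly freeway speed, and an ordinary perception-reaction time eats a big chunk of that window. Rough arithmetic (the 1.5 s reaction time is a commonly cited textbook value I'm assuming here, not something from Tesla's report):

```python
# Back-of-the-envelope arithmetic for the figures in Tesla's statement.
distance_m = 150.0                 # unobstructed view of the barrier
time_s = 5.0                       # time available before impact

speed_ms = distance_m / time_s     # 30 m/s
speed_kmh = speed_ms * 3.6         # 108 km/h
speed_mph = speed_ms * 2.23694     # ~67 mph

reaction_s = 1.5                   # assumed perception-reaction time
reaction_m = speed_ms * reaction_s # distance covered before any input is made

print(f"Implied speed: {speed_ms:.0f} m/s = {speed_kmh:.0f} km/h = {speed_mph:.0f} mph")
print(f"Distance used by a {reaction_s} s reaction: {reaction_m:.0f} m "
      f"({reaction_m / distance_m:.0%} of the available 150 m)")
```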
 
The optics are extremely bad. The victim had complained about autopilot malfunctions at that exact barrier. A big autopilot update had been pushed a few days prior. A guy called in to the official Tesla podcast and complained about a very similar issue with autopilot. All of this in the shadow of Elaine's passing.

Why did he continue to use autopilot at the exact location he knew it was malfunctioning? I just don't understand. What am I missing? If you know autopilot isn't working in a particular location, then why use it at that location? Shouldn't you at least be extra vigilant?
 
We all know (fear) how the media will play this one out. Sad.

Some late-night thoughts before I go to sleep. The accident was already rumoured to be caused by Autopilot, and this confirms those rumours. My understanding is that some of the damage to the share price was already done. I see it dropping further, but recalls happen and accidents happen. This is the nature of the auto business.

Now the more important point. We know Autopilot can't be perfect; a fatal accident was going to come sooner or later. A serious discussion needs to be had about choosing to use this technology to decrease accident and death rates. Hopefully people realize that is what this software does.
 
Why did he continue to use autopilot at the exact location he knew it was malfunctioning? I just don't understand. What am I missing? If you know autopilot isn't working in a particular location, then why use it at that location? Shouldn't you at least be extra vigilant?

Especially when AP is giving you a warning to be extra vigilant: "hands on steering wheel." Literally, the interior lights flash, a bell-like "ding" sounds, and reminders appear on the gauge screen. I always have at least one hand on the wheel when operating AP, no exception.
 
Maybe Tesla was waiting to analyze the control module that was shown being recovered by CHP investigators yesterday.

Not clear what Tesla is trying to say. Are they suggesting the driver should have taken control because there was a barrier ahead and visibility was clear for 150 m? Last week, Tesla said thousands of cars traveled the same route on Autopilot. Now it's trying to paint the driver negatively for getting nag warnings in the past? Six seconds is well within the nag limit, so what's the issue? How would the driver know that AP would not handle that situation? If drivers have to be 200% alert to take control at every instant, then what's the point?
I'm not sure what Tesla is trying to say, but according to earlier reports, the driver had complained about autopilot trying to steer him towards that barrier on several previous occasions. If that's true, why would the driver trust autopilot to work properly at the time the crash occurred?
 
Some late-night thoughts before I go to sleep. The accident was already rumoured to be caused by Autopilot, and this confirms those rumours. My understanding is that some of the damage to the share price was already done. I see it dropping further, but recalls happen and accidents happen. This is the nature of the auto business.

Now the more important point. We know Autopilot can't be perfect; a fatal accident was going to come sooner or later. A serious discussion needs to be had about choosing to use this technology to decrease accident and death rates. Hopefully people realize that is what this software does.

The software is 1.4x safer than human drivers.
 
Why did he continue to use autopilot at the exact location he knew it was malfunctioning? I just don't understand. What am I missing? If you know autopilot isn't working in a particular location, then why use it at that location? Shouldn't you at least be extra vigilant?

You're probably thinking of a grumpy customer who regretted buying a Tesla and complained about every bug or issue. I think instead he was a Tesla fan who really believed in the Tesla mission, thought he had identified a bug, and passed it along to Tesla. He obviously trusted the car enough to get complacent.
 
I'm not sure what Tesla is trying to say, but according to earlier reports, the driver had complained about autopilot trying to steer him towards that barrier on several previous occasions. If that's true, why would the driver trust autopilot to work properly at the time the crash occurred?

Maybe he was distracted and didn't realize he had gotten to the section of road that he had to disable it for...
 
The software is 1.4x safer than human drivers.
Everyone takes this statement as gospel, but no one has ever questioned the methodology or data used to reach this conclusion. I think it's far past time we were actually open about the math used to get to this figure.
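
To make the methodology question concrete, here's how an "X times safer" multiplier is usually computed from miles per fatality. The numbers below are placeholders I made up, not Tesla's or NHTSA's figures; the point is that the headline ratio depends entirely on what goes into each denominator (Autopilot miles are mostly divided highway, while the human baseline covers every road, vehicle age, and driver).

```python
# Illustrative only -- placeholder numbers, not Tesla's or NHTSA's data.
# An "X times safer" headline is usually just a ratio of miles per fatality.

ap_miles = 1.0e9          # hypothetical miles driven with the system engaged
ap_fatalities = 3         # hypothetical fatalities in those miles

human_miles = 3.0e12      # hypothetical total vehicle miles travelled
human_fatalities = 35_000 # hypothetical fatalities in those miles

ap_rate = ap_miles / ap_fatalities          # miles per fatality with the system
human_rate = human_miles / human_fatalities # miles per fatality overall

print(f"System:   one fatality per {ap_rate:,.0f} miles")
print(f"Baseline: one fatality per {human_rate:,.0f} miles")
print(f"Headline multiplier: {ap_rate / human_rate:.1f}x")

# The multiplier says nothing about whether the comparison is apples-to-apples:
# limited-access highways, newer cars, and attentive supervising drivers all
# lower the fatality rate before any driver-assist software enters the picture.
```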

As someone who has actually sat there from on-ramp to off-ramp for about 45 miles on Autopilot, albeit AP1, I've observed its curious behaviors firsthand. Every time an off-ramp comes along, I know the car is going to drift to the right as if taking the exit, then suddenly detect the split because of the white line marking the exit from the freeway, and jerk left to stay on the freeway. It does this every time, at every off-ramp. No human driver would do this.
 
Why did he continue to use autopilot at the exact location he knew it was malfunctioning? I just don't understand.
Like I said, a big update to autopilot had happened days prior. Maybe he'd used it going to work a few days and thought the update fixed it? I don't know. But again, in the big picture, it's awful. How many drivers are having issues like Walter's? ABC News is airing a video of a guy demonstrating autopilot glitches. People are calling in to the official podcast to complain. It's going to make some people ask "is autopilot safe?"
 