
MX crash on I-101 2018/03/23 (out of General)

Everyone takes this statement as gospel, but no one has ever questioned the methodology or the data used to reach this conclusion. I think it's far past time we were actually open about the math behind the calculation.

As someone who has actually sat there from on-ramp to off-ramp for about 45 miles on Autopilot, albeit AP1, I've observed its curious behaviors firsthand. Every time an off-ramp comes along, I know the car is going to drift to the right as if it were exiting the freeway, then suddenly detect the split from the white line marking the lane divergence, and jerk left to stay on the freeway instead of getting off. It does this every time, at every off-ramp. No human driver would do this.

It would be very, very easy for Tesla to provide the public with proof of the claim. They have both AP and non-AP cars in service. Is there really a 40% difference in collisions between the two fleets? That comparison takes demographics, the safety of the cars themselves, and the uneven regional distribution of collisions out of the equation.
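To make the point concrete, here is a rough sketch of the kind of comparison Tesla could publish. Every number below is invented purely for illustration (these are not Tesla data), and the Poisson-style confidence interval is just one reasonable way to do the math:

```python
# All numbers below are invented for illustration -- not Tesla data.
# Compares crash rates per million miles for AP vs non-AP fleets and puts
# a simple Poisson-based 95% confidence interval on the rate ratio.
import math

ap_crashes, ap_miles = 250, 320e6      # hypothetical AP-equipped fleet
noap_crashes, noap_miles = 410, 300e6  # hypothetical non-AP fleet

ap_rate = ap_crashes / (ap_miles / 1e6)
noap_rate = noap_crashes / (noap_miles / 1e6)
ratio = ap_rate / noap_rate

# Normal approximation for the log of a ratio of Poisson counts.
se = math.sqrt(1 / ap_crashes + 1 / noap_crashes)
low = math.exp(math.log(ratio) - 1.96 * se)
high = math.exp(math.log(ratio) + 1.96 * se)

print(f"AP fleet:     {ap_rate:.2f} crashes per million miles")
print(f"non-AP fleet: {noap_rate:.2f} crashes per million miles")
print(f"rate ratio:   {ratio:.2f} (95% CI {low:.2f}-{high:.2f})")
```

The point is that with the fleet's actual crash counts and mileage, publishing this is an afternoon of work, not a research program.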
 
I'm not sure what Tesla is trying to say, but according to earlier reports, the driver had complained about autopilot trying to steer him towards that barrier on several previous occasions. If that's true, why would the driver trust autopilot to work properly at the time the crash occurred?

We don't yet know this for sure. His brother has said so, but Tesla has denied it. Whom should we believe?
But we do know that visibility was good (150 m) and that AP was engaged (per Tesla), yet neither AEB nor AP avoided the crash. It is also possible that AP veered into the barrier for some reason.
I-TEAM EXCLUSIVE: Victim who died in Tesla crash had complained about Autopilot
 
Like I said, a big update to autopilot had happened days prior. Maybe he'd used it going to work a few days and thought the update fixed it? I don't know. But again, in the big picture, it's awful. How many drivers are having issues like Walter's? ABC News is airing a video of a guy demonstrating autopilot glitches. People are calling in to the official podcast to complain. It's going to make some people ask "is autopilot safe?"
“Official” podcast? Please enlighten me, I know of no “official” podcast.
 
Tesla has confirmed that Autopilot was engaged during this week's crash. I suspect a red day Monday.
Maybe, but the stock has already been hit pretty hard partly as a result of this accident. I'm sure algobots jumped on the negative headline about this crash but investors who actively sold stock based on this undoubtedly did so assuming autopilot was involved. This latest statement just confirms that. I would guess this story now has much less influence on the stock than model 3 production will. If production is a miss though, this will still further add to the negative sentiment. If production guidance is not a miss, this story probably won't have much effect.

I am worried about the potential for Autopilot to be temporarily halted due to the perception that it is dangerous in the hands of distracted drivers. I use it every day and would be very disappointed if I couldn't use it.

You obviously have to remain vigilant at all times while using Autopilot. I totally understand that, as do most Tesla drivers. However, there are going to be an increasing number of drivers who use the technology and don't fully realize that Autopilot can't be trusted to drive the vehicle without vigilance. Somehow this happened to Walter, who evidently was aware of Autopilot's weaknesses and flaws, yet did not remain vigilant while using the technology in an area he already knew to be unusual and potentially problematic for it.

Statistically, this should not result in Autopilot being temporarily halted. But the agencies investigating may feel compelled to do something to show that they are actively trying to prevent more deaths like this from use of the technology.
 
I'm not sure what Tesla is trying to say, but according to earlier reports, the driver had complained about autopilot trying to steer him towards that barrier on several previous occasions. If that's true, why would the driver trust autopilot to work properly at the time the crash occurred?
The most obvious answer would be distracted driving. Here is a person who knew firsthand about the dangers and weaknesses of Autopilot, yet he still allowed himself to become distracted at the worst possible moment, and it cost him his life.
 
Couple of points:

1. The cause of the crash, as with most crashes, was simple: the driver wasn't paying attention.

2. This isn't a Tesla thing. Many cars are now getting these driver-assist, line-following cruise controls.

3. These systems do work well. I use mine all the time, but I always pay attention. I still "drive".


I was curious about what happened in this case.

I'm pretty sure this is the spot where the crash happened. The barrier is at the lower right.

You can see this is exactly the type of area where, if you are using a line-following assist system, you probably want to disengage, or at the very least pay close attention.

One of the problems I've seen is that in areas like this, these systems cope fine when the lines are well maintained. But based on this image, and given that the lines have probably faded even more since it was taken, it is easy to see bad or missing lines causing the car to follow the wrong ones.
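To illustrate why faded lines matter so much to these systems, here is a minimal sketch of confidence-gated lane following. The structure, names, and thresholds are all invented for illustration; this is not Tesla's code, just the general shape of the failure mode:

```python
# Sketch of confidence-gated lane following. Names and thresholds invented;
# not Tesla's code, just the general shape of the failure mode.

CONFIDENCE_FLOOR = 0.6  # below this, a detected line is too uncertain to trust

def steering_target(left_line, right_line):
    """Each line is (lateral_offset_m, confidence 0..1) from the vision stack.
    Returns the lateral offset to steer toward, or None if nothing is usable."""
    usable = [line for line in (left_line, right_line)
              if line[1] >= CONFIDENCE_FLOOR]
    if len(usable) == 2:
        return (usable[0][0] + usable[1][0]) / 2  # aim for the lane center
    if len(usable) == 1:
        # Only one trusted line: the car follows it, even if it is actually
        # the gore-point marking at an off-ramp split.
        return usable[0][0]
    return None  # no trustworthy lines: should alert and hand back control

# Faded left line at a lane split: the system latches onto the right line only.
print(steering_target((1.8, 0.3), (-1.6, 0.9)))  # -1.6
```

When one line fades below the confidence floor, the system falls back to the single surviving line, and at a gore point that surviving line can belong to the exit rather than the lane.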

[Attached image: IMG_0231.jpg]



What are my take-aways?

1. What I say to myself each time I turn it on: "Pay attention. You are still driving."
2. This isn't a Tesla thing; it's a driver-assist thing that most cars will be getting.
3. Road maintenance is going to become more of an issue in the future. Faded or missing lines just won't cut it anymore.
4. These assist systems, with people paying attention, are still better than no systems and people not paying attention anyway.


I do feel incredibly bad for the driver. I'd love to stand on my high horse and say "I can't believe he wasn't paying attention!!!" but in this day and age, with devices blinking and booping and competing for our attention everywhere, I can't totally blame him. It's not easy navigating in this ultra-connected world.

So yes it was his fault, he was the driver, but I can see how it was just a bad accident.
 
I think it's very important for people to remember that this death happened because the driver was not using Autopilot appropriately. Autopilot improves safety when used in conjunction with an engaged driver. It cannot replace the driver. Perhaps Tesla should rename it to clarify expectations. Something like Autoassist or Wingman.

Let's contrast that with GM's ignition-switch fiasco, where GM clearly had a defective product and hid it from the public for many years. 124 PEOPLE DIED while using the product as it was intended to be used.

Think of the thousands of people who have collectively died because of exposure to VW diesels.

Statistically speaking, an engaged driver is much safer in a Tesla. A disengaged driver is not using the product correctly.

This will pass...
 
Autopilot might be suspended. This is a real possibility. We are at a crossroads. For the first time I think it's a coin flip whether Tesla stays public or goes private. Sounds crazy, but a Chinese company could partner with Elon. Next week will be brutal even if we hit 2,500/week.
I take back this text. I'd call it drunk texting but I wasn't drinking. Stupid texting is more like it.
 
Was this with AP1 or 2.5?
I think the more pertinent question is whether this was the latest software version, which was a significant change from the previous version of EAP and only a few weeks old. I don't recall there being many, if any, AP1 MXs.

When Tesla quotes numbers, they need to be meaningful; Teslas running significantly different code may as well be a different make. Accident rates also need to be compared against similar two-ton cars built to modern crash standards. It's easy to pick numbers that sound great.
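As a toy example of how much the choice of baseline matters (all numbers invented):

```python
# Toy numbers only -- invented to show how the baseline changes the headline.
tesla_rate = 0.8  # hypothetical crashes per million miles for the Tesla fleet

baselines = {
    "all US vehicles, old cars included": 2.0,  # hypothetical
    "comparable modern two-ton cars": 1.0,      # hypothetical
}

for name, base_rate in baselines.items():
    safer = (1 - tesla_rate / base_rate) * 100
    print(f"vs {name}: {safer:.0f}% fewer crashes per mile")
```

Same car, same crash rate; the headline percentage swings from 60% to 20% depending on what you compare against.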
 
Tesla's wording states that the driver's hands were not detected on the wheel for a period of time. Can they really detect a driver's hands on the wheel? Or do you have to wiggle the wheel to dismiss the warning?
Tesla's weasel wording does not say this, but they REALLY REALLY want everyone to interpret it this way.

No torque input was detected for 6 seconds. This remains entirely consistent with the driver's hands being on the wheel at all times and with the driver paying full attention no matter how much Tesla wants it not to be the case.

AP2 drove the car into this barrier. That's what Tesla has been forced to admit.
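For anyone wondering how torque-based detection can miss resting hands, here is a minimal sketch. The threshold and sample rate are made up; this is not Tesla's implementation, just the basic idea:

```python
# Minimal sketch of torque-based "hands on wheel" detection.
# The threshold and sample rate are invented; not Tesla's implementation.

TORQUE_THRESHOLD_NM = 0.3  # minimum steering torque counted as a "hands" signal
SAMPLE_RATE_HZ = 10

def hands_detected(torque_samples_nm):
    """True if any sample shows enough twist on the wheel to register."""
    return any(abs(t) >= TORQUE_THRESHOLD_NM for t in torque_samples_nm)

# Hands resting lightly on the wheel for 6 seconds: near-zero torque throughout.
resting = [0.01] * (6 * SAMPLE_RATE_HZ)
print(hands_detected(resting))  # False -> logged as "hands not detected"

# A brief wiggle crosses the threshold and registers immediately.
wiggle = [0.01] * 58 + [0.5, -0.4]
print(hands_detected(wiggle))   # True
```

A hand resting lightly on the wheel applies essentially zero torque, so "no torque detected for six seconds" and "hands on the wheel for all six seconds" can both be true.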
 
oh-my-god!
Are you saying the driver committed suicide?
Purposely crashed?

There is no other option to your statement. Seriously rude.
Absolutely not what I was stating.

The crash sequence could easily have been that AP2 steered the driver into the crash barrier with a huge correction, leaving no time whatsoever to react. Teslas DO NOT detect hands on the wheel. They detect torque on the wheel.

Tesla wants you to think the driver was distracted and didn't have his hands on the steering wheel to deflect blame away from AP2. They do this by careful word selection and the introduction of irrelevant other facts in their press release.

The cause of this fatality will likely ultimately be found to be multi-factorial and we owe it to the life of the driver to make sure that safety improvements are made so no one else has to lose their precious life in such an accident sequence.
 
Absolutely not what I was stating.

The crash sequence could easily have been that AP2 steered the driver into the crash barrier....

OK. I get your thought process. You are saying the car was in the proper lane, the driver was fully aware with hands on the wheel, and the Model X suddenly jerked extremely hard and hit the barrier before anyone could possibly have reacted. I've never read of that kind of hard-jerk steering happening, but I don't follow the AP stuff that much. That is plausible.

I personally highly doubt it. That scenario will be obvious one way or another when the investigation is complete. If that is the case, then I can see the concern, because I would want Autopilot pulled if it were doing that. That would have to be worse than a TV-show-style tire blowout.
 
oh-my-god!
Are you saying the driver committed suicide?
Purposely crashed?

There is no other option to your statement. Seriously rude.

I would have said suicide was an extremely small possibility prior to the brother's comment. I still don't think suicide is likely; however, if he truly complained about that exact location seven times to his brother, suicide is absolutely a possibility.

My theory is that he spoke to his brother about Autopilot not being consistent or taking exits when it shouldn't... I doubt he mentioned that exact spot. Suicide is doubtful for many reasons, but perhaps most important is that the crash barrier was gone... he drove one of the safest personal vehicles on the road... he showed plenty of future-oriented behavior and was not rumored to have struggled with depression or anything else.

Nope, right now it's looking like an Autopilot mistake combined with an inattentive driver.

It's interesting, too. On one hand, Tesla's press release was very good, calling it a devastating event. On the other hand, I too am starting to wonder about the way these accidents are framed. The distinction between torque on the wheel and hands on the wheel is important. I get several warnings per drive that I need to put my hands on the wheel, and my hands are always on the wheel when this occurs. Sometimes I even get beeped at because I don't notice the warning, despite my hands being on the wheel.
 