
Model X Crash on US-101 (Mountain View, CA)

Are you saying Tesla lied? Yes/No?

Tesla didn’t lie, but they did craft the wording of that post to get the headlines that resulted. Very clever of them really.

How was the fact that he had gotten warnings at some point earlier in the drive (no timeline was given for that part), or that the follow distance was set to 1, relevant to the 6 critical seconds? I get warnings fairly often when my hands are on the wheel because I am holding it too lightly; I think a lot of folks here experience that. And that follow-distance mention: they don't explain it. Was the car following another vehicle? They say he had a clear view for 5 seconds, so the answer appears to be no. So the only reason to mention it is to paint a narrative about the driver.
 
The car is a 100D, so the chance of getting one as a CPO and/or AP1 car is slim. The window is narrow; i.e., AP2 went into production 2–3 months after the 100D did, if memory serves me right.
AP2 was Q4 2016 and the 100D was January 15th, 2017, so that's an impossibility. It's more probable he had the AP2.5 hardware released in Q3 2017.

The statement from Tesla is extremely misleading. I get warned visually and don't see it because my eyes are on the road. When I get an auditory cue, I realize I wasn't putting enough torque on the steering wheel, despite my hand being on it. This explanation is poor.

He very well could have been distracted, but the statement is still misleading.
 
The car is a 100D, so the chance of getting one as a CPO and/or AP1 car is slim. The window is narrow; i.e., AP2 went into production 2–3 months after the 100D did, if memory serves me right.

There are no AP1 100Ds. Vehicles manufactured in October of 2016 were AP2.

How was the fact that he had gotten warnings at some point earlier in the drive, or that the follow distance was set to 1, relevant to the 6 critical seconds? ... And that follow-distance mention: they don't explain it.

I agree with you; I can't find an explanation of how the follow distance would have mattered either.

First impression: this is how I see the best/worst case scenarios for Tesla:

Best Case: The driver fell asleep, or was on his phone.

Worst Case: Walter thought AP was doing fine and counted on it following some line so it wouldn't hit the barrier in front of him. It didn't, and the car went straight instead of left/right.

My "worst case" needs commentary from someone else who knew exactly how the car traveled before the collision.

It's complicated... it's Monday-morning quarterbacking, but AP is not suitable for all scenarios. Messing with strangely defined gore points is one of those scenarios.
 
Tesla's post tonight confirms my feeling all along that Autopilot led Walter's X straight into the barrier. I think Tesla's statement was carefully crafted to reveal the facts while also being intentionally vague about the timing of warnings. I think it's likely that Walter was distracted and never saw it coming and wasn't warned in the seconds leading up to the crash.

I'm not posting here to point blame at Tesla or AP. Walter was a huge fan of Autopilot, but I believe he understood its limitations. Knowing Walter, I think if he could chime in here on this thread he would accept responsibility for this tragic accident.

The best thing that can come out of this now is that we learn something and move forward. I'm still very excited about the future of autonomous driving and I know that Walter would be proud if he were able to contribute to that future in a positive way.

It's remarkable to me that Tesla was able to retrieve the logs from the car. I'm sure that they have a team digging deep into that data already. I think we also have to put pressure on the state agencies that maintain highways for better lane markings and barrier maintenance. I really think this whole thing could have been avoided with some paint.

There's been a lot of confusion in this thread about which lane he was traveling in. Even the I-TEAM investigation got it wrong in tonight's broadcast. I created this image to show my hypothesis of what happened. He would have been traveling in the same lane here as the Street View car. I think it's pretty easy to see how the variance in the quality of the lane lines, plus the cracks/lines on the road, may have caused the wrong lane line to be tracked by the cameras, leading the car straight into the barrier (maybe a little left of center).

tesla_crash_lines.png
 
Definitely a tragic event for his family. Unfortunately, the reality with Autopilot is that you are 100% responsible for watching where the car is going and taking control as needed, end of story. Personally, I prefer the mechanism that Super Cruise uses to gauge the driver's attention. Eye tracking is superior to steering-wheel resistance if the goal is to make sure you're watching the road.
 
Does anyone know if AP2 includes a driver facing camera yet?
It's in the Model 3, but isn't actively used right now.

I imagine that's going to change really quickly.

Right now Tesla is writing narratives based on very thin evidence, if any. With the interior-facing camera it would be much easier for them to see what the driver was doing, just like what we saw in the Uber case.
 
How is giving the impression that the driver was warned right before the crash and didn't put his hands on the wheel not deception and straight-up lying?

Literally, here is the USA Today article: "Tesla said Autopilot's adaptive cruise control was in the minimum following distance setting and that the driver had received several visual and one audible hands-on warning as reminders to keep his hands on the wheel prior to crashing into the center divider."

Here is the Jalopnik headline: "Tesla Says Autopilot Was On Before Fatal Model X Crash, But That Driver Didn't Abide Warnings"

Job complete. Because journalism today is bottom of the barrel, journalists won't even use their brains; they write half-hearted articles, and Tesla's narrative spreads.

Think about it: if I drove on AP for 15 minutes and was alerted to put my hands on the wheel 2 minutes in, it would still fit Tesla's specially crafted statement.

It is rather sickening that, even in light of the discovery that AP drove the X straight into the barrier, Tesla is still trying to spin the narrative so that it doesn't look as bad... and to push fault as much as possible onto the deceased driver. Truly sickening.
 
Tesla didn't lie, but they did craft the wording of that post to get the headlines that resulted. ... So the only reason to mention that is to paint a narrative about the driver.

TL;DR: Tesla's shameless gamesmanship dishonors the dead.
 
Tesla's statement really, really bothers me. Let's examine its pieces:

> In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged
Ok, so far so good.

> with the adaptive cruise control follow-distance set to minimum.
Irrelevant. The X didn't rear-end a car. It certainly wasn't following the concrete barrier. Based on a later sentence in this paragraph, the driver had an "unobstructed" view, so he wasn't following too closely.

> The driver had received several visual and one audible hands-on warning earlier in the drive
Irrelevant and intentionally misleading. I drive on AP2 often, and it is very difficult to complete a drive without some sort of hands-on warning going off, even when my hand is constantly on the wheel. Also, "earlier in the drive" is irrelevant to the accident and designed to mislead, as those warnings almost certainly came long before the crash.

> and the driver’s hands were not detected on the wheel for six seconds prior to the collision.
The hands-on warning goes off *all the time* on my car because I often don't exert enough torque on the steering wheel to trigger the detection. I am sure my median "hands-off" time, as detected by the car, is way over six seconds, despite my hands almost always touching the wheel. Further, the fact that the driver's hands were apparently detected on the wheel until six seconds before the collision strongly suggests that the warnings came well before that point, and certainly were not going off at the time of impact.

> The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.
Unfortunate all around. The driver was probably inattentive, but it's also pretty clear Autopilot was not functioning as one would hope in this instance.
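To make the torque point concrete, here is a rough sketch of how a torque-threshold hands-on detector with a delayed nag could behave. Everything in it (threshold values, timings, names) is a hypothetical illustration, not Tesla's actual implementation, which isn't public. The point is only that a light but real grip can fall below the torque threshold and read as "hands off", and that "no hands detected for six seconds" would not, by itself, produce a warning within those six seconds:

```python
# Hypothetical sketch of a torque-based "hands-on" detector with a delayed nag.
# All thresholds, timings, and names are assumptions for illustration only;
# Tesla's actual Autopilot logic is not public.

from dataclasses import dataclass

TORQUE_THRESHOLD_NM = 0.3   # assumed: grip lighter than this reads as "hands off"
WARNING_DELAY_S = 30.0      # assumed: nag fires only after this long with no torque


@dataclass
class HandsOnMonitor:
    last_torque_time: float = 0.0  # last time enough torque was measured

    def update(self, t: float, measured_torque_nm: float) -> bool:
        """Return True if a hands-on warning should be shown at time t."""
        if abs(measured_torque_nm) >= TORQUE_THRESHOLD_NM:
            self.last_torque_time = t  # hands are "detected" only via torque
        hands_off_duration = t - self.last_torque_time
        return hands_off_duration >= WARNING_DELAY_S


# With logic like this, a light grip still reads as "hands off", and six
# seconds without detected torque never triggers a warning on its own; any
# warnings must have come from earlier hands-off stretches in the drive.
monitor = HandsOnMonitor()
print(monitor.update(t=0.0, measured_torque_nm=0.5))   # firm grip: False
print(monitor.update(t=6.0, measured_torque_nm=0.1))   # light grip, 6 s later: False
print(monitor.update(t=40.0, measured_torque_nm=0.1))  # 40 s with no torque: True
```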
 
Tesla's post tonight confirms my feeling all along that Autopilot led Walter's X straight into the barrier. ... He would have been traveling in the same lane here as the Street View car. I think it's pretty easy to see how the variance in the quality of the lane lines, plus the cracks/lines on the road, may have caused the wrong lane line to be tracked by the cameras, leading the car straight into the barrier (maybe a little left of center).

Does the Autosteer TURNING RADIUS allow you to go from that Street View lane into the barrier?

Unless it was a hard *whip*, an attentive driver would know from experience that Autosteer was "losing control".

Edit: I know that when playing around with AP on residential streets, heading toward houses, my Model X freaks out and goes red GRAB WHEEL when it doesn't see what it likes.