
Model X Crash on US-101 (Mountain View, CA)

We have an Outback in addition to the Tesla and it is awesome. From my understanding of and experience with it, a Subaru with EyeSight would not have hit that barrier. I once got a warning about a huge pothole with the Subaru. The Tesla has never warned me about any such obstacle.


Did you actually watch that video? You still think it's guaranteed to have worked? Maybe watch the whole thing first.
 
That is a great article, and I hope people read it.

It clearly lays out why Tesla shouldn't be releasing information, and the importance for our safety of having a neutral party decode car systems (regardless of whether it is Honda or Tesla or BMW).
I wouldn't exactly call it a great article. They incorrectly claim that Tesla said the driver didn't have his hands on the wheel for six seconds prior to the crash. Tesla said his hands were not detected on the wheel for six seconds prior to impact.

If Tesla was obligated to refrain from releasing the information, then it was wrong of them to do so. Nobody else seems to be refraining from speculation about the accident though, so I don't see anything inherently wrong with them releasing factual information. I wish they'd worded it better though, since so many people seem to be misinterpreting it.

I'm not sure a government organization being "unhappy" with you is the right metric for your compliance with the law.
 
A Subaru with EyeSight very likely would have come to a screeching halt at the highway divider. That’s a fact.
I do think Tesla's system would be improved with stereoscopic vision in the absence of LIDAR. I'm hoping that the side cameras will eventually enable some level of visual depth perception, but they're obviously incapable of seeing what's directly in front of the car.

I don't think the three forward-facing cameras are far enough apart to get a stereo view, especially since they have different fields of view.
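
For anyone curious why the spacing matters, here's a rough back-of-the-envelope sketch in Python. The focal length and the two baselines are made-up illustrative numbers (not Tesla or Subaru specs), and it assumes an idealized pinhole stereo pair with matched fields of view:

```python
# Rough stereo ranging sketch. Pinhole model: depth Z = f * B / d,
# with f = focal length in pixels, B = baseline (m), d = disparity (px).
# A 1 px disparity error therefore moves the depth estimate by
# roughly Z^2 / (f * B).

def depth_error_per_pixel(depth_m: float, focal_px: float, baseline_m: float) -> float:
    """Approximate depth uncertainty (m) caused by a 1 px disparity error."""
    return depth_m ** 2 / (focal_px * baseline_m)

FOCAL_PX = 1400.0  # illustrative focal length, not an actual camera spec

# ~5 cm: cameras clustered behind one windshield mount (assumed figure).
# ~35 cm: a wider EyeSight-style spread (also just an assumed figure).
for baseline_m in (0.05, 0.35):
    err = depth_error_per_pixel(depth_m=100.0, focal_px=FOCAL_PX, baseline_m=baseline_m)
    print(f"baseline {baseline_m * 100:.0f} cm -> ±{err:.0f} m depth error at 100 m")
```

With those made-up numbers you get roughly ±143 m of depth uncertainty at 100 m for the 5 cm baseline versus ±20 m for the wider one. Depth error grows with the square of distance and shrinks with baseline, and mismatched fields of view (not modeled here) would make the matching even harder.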
 
A Subaru with EyeSight very likely would have come to a screeching halt at the highway divider. That’s a fact.
According to Subaru.

*2 Pre-collision Braking System does not work when the speed difference from a leading car is more than 50km/h and from a pedestrian is more than 35km/h. Other weather and external conditions may also prevent Pre-collision Braking from working, even when the speed difference from a leading car is 50km/h or less and from a pedestrian is 35km/h or less.

I'm not sure EyeSight would have prevented a 70 mph collision with an obstacle potentially smaller than a pedestrian.
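
Taking that fine print at face value, the arithmetic is straightforward. A quick sketch (the 50 km/h figure comes from the quoted disclaimer; a fixed barrier means the closing speed is the car's full travel speed):

```python
MPH_TO_KMH = 1.609344

# Per the quoted disclaimer, pre-collision braking is not guaranteed
# when the speed difference to the object ahead exceeds 50 km/h.
EYESIGHT_LIMIT_KMH = 50.0

travel_speed_mph = 70.0
closing_speed_kmh = travel_speed_mph * MPH_TO_KMH  # stationary barrier

print(f"closing speed: {closing_speed_kmh:.0f} km/h")  # ~113 km/h
print("within stated braking envelope:",
      closing_speed_kmh <= EYESIGHT_LIMIT_KMH)         # False
```

So a 70 mph approach to a stationary barrier is roughly 113 km/h of closing speed, more than double the stated 50 km/h envelope.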
 
Maybe one of the AP enthusiasts already did so last Friday. :(

Looks like someone took up the offer on April Fools' Day, and I wonder how the driver can still be alive to tell us about the gore point testing!

The car was on Autopilot running at 59 mph in lane 3 (counting from left to right).

Lanes 1 and 2 are HOV (high-occupancy vehicle) lanes.

Lane 3 and the rightmost one, lane 4, are regular lanes.

Autopilot should have kept to lane 3, the regular lane, but it decided that the gore point was another lane between lanes 2 and 3.

Autopilot then ran straight into a proper gore point, fully dressed with chevron markings!
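
To illustrate what I mean, here is a toy sketch. This is pure speculation on my part, not Tesla's actual lane logic, and every number in it is made up. The point is only geometric: two painted lines diverge from the gore origin, and a follower that pairs them as its lane boundaries and steers to their midpoint heads into the widening wedge:

```python
# Toy illustration only -- NOT Tesla's actual algorithm. At a gore
# point two painted lines diverge from a common origin; a follower
# that treats the widening wedge between them as its lane steers
# toward the midpoint, i.e. into the gore area.

def wedge_center(travel_m: float, divergence_m_per_m: float = 0.08) -> float:
    """Lateral offset (m) of the wedge midpoint, travel_m metres past
    the gore origin. The exit-side line drifts left at the given rate;
    the through-lane line stays straight at offset 0."""
    left_edge = -divergence_m_per_m * travel_m
    right_edge = 0.0
    return (left_edge + right_edge) / 2.0

for travel in (0, 25, 50, 75, 100):
    print(f"{travel:3d} m past the gore origin: "
          f"steering target {wedge_center(travel):+.1f} m from the through lane's left line")
```

With those numbers, 100 m past the gore origin the computed "lane center" sits 4 m left of the through lane, well inside the gore and heading toward the barrier.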

I've said that since Autopilot is beta, anything is possible, but still, I don't want anyone to die!




Here's the 35-second clip:

 
I wouldn't exactly call it a great article. They incorrectly claim that Tesla said the driver didn't have his hands on the wheel for six seconds prior to the crash. Tesla said his hands were not detected on the wheel for six seconds prior to impact.

If Tesla was obligated to refrain from releasing the information, then it was wrong of them to do so. Nobody else seems to be refraining from speculation about the accident though, so I don't see anything inherently wrong with them releasing factual information. I wish they'd worded it better though, since so many people seem to be misinterpreting it.

I'm not sure a government organization being "unhappy" with you is the right metric for your compliance with the law.
I admit my eyes glazed right over that error. Sorry. I guess I've read it so many times now that I've just given up on it -- which is of course to say that Tesla accomplished part of their goal with their "factual information". Which brings me to....

... I stand by the two things from the article that I did mention. Anytime a party with an obvious interest preemptively releases a statement as duplicitous and incomplete as Tesla's (and completely independently unverifiable!), it is done to confuse and to taint opinion. Tesla's release was silky-smooth and light years ahead of the board's ability to present their findings. I'm too tired to make a joke about "expect".
 
I wouldn't exactly call it a great article. They incorrectly claim that Tesla said the driver didn't have his hands on the wheel for six seconds prior to the crash. Tesla said his hands were not detected on the wheel for six seconds prior to impact.

If Tesla was obligated to refrain from releasing the information, then it was wrong of them to do so. Nobody else seems to be refraining from speculation about the accident though, so I don't see anything inherently wrong with them releasing factual information. I wish they'd worded it better though, since so many people seem to be misinterpreting it.

I'm not sure a government organization being "unhappy" with you is the right metric for your compliance with the law.

Normally, safety boards require all participating parties to channel communications through the board only. This keeps the investigation neutral, without competing storylines coming out before the whole investigation is complete.

Here is the NTSB's investigation manual: http://libraryonline.erau.edu/online-full-text/books-online/1181.3.pdf

 
I experienced a similar scenario to the Tesla in this thread. Knowing Autopilot might steer toward an exposed median, I was on high alert during the stretch of road with the median. I've used Autopilot many times along the same highway, and it has occasionally drifted near the median.

Today, my 2016 X90D with AP2 (2018.12) steered directly toward the median. I normally let Autopilot try to correct itself but didn't feel there was time. Luckily, I had both hands on the wheel ready to take over and avoided hitting the median. My advice for every owner is to treat Autopilot as a secondary safety device rather than the primary one.

 
@BLKTSLA I was hoping you could add your input on what you think may have happened in this accident.

It's unfortunate what happened, and condolences to the family of the driver. That said, I have a few theories in mind, but the most pertinent point is that Tesla confirmed no warning went off in the moments before the accident, and that the car ignored the lane markings and somehow drove over them and into this divider. I'm not going to get too much into the "blame game", but ultimately it's the driver's responsibility as to how AP is used, and it's Tesla's responsibility to keep AP from endangering its driver in common driving scenarios, which this was.

It seems like the driver was on AP, and either something happened to them or they were not paying close enough attention, and the car went off track and crashed with little or no time for the driver to respond. I can say that in my experience on AP, there are SOME normal cases where the car does indeed drift close to the yellow line in the left lane. It is extremely rare, but I cannot say it doesn't happen. Unfortunately someone had to lose their life, but this is still a cautionary tale for drivers to be more vigilant and for Tesla to provide more rigor in their testing to further prevent something like this from happening again.

Another thing I would like to know is whether this MX had AP2.x and 2018.10.4, which is where the greatly improved lane detection was rolled out. This could be a case of too little, too late with regard to the firmware. I can definitely see a faded lane marking affecting AP prior to 2018.10.4 on AP2.x hardware. However, with this new release the lane marking has to be almost completely gone before it's "invisible" to AP, and I cannot see this happening with the new firmware.
 
Looks like someone took up the offer on April Fools' Day, and I wonder how the driver can still be alive to tell us about the gore point testing!

The car was on Autopilot running at 59 mph in lane 3 (counting from left to right).

Lanes 1 and 2 are HOV (high-occupancy vehicle) lanes.

Lane 3 and the rightmost one, lane 4, are regular lanes.

Autopilot should have kept to lane 3, the regular lane, but it decided that the gore point was another lane between lanes 2 and 3.

Autopilot then ran straight into a proper gore point, fully dressed with chevron markings!

I've said that since Autopilot is beta, anything is possible, but still, I don't want anyone to die!

Here's the 35-second clip:


Holy crap, they were lucky they stopped in time!!!!
 
Holy crap, they were lucky they stopped in time!!!!
Great video showing how these things can happen, but it still is amazing to me that in this scenario there looks to be absolutely no lane line... In fact, I thought AP was choosing correctly up until the chevrons. The CA accident seems to be a situation where the lane lines were visible but Autopilot was tracking the pavement seams more. A really helpful video, but I'm hoping that someone does a video of the tendencies of AP at the accident site, without endangering anyone of course.
 
I experienced a similar scenario to the Tesla in this thread. Knowing Autopilot might steer toward an exposed median, I was on high alert during the stretch of road with the median. I've used Autopilot many times along the same highway, and it has occasionally drifted near the median.

Today, my 2016 X90D with AP2 (2018.12) steered directly toward the median. I normally let Autopilot try to correct itself but didn't feel there was time. Luckily, I had both hands on the wheel ready to take over and avoided hitting the median. My advice for every owner is to treat Autopilot as a secondary safety device rather than the primary one.

Interesting. AP does this to me all the time where there is a literal fork in the road and it's not sure which to take... I don't think this situation is an apples-to-apples comparison to the crash... though it is a helpful video nonetheless (really helpful), I think @Tam's video is closer to the actual situation... my guess is that in this particular video AP would have actually tried to go all the way to the left once it chose to exit.
 