Welcome to Tesla Motors Club

Model X Crash on US-101 (Mountain View, CA)

Interesting: Missing warning poles?



Interesting: The two-tone highway surface makes a line that leads to a "lane" that goes straight into the barrier. Not a good place to be inattentive while using lane-following technology. Note: bad lane lines are not a Tesla-only problem, since many cars are starting to ship driver-assistance tech that follows lines.



Bad road maintenance + tricky area with lots of "lines" + missing safety objects (cones and collapsible barrier) + driver not paying attention = high odds an accident will happen


With many cars starting to ship line-following driver tech:
* Better line-painting materials are needed that don't fade so quickly
* Better line standards need to be in place so gaps, fading, and missing segments don't happen
* Seams between different pavement types need to be treated as a "thing" that can create lines where you don't want them

In general:
* Safety barriers need to be replaced right after an accident takes them out
* All drivers need to always pay attention regardless of tech assistance.

In summary: This was a very bad place to have assistance engaged and not paying attention.
 
I am totally fine (while I may not agree) with a rational argument that there needs to be clearer education and expectation-setting around AP. That said, I'm not sure what you mean by my 6-second logic. If what you are saying is that bad things can happen much faster than that, I agree with that too.

I am even fine with those who attribute some amount of blame on Tesla if they can show that indeed there is not enough warnings or expectation setting. I imagine a court will decide that.

I am not so fine with those calling for the abolition of AP (which is safer than non-AP overall), or major changes to AP -- and yeah, that is probably my agenda, because I get so much utility out of it. It is also clear that some here (not the poster I am responding to) are just on an irrational tirade of anger or justification of their prior points of view (before the data emerged). Most on this board, though, have been very helpful and thoughtful.

Thousands of people die every year in auto accidents, it is not possible to make changes to stop all of those deaths, so the argument that "one death is too many" is emotionally true, but pragmatically unactionable. (again, from other posts, not the one I am responding to)
A lot of people said it couldn't be Autopilot, but it turned out to be, so don't say it's irrational. Yes, wait for the data and the investigation to finish. But bear in mind that it could have gone either way, so don't defend so avidly. If you only have half the facts, why are you calling someone irrational?
 
+1000

Bad road maintenance + tricky area with lots of "lines" + missing safety objects (cones and collapsible barrier) + driver not paying attention = high odds an accident will happen
 
Thanks for the link; I didn't know that document. However, they achieved this under laboratory conditions, so it's worth little under real-world circumstances. And again, this is an easy job compared to vision. Understanding, as they also wrote, is much more difficult. So I think my thesis that autonomous driving will not be solved by AI is still valid.

Plus, the test is of transcription, and so there are nuances of language that the machine can't interpret. In any case it's pretty darn impressive, and it's likely going to be helped by just how many Amazon Echo units are in the field.

Autonomous driving is not a problem solved through AI alone.

It's an automation problem that can be tackled from numerous angles. Only a couple of those angles are tackled by AI.
 
Also, I recently read a news article saying that approximately 49 percent of drivers have admitted to using a cell phone while driving. That is a staggering figure and, if even close to accurate, means many more of us are in danger on the highways every day.

If that's the case, then another 20%+ of survey respondents were lying. The use of cellphones while driving in the US (at least in Texas) is extremely common and, in most cities, not illegal. The fact that there's no federal (national) law restricting cellphone use while moving is ridiculous. Coming from a country where it is illegal to use a phone while driving, and where the fines are significant and frequently imposed, I'd say changing the law is the first step to changing habits, because people are addicted to cellphones. That's not going to change, so laws and technology need to.

But we have no idea whether that's a factor here, either for the Tesla driver or for him taking action to avoid another driver who was drifting.
 
Ok, so I was wrong about AP not being active, I suck.

Now we need to modify the theory of what happened. Are we now assuming that Walter thought he was in the adjacent HOV lane, the one that would have taken him to the 101? I'm still confused as to how the car goes straight at that point. With a lead vehicle cutting him off and obscuring the lane lines for a moment? What's the working theory now?


My new theory is he was in the 101 lane at the beginning of the split, and AP locked onto the wrong, outer white line, as the correct line looks to be in worse shape paint-wise. It is possible he had 10.4 with the new wide-lane feature, which may have contributed to AP locking onto that wrong lane line. Once locked onto the wrong line, AP aligned him into the gore area and drove into the barrier. Tesla stated he had 5 seconds of unobstructed view of the barrier, so unfortunately he probably wasn't looking at the road or was blinded by the sun.
 
...the fire truck, etc...

It was claimed that the fire truck was hidden by the pickup truck ahead, and that by the time the pickup swerved and exposed the fire truck, it was too late for the Tesla driver to react.

In this case, it's not that the driver was not looking at the road:

"The driver of the Tesla is my dad's friend. He said that he was behind a pickup truck with AP engaged. The pickup truck suddenly swerved into the right lane because of the firetruck parked ahead. Because the pickup truck was too high to see over, he didn't have enough time to react. He hit the firetruck at 65mph and the steering column was pushed 2 feet inwards toward him. Luckily, he wasn't hurt. He fully acknowledges that he should've been paying more attention and isn't blaming Tesla. The whole thing was pretty unfortunate considering he bought the car fairly recently (blacked it out too)."

No current driver-monitoring scheme is perfect. Reporters who test-drove the Cadillac CT6 with Super Cruise were able to learn the behavior of its eye-scanning technique and to do social media and e-mail while being monitored by the eye-scanning device.

" We worked out ways to bounce our attention back and forth between our phone and the car in roughly four-second segments, deceiving the system into thinking we were paying attention, to the point where we were all able to answer emails and to Slack with co-workers. One journalist I was riding with was able to communicate with his editor, log in to Facebook, and then start a Facebook Live session of him driving hands-free. While I held the camera for the majority of the time, the fact he was able to coordinate that many moving parts while ostensibly driving a car is either impressive or incredibly stupid."
 
This accident hit home for me for these reasons.
  • I went past the accident site 10-15 min after it happened on the other side of 101. Saw the scene with my own eyes.
  • I ordered my Model X a few days back, due in June.
  • This route is my everyday commute
  • I was planning to use AP to ease my 1 hr commute.
When I test drove the Model X, I already felt this is nowhere close to what I was expecting.
My first reaction looking at the damage was that this must be Autopilot, because it is very unusual for somebody to drive into the lane divider at freeway speed and cause that kind of damage.

I don't think I'll use AP outside of stop-and-go traffic. Certainly not above 35-40 mph. And never with family in the car. I think Tesla should restrict Autopilot at high speeds.
 
It is possible he had 10.4 with the new wide-lane feature that contributed to AP being able to lock onto that wrong lane line.

Our Tesla with AP2 had an affinity for these types of barriers long before 10.4. I've never owned an AP1 Tesla, but we do own a Nissan Rogue with Mobileye which seems to work just about like AP1, and the Rogue has never had a bromance with lane barriers. So... it's definitely a Tesla thing.
 
Now I feel better about buying my Kia. First the massive recall of over 100K cars and now this. Maybe I'll buy one in a few years when they've had time to reflect and understand how to build these cars safely.
This post adds nothing to this thread; you're judging an entire company on one incident and conflating it irrationally. The recall is voluntary, and not 'massive'... wasn't the last GM recall for 700,000 cars, where people had died as a result of the steering column locking? And they knew about it for years? I guess Kia doesn't have Autopilot; you don't have to select the option on a Tesla either. Teslas are still universally recognized as some of the safest cars in the world.
 
Thousands of people die every year in auto accidents, it is not possible to make changes to stop all of those deaths, so the argument that "one death is too many" is emotionally true, but pragmatically unactionable.

They keep touting their safety record vs. the standard. But that standard includes drunk drivers and teenagers. What is the fatal-accident rate for drivers with an AGI of $200k and up?
 
Now I feel better about buying my Kia. First the massive recall of over 100K cars and now this.
You do know the recall is about a part on a steering assembly that Bosch makes for Tesla, right?

That they're paying for the recall.

Bosch has been in business for a long time, and they make really good stuff.
 
It was claimed that the fire truck was hidden by the pickup truck in front, and that it was too late to react by the time the pickup swerved and exposed the fire truck.

No current driver-monitoring scheme is perfect. Reporters who test-drove the Cadillac CT6 with Super Cruise were able to learn the behavior of its eye-scanning technique and to do social media and e-mail while being monitored.
No system is perfect. I'd still argue that a system that requires you to look forward every 4 seconds is better than one where you can jam an orange or a water bottle into the steering wheel, or wedge your knee against the bottom, and close your eyes. In Tesla's defense, I'd say both are preferable to a car with no insight into driver awareness at all, since we all know that many people don't let a little thing like driving a car interfere with texting and reading.

In the case of the fire truck, perhaps it fell into the overlap where it was unavoidable by the driver and beyond the abilities of the car to detect. In any case, it doesn't invalidate the ability of the Tesla safety systems to benefit in numerous other instances. If every car on the road were forced to go autonomous today and they killed ~18,000 people per year, it would still be a huge improvement.

I'm happy with Tesla Autopilot today. It's consistently improving, and I look forward to the point where it advances to match its full advertised capabilities. Until then, I'll continue to treat it with the level of attention that was given to my 16-year-old behind the wheel with their permit.
 