Welcome to Tesla Motors Club

Model X Crash on US-101 (Mountain View, CA)

If computers aren't reliable enough yet, why would you want them to catch your mistakes? At this point in time I think that humans, if aware and still essentially in control, are the better back-up: better at making the judgment call on barriers, bicyclists, and pedestrians that are in the street and stationary for some reason.
 
I think Tesla needs to re-think their whole Autopilot strategy. Until the system reaches full Level 5 autonomy and covers all extreme corner cases (10 years down the road?), a Level 2/3/4 system should NEVER be the primary driver with the human as the secondary driver who "catches" any error by the computer when something goes wrong. Human reaction time is simply far too slow and unreliable to serve as the failure back-up system. I know Tesla keeps claiming Autopilot is an assistant and the human driver needs to pay full attention, or whatever statement they make for legal-protection purposes, but the reality is that a system that allows fully "hands off" (and legs off) driving encourages humans to be complacent and not pay full attention, no matter how you dice it.

Not to mention that a Level 2/3/4 system has WAY too much "grey area" in terms of when it will work and when it won't. Say the system works 97% of the time and fails in the remaining 3% of corner cases. How does the human driver know when Autopilot is about to fail them, unless they pay 100 percent attention to the road (which the system encourages them NOT to do)? This kind of Autopilot crash will continue to happen as long as this "grey area" exists.
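A back-of-envelope sketch of this argument (all numbers are illustrative: the 97%/3% split is the hypothetical above, and the "catch rates" for attentive vs. complacent drivers are invented for the example, not data):

```python
# Illustrative only: the 97%/3% split is the hypothetical from the post
# above, and the human "catch rates" are made-up numbers, not data.

def unhandled_rate(system_success=0.97, human_catch=0.5):
    """Fraction of situations that neither the system nor the
    supervising human handles: the system fails AND the human
    fails to catch the failure in time."""
    system_fail = 1.0 - system_success
    return system_fail * (1.0 - human_catch)

# An attentive driver who catches 90% of system failures vs. a
# complacent driver lulled into catching only 30% of them:
attentive = unhandled_rate(human_catch=0.90)   # ~0.3% of situations
complacent = unhandled_rate(human_catch=0.30)  # ~2.1% of situations
```

The point of the toy model is only that the combined safety of a Level 2 system hinges on the human catch rate, which is exactly what complacency erodes.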

I think other automakers such as Toyota and Lexus are making the right calls on autonomy and driver-assist systems. Their latest cars rely on the human as the primary driver, and if something goes wrong the computer is the failure back-up (such as lane-keep assist with steering assistance when the driver falls asleep). A computer catching human errors is far more reliable and far faster in reaction time. Also, many automakers intentionally leave auto-steering features out of their cars so that drivers know they are the primary driver and must pay full attention.

Glad the Wright Brothers didn't listen to you, or we would still be using coal-powered steam engines to traverse the United States.

First person on Mars isn't going to listen to you either.

Between Challenger and Columbia you had $20 billion in those shuttles, the lives of astronauts, and undoubtedly our very best scientific minds and engineers supporting the missions to "guarantee" nothing went wrong.

"Anything" can and will happen.

The amount of panic is unreal.

Who Dares Wins...
 
OMG... Tesla's update is monumentally concerning... Absolutely monumentally concerning. That should NOT HAPPEN. I immediately retract any previous statements and my confidence in AP is now considerably shaken. How in the hell can that happen? That simply can't happen and if it can, then it's time for Tesla to disable AP in ALL CARS until they can guarantee it won't happen again. I repeat, this Tesla "fanboy" has seen enough. This is disturbing... I love AP, I use it all the time and life would suck without it but this is unconscionable....

Flame away if you will...

I'm so monumentally disappointed in Tesla and Elon Musk right now; completely and utterly disgusted.

Jeff

EDIT: To be clear because I know I'm going to get piled on for my emotional reaction to the Tesla update, I don't expect any support for my stance and I fully understand where each and every rational and reasonable counterargument would come from. I really do get it.


I agree, AP should be disabled for now. Tesla can’t make Autopilot work well in the f***ing Bay Area? That’s near the factory, C’mon!
 
I only set it to 1 or 2 when stuck in traffic, but I can see how one might forget to switch it back higher afterwards. I hope Tesla will release more details as they review the logs. I certainly hope it wasn't a case of AP suddenly steering the car into the divider, rather than the driver not noticing that AP was following the wrong lane.

This... a follow-distance setting of 1 in 70+ mph rush-hour traffic is too tight.
 
I think Tesla needs to re-think their whole Autopilot strategy. Until the system reaches full Level 5 autonomy and covers all extreme corner cases (10 years down the road?), a Level 2/3/4 system should NEVER be the primary driver with the human as the secondary driver who "catches" any error by the computer when something goes wrong. Human reaction time is simply far too slow and unreliable to serve as the failure back-up system. I know Tesla keeps claiming Autopilot is an assistant system and the human driver needs to pay full attention, or whatever statement they make for legal-protection purposes, but the reality is that a system that allows fully "hands off" (and legs off) driving encourages humans to be complacent and not pay full attention, no matter how you dice it.

Not to mention that a Level 2/3/4 system has WAY too much "grey area" in terms of when it will work and when it won't. Say the system works 97% of the time and fails in the remaining 3% of corner cases. How does the human driver know when Autopilot is about to fail them in that 3%? Is Tesla assuming all drivers are engineers, constantly analyzing and predicting when the computer will fail them while doing their "hands off" driving? Unless the human driver pays 100 percent attention to the road (which the current system encourages them NOT to do), this kind of Autopilot crash will continue to happen as long as this "grey area" exists.

I think other automakers such as Toyota and Lexus are making the right calls on autonomy and driver-assist systems. Their latest cars rely on the human as the primary driver, and if something goes wrong the computer is the failure back-up (such as lane-keep assist with steering assistance when the driver falls asleep). A computer catching human errors is far more reliable and far faster in reaction time. Also, many automakers intentionally leave auto-steering cruise features out of their cars so that drivers know they are the primary driver and must pay full attention.

This makes a lot of sense. Being Beta Bros for Tesla autopilot is a bad way to leave the planet.
 
Glad the Wright Brothers didn't listen to you, or we would still be using coal-powered steam engines to traverse the United States. [...] Who Dares Wins...


Car buyers are not signing up to be the Wright brothers.
 
If computers aren't reliable enough yet, why would you want them to catch your mistakes? At this point in time I think that humans, if aware and still essentially in control, are the better back-up: better at making the judgment call on barriers, bicyclists, and pedestrians that are in the street and stationary for some reason.
That's assuming the driver continues to pay full attention when using the current Autopilot system. But as I said before, many drivers won't pay full attention, because hands-off and legs-off driving encourages them NOT to pay attention.
 
That's assuming the driver continues to pay full attention when using the current Autopilot system. But as I said before, many drivers won't pay full attention, because hands-off and legs-off driving encourages them NOT to pay attention.

You can't project your negligence and inability to use a tool properly onto those who can.

This is why no one can have nice things and why we call it the 'rise of the nanny state'.
 
The car is a 100D, so the chance of it being a CPO and/or AP1 car is slim. The window is narrow, i.e., AP2 went into production 2-3 months after the 100D did, if memory serves me right.

The timeframe was tight, yeah, but wasn't it the other way around? 100Ds (not to be confused with P100Ds, which were available sooner) started to become available in March 2017. AP2 came a few months prior, in late 2016, after the event/announcement just before the bogus 12/2016 video.

The reporter in question, and the media in general, wouldn't know the difference between a CPO Model X and a new Model X. He referred to Mr. Huang's communication about AP as "told the *dealer*". So..... not the most accurate reporter on the planet.

That said, I doubt a new buyer would know enough to seek out an AP1 car. After all - it takes experience with AP1 and then some time down the rabbit hole with AP2 to really know that you want the AP1 car back. :)

Attempt at humor aside, I hope the NTSB or somebody can scare up some video of the seconds leading up to impact. I'm a big fan of Occam's Razor, but there are a lot of distractions on the road, too. For example, the sun here can be so bright (on the 91W and 91E right at rush hour, say) that it's tough to see anything sometimes: signs, markings, merges, the works.
 
Thank you for the link. However, if this is the case, who really thinks FSD can be achieved with this? It seems absolutely impossible to me if they still cannot detect stationary objects.

I would also really like to see real statistical research, compared not to average accidents but to the same car class, driver age, country, ... I assume the use of the Tesla systems would turn out to make accidents more likely if the comparison were statistically valid.
 
Guess I will never do this again either... my record for keeping my eyes shut while on AP1 was 15 seconds...

EDIT: Of course there were no obvious moving objects when I did so.

More relevant: what about stationary objects? Unless you were out in the desert, that was still pretty reckless.

That said, I'm definitely guilty of stupid stunts for a while after AP1 came out, but not anymore. It makes me sad that we're not as close to living in the awesome future I dreamed of as a kid as I thought we were.
 
Thank you for the link. However, if this is the case, who really thinks FSD can be reached with this? [...]

How it will be achieved is through 'image recognition'. Tesla's aim is to replicate human vision, with cameras as the sensors and a neural net making the decisions.

This next part is pure conjecture, but I believe Tesla should be leveraging the SpaceX low-orbit internet satellites to set up a mesh communication network between cars, to further improve the data analysis and decision making of the neural net.

Much, much, much harder than lidar implementations, but more scalable if you succeed. Not everyone wants a Waymo car.

 
How it will be achieved is through 'image recognition'. Tesla's aim is to replicate human vision, with cameras as the sensors and a neural net making the decisions.

This is just a dream and probably impossible. Even today, speech recognition is far below what a human can do, and that should be much easier to solve with neural nets.

And here we are talking about safety-relevant functions. Those can only be implemented if the input sensors are robust and fail-safe (typically redundant). Check how tough autoland for planes is, and that is just for a straight runway!
Then, beyond the sensors, the software implementation plays into it. There are so many different conditions, situations, and unforeseeable things that software will never be able to reach the same outcome as humans.
 
Not to mention that a Level 2/3/4 autopilot system has WAY too much "grey area" in terms of when the system will work, and when it won't. [...]
And yet AP is proven to be safer than a human driver. So you are saying it is better to take an all-or-nothing approach and ignore incremental reductions in fatalities per mile driven. 40% safer is better than 0% safer. Please provide data proving that Level 2/3/4 autonomous systems are actually more dangerous.
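Rough arithmetic behind the "40% safer is better than 0%" point (the 40% figure is the poster's claim, and the baseline fatality rate below is a made-up round number, not verified statistics):

```python
# Illustrative numbers: the 40% relative reduction is the claim from
# the post above, and the baseline rate is a hypothetical figure.

baseline_per_100M_miles = 1.2            # hypothetical fatality rate
relative_reduction = 0.40                # claimed "40% safer"

with_ap = baseline_per_100M_miles * (1.0 - relative_reduction)
saved = baseline_per_100M_miles - with_ap

# Under these assumptions a partial system still prevents roughly
# 0.48 fatalities per 100 million miles versus doing nothing.
```

Whether the real-world numbers support the 40% claim is a separate question, which is exactly what the request for class- and age-matched statistics is about.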
 
What I'm most disturbed about is: whatever became of the high-definition maps?

Tesla's blog points out how many drivers went through that section just fine on AP, yet somehow in this one instance it suddenly screwed up. Sure, I can understand how the vision system would be confused, but in that state the mapped data is there for the car to follow.

We've seen videos of Teslas on AP driving on snow-covered roads, so obviously it's capable of this.

This is probably my single biggest frustration with AP. It's not that one section of road gives it a hard time; it's the inconsistency. Half this forum is talk about AP either improving or regressing, but who knows whether it changed at all, because it's so inconsistent.

In this tragic accident:

We're not talking about a road that's rarely driven by Tesla vehicles.
We're not talking about bad weather or anything particularly out of the ordinary. I'm not aware of any GPS degradation/interference that day (there has been some recently due to military exercises).

This is a section of road traveled over hundreds of thousands of times with a Tesla either on AP or off, but still collecting data.

This is the heart of Tesla country, and it couldn't navigate something that has been just as it is for years.
 