
Model X Crash on US-101 (Mountain View, CA)

Maybe Tesla should word the disclaimer more strongly, with clearer language such as: "AP is currently experimental (a.k.a. 'Beta'). When using AP, the car WILL do things that WILL result in an accident, and possibly death, if you don't take over. It might happen once a week, once a year, or once a decade, but at some point it will kill you if you don't intervene. You agree to remain vigilant with full situational awareness at all times when using this feature, and to be ready to take over at any moment without any notice from the car. By using this feature you assume all liability for anything the car does, whether AP is enabled or not, even if an AP action breaks traffic laws or does something completely irrational, such as impaling the car on a median or driving into oncoming traffic."

Have people read the more direct language, acknowledge they understand, then sign with a notary present - no more problems with lawsuits.
Starting to sound like every advertised pill on TV - don't forget difficulty maintaining continence and yellowing of the eyes - LOL
 
As a comparison, given aviation's much longer history, I'm curious to hear from any airline pilots: if the AP in a plane failed to operate as expected, would the NTSB fault the AP system or cite "pilot error" for not taking over?
There are several pilots of all kinds on here who are sure to chime in. I've been flying a highly automated private plane for a long time, and was flying for decades before there were such complex systems. Good question, but in my view there is no binary answer. There are all kinds of possible aircraft automation failures. I imagine most everyone heard about the recent MCAS issue with Boeing. Pilots experimented with turning it off and on and taking over, but the bottom line is it was still the fault of MCAS. HOWEVER, keep in mind that AP systems on a plane (of various kinds) and AP on a car are two completely different things, and we really can't compare one to the other. Even if an AP system failure occurs in a plane and the NTSB proclaims it the main cause of a crash, we can't necessarily blame a car's AP for a failure. And of course plane systems have been around for a LONG time; car AP is just in its infancy.
 

Yep, I agree it’s a rough comparison. Just curious on how a more established tech in aircraft accidents is viewed. My feeling is that given the newness of Tesla AP, drivers should shoulder all the responsibility.
 
I don't know whether this has been addressed, so forgive me if it was.

The fact that someone got AP to misbehave, in another Tesla, at the very site of the lethal accident - is that not a sign of gross negligence by Tesla? How hard would it be for them to disable AP on that section? It's not as if the Tesla Network is live already, with cars that need to get through without a designated driver at the wheel.
The accident sadly happened, and then it seems zero action was taken to prevent a recurrence at that very spot.
I find the business policies that allowed this citizen video to even be possible extremely dire. Where are the testing and active version control? It's not exactly "set and forget", now is it?
 

Based on statements by those who knew him, the driver who died had complained about the behavior of AP at that very junction multiple times. The one most in control over a recurrence was the driver.
 
Yea, interesting. If it is true that he was aware of issues at that junction, then I guess it can work both for him and against him. I guess we will know the outcome in the near future, since the family now has a lawsuit against Tesla, and maybe that will go to court.

The accident sadly happened and then seems like there was zero action taken to prevent recurrence at that very spot.
It seems that way, yet I'm not so sure we can say there was zero action, if you are referring to Tesla. They could have done something we don't know about. As for the highway dept, they didn't do much. There have been several accidents involving all sorts of vehicles at that location, which somewhat indicates that even drivers without AP had issues there. Also, the California state department of transportation is named in the suit, and for the good reason I just mentioned. They may be negligent. We shall see. I hate to speculate.
 
I fly a handful of different airplanes, all with different kinds of autopilots. They ALL do weird, unexpected things sometimes. Some I learn to deal with because it's a benign issue, while others are just flat-out dangerous. One I fly can fly straight and level for months, then will randomly get into a runaway trim situation. I avoid altitude hold and turn off auto trim on that one.
What I don't do is continue to trust it, even in places where I know it sucks, and let it kill me. That is just Darwinism at work.
To me, my Tesla's Autopilot is exactly like a general aviation autopilot. It is meant to reduce workload. It is not a mandatory piece of equipment like it is on the airliners. What the autopilot does in a general aviation airplane is 100% the responsibility of the pilot. If it does something you don't want, you disengage it and hand-fly. If Tesla gets punished for events like this, technology advancement in cars will nearly stop, because manufacturers will have to take responsibility for human error with anything new, and the price of the older stuff will become prohibitively expensive. Let's not go there.

I'm not an attorney, but I think the driver's negligent use of Autopilot harmed Tesla's reputation, and I believe Tesla would be within its rights to sue the driver's estate accordingly.
 
AMEN on all that !!! oh but the wife thinks that last sentence won't ever happen :)
 
I understand where you are coming from, but I don't think Tesla is going to disable Autopilot for a certain section of road. Sure, it's easy, but they'd probably get too many complaints that it's not working, because people don't understand the situation.

They should probably look into making restricted zones for AP, though, such as this place. Especially HW-17 from Santa Cruz. There are flipped cars and wrecks there at least once a week, some of which have been Teslas. A horrible place to consider using it, yet it's still available.
 
Seriously? They'd only find out on their second or later pass through that section.
I see LOTS of Tesla videos, and often enough AP will disengage for no apparent reason. It's not a door-to-door service; when uncomfortable, AP is supposed to disengage.
A lethal accident, possibly related to AP and a specific lane-marking issue (or any reason) - how can that NOT be reason to disable it there for the time being?
In what kind of world do we live where preventing a possible recurrence is NOT a no-brainer? In any business I've worked in, I addressed this. And here a life was lost - due to this very feature, according to the family's lawsuit.

Bizarro world, I have no words..
 

The driver chose to use AP at that junction, even after complaining to people about its performance at that junction multiple times.
So, if not using it was a no-brainer...
 
Does AP existing now mean the Highway Dept has to take care of specific stretches of road that are fine for trained eyes but hazards to an untrained AI?

The video could have been made at any time; I doubt it was the same day. If I were working in any capacity on the AP software, I'd need a good pep talk to stay motivated if it's somehow OK for that video to be possible. The risk is OBVIOUS.

Picture it this way: we're firing concrete blocks, through 360º, at Teslas randomly distributed around a large circle, all doing 70 mph. Some blocks will be fired at the space between cars - no risk there. And most of those straight in the line of fire have on-board operators able to take avoiding action, if they pay attention. But one of our Teslas has a driver who will miss one concrete block. Let's try and find him. He's in one of those cars...
 
I totally agree.
But what's Tesla's role? Allow a potential flaw to persist, triggered by that specific section of road? Or just disable AP on that section and investigate whether this version of AP should be trusted to drive through there?

Tesla brags about using actual driving data, but it may be worth running this particular situation through a simulation as well. Get to the bottom of it. Why would people need to die first, if no timely action is going to be taken anyway?
No one depends on Tesla to drive them through that fateful section. There is no downside to disabling AP there until Tesla is satisfied this cannot recur. The downside of just letting AP have at it? Take a wild guess. There is a video demonstrating it now.
If two Teslas can hit stationary fire trucks and fail to notice crossing semi trucks, what are the odds against a concrete lane divider that has already killed one person in the safest SUV on the road?
 
Hmmmm, you probably have a good point, but I must be having a senior moment as I'm not understanding the reply to quoting me. I'm not suggesting anything. Just in a wait and see mode. :) Seems that some folks are quick to point a finger or draw a conclusion, but I will just wait for the legal result :) or see what Tesla is doing that I don't know about.
 
I haven't seen this posted elsewhere yet, but if you are interested Mark Fong's firm (attorneys for the family) have a For Media section on their website that has a pdf of the complaint as well as audio from their media briefing today at 10am.

For Media - Minami Tamaki LLP

One of the links is pictures of him and his wife, because that's important in a legal case.
I don't see the first part getting far, since his reported complaints about AP's behavior negate claims 13 and 14.
 
I'm a fighter for common sense. It's a rare commodity in all things Tesla.
Wait-and-see is exactly the attitude that is getting people killed here.
The Highway Dept should place signage, etc.
Tesla should know better than NOT to automatically disable AP after an AP accident.
Let's not pretend they don't have the smarts or the skills to program this.
If you get hit in your Tesla, within minutes there's a person on the phone with you asking if you're OK! If the car was on AP shortly before, why even wait? It should take all of a second, and every subsequent Tesla there would disengage AP while warning of a potentially tricky road situation. Not doing so - waiting and seeing - is Chemical Ali-like: nothing to see here, carry on!
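For what it's worth, the "disable AP in a known trouble spot" idea this thread keeps coming back to really is simple to program. Here's a minimal sketch of what a geofenced exclusion-zone check could look like. Every name, coordinate, and radius below is invented for illustration; this is obviously not Tesla's actual code, just a demonstration that the logic fits in a few dozen lines:

```python
# Hypothetical sketch of a fleet-side "AP exclusion zone" check.
# ExclusionZone, haversine_m, and ap_allowed are invented names;
# the coordinates and radius are illustrative, not real Tesla data.
import math
from dataclasses import dataclass

@dataclass
class ExclusionZone:
    lat: float        # zone center latitude, degrees
    lon: float        # zone center longitude, degrees
    radius_m: float   # disable AP within this radius, meters
    reason: str       # shown to the driver when AP is disabled

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def ap_allowed(lat, lon, zones):
    """Return (False, reason) if the car is inside any exclusion zone."""
    for z in zones:
        if haversine_m(lat, lon, z.lat, z.lon) <= z.radius_m:
            return False, z.reason
    return True, None

# Example: a made-up 300 m zone near the US-101 / SR-85 gore point.
zones = [ExclusionZone(37.4105, -122.0760, 300.0, "known lane-marking hazard")]
ok, why = ap_allowed(37.4106, -122.0761, zones)  # car a few meters from center
```

A fleet-side service would only need to push a list of zones like this to cars over the air, on the same channel already used for map and firmware updates, and the car would warn and hand control back before entering the zone.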