
Model X Crash on US-101 (Mountain View, CA)

Well, it didn't need pointing out this time. I'm well aware (and have posted) that not detected != not on wheel. Hands should always be on the wheel (detected or not); the timers exist because of the limitations in detection, not to allow hands-free driving. I was referring specifically to:


Hands not on the wheel will cause hands to not be detected. If you say publicly that the car will let you go 60 seconds while hands are not detected, people will turn that into, "I can leave my hands off the wheel for 59 seconds".
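
Purely as illustration, here's a toy sketch of how a torque-based "hands detected" nag timer might behave. The threshold and timeout values are invented, and this is not Tesla's actual code; it just shows why "not detected" and "not on wheel" are different things:

```python
# Hypothetical sketch of a torque-based "hands detected" timer.
# Thresholds and timings are invented for illustration; NOT Tesla's code.

HANDS_TORQUE_THRESHOLD_NM = 0.5   # below this, hands are "not detected"
NAG_TIMEOUT_S = 60.0              # how long "not detected" is tolerated

class HandsOnMonitor:
    def __init__(self) -> None:
        self.seconds_undetected = 0.0

    def update(self, steering_torque_nm: float, dt: float) -> str:
        # A light grip produces little torque, so real hands can still
        # read as "not detected" -- detection != presence.
        if abs(steering_torque_nm) >= HANDS_TORQUE_THRESHOLD_NM:
            self.seconds_undetected = 0.0
            return "ok"
        self.seconds_undetected += dt
        if self.seconds_undetected >= NAG_TIMEOUT_S:
            return "nag"                 # visual/audible warning
        return "ok"
```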

Agree with that. Tesla's sales process also actively encourages owners to believe this is true autopilot.
I initially trusted Autopilot too much. Of late, as I started doing machine learning at work, my trust in Tesla's system fell.
Even then, I was recently surprised to learn (on this thread) that Autopilot has serious issues with stationary object detection. In hindsight, I recall a few incidents where my Model X did not slow down behind cars stopped at a red light, with me wondering whether it would stop automatically. Fortunately, in all those cases, I chickened out and applied the brakes.

Overall, I remain a Tesla fan, and I won't trade Autopilot for anything else. But the system falls far short of the promise owners believe they were sold.
 
This story has been the main thread here for a few weeks, but I keep scratching my head as to why this accident happened in the first place.

I read in the news that he told his wife and brother that Autopilot tried to steer into the concrete whenever he passed that exact spot. Although not confirmed by Tesla, his brother said he complained to a Tesla Service Center that Autopilot was not working properly at that location. He also tried to demonstrate to his wife how Autopilot failed (or tried to kill him) at that exact spot.

Yet he turned on Autopilot, ignored the warnings to put his hands on the wheel (with torque, of course), and failed to watch the road or put his foot on the brake when Autopilot acted exactly the way he expected it to act. Why on earth would anyone turn on Autopilot knowing it is trying to kill him/her at that location? What am I missing here?
It was mentioned to me that maybe he suffered some sort of medical emergency just prior to the collision. The reason there are so many comments on this topic is that there are those who want Tesla to fail, those who own the cars, conspiracy theorists, and those with investigative minds who are trying to help find answers.
 
In hindsight, I recall a few incidents where my Model X did not slow down behind cars stopped at a red light, with me wondering whether it would stop automatically. Fortunately, in all those cases, I chickened out and applied the brakes.
This improved a lot with version 2018.10.4 (I am now on .12). Before that, my experience was that it would pretty much never stop for a car at a stop light that it had not already locked onto while it was moving. Not sure, but it might have worked when going slower than normal. But after 2018.10.4 it has stopped every time for me, even at speeds > 40 mph.
 
@Tam

Here is the letter from NTSB to Tesla.


As far as I can see, there are strict rules once the NTSB gets involved, under which participants cede the right to release information publicly during the investigation.

Tesla breached those rules, and no amount of tweeting by Elon decrying those rules is going to make the NTSB change them.

It really is a bit of an own goal that could have been avoided if Tesla had simply said, "We cannot release further information until the NTSB investigation is complete."

The subsequent tweets by Elon poured fuel on the fire and poked the sleeping bear. Not to expect some kickback was naive.
 

Attachments

  • 376217135-NTSB-Letter-to-Tesla.pdf
    51.6 KB
This improved a lot with version 2018.10.4 (I am now on .12). Before that, my experience was that it would pretty much never stop for a car at a stop light that it had not already locked onto while it was moving. Not sure, but it might have worked when going slower than normal. But after 2018.10.4 it has stopped every time for me, even at speeds > 40 mph.

Starting with 17.17.4, it reacted to untracked stopped vehicles about 75% of the time at 30 mph, and 0% of the time above 35 mph. It steadily improved until 2017.42, where it could handle 90-95% of cars at 35 mph but failed above 40 mph. Now it works at around 50 mph roughly 95% of the time (I had a couple where I didn't want to stick around to see if it would slam on the brakes, but it has otherwise been solid since 2018.10.4). I've even had it work at 60 mph, but if it doesn't start slowing promptly, I take over to get a smoother deceleration than it would manage, if it stopped at all.
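
The usual explanation for this behavior is that automotive radar has to reject stationary returns (overhead signs, bridges, parked cars) at speed, so a stopped car it has never tracked looks like roadside clutter. A toy sketch of that kind of gating, with invented numbers and offered purely as a guess at the general technique, not Tesla's code:

```python
# Toy illustration of why radar-based cruise systems struggle with
# stationary objects. All numbers are invented.

def is_trackable(ego_speed_mps: float,
                 target_rel_speed_mps: float,
                 already_tracked: bool) -> bool:
    # A return whose relative speed is ~ -ego_speed is stationary.
    stationary = abs(target_rel_speed_mps + ego_speed_mps) < 1.0
    if not stationary:
        return True        # moving targets are easy to separate from clutter
    if already_tracked:
        return True        # was moving, then stopped: keep tracking it
    # A newly seen stationary return is hard to tell apart from signs and
    # bridges, so only trust it at low ego speed (made-up ~30 mph cutoff).
    return ego_speed_mps < 13.0
```

That would match the pattern above: a car that was tracked while moving gets stopped for, while a never-tracked stopped car is ignored at higher speeds.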
 
I'm with Economite here.
The attitude should be to cooperate with the respected people of a respected agency.
Do not make this about politics, or a PR battle vs NHTSA.

Tesla has been given a pass thus far: from Josh Brown, to the firetruck, to many whacked YouTube vids of bad behavior by car and driver.

Tesla has been given the latitude to publicly beta test the AP work in progress.
That hasn't changed, but if they piss off some congresscritter behind in the polls...

Time to chill, be patient, crank out cars.
 
@Tam

Here is the letter from NTSB to Tesla.


As far as I can see, there are strict rules once the NTSB gets involved, under which participants cede the right to release information publicly during the investigation.

Tesla breached those rules, and no amount of tweeting by Elon decrying those rules is going to make the NTSB change them.

It really is a bit of an own goal that could have been avoided if Tesla had simply said, "We cannot release further information until the NTSB investigation is complete."

The subsequent tweets by Elon poured fuel on the fire and poked the sleeping bear. Not to expect some kickback was naive.

I think it just means that Tesla decided, because of statements coming from the victim's family and now their lawyer, that it could no longer let itself be accused of a defective product as the cause of the accident and had to speak out. With the victim's family not a party to the NTSB investigation (I'm guessing they are not, and that the NTSB simply interviewed them and got information from them), they can say what they will before the NTSB report is issued, without consequences on their part. Not a very level playing field.

I seem to remember that the family or lawyer said they wouldn't file a lawsuit until the report came out, but earlier in the week what I read seemed to contradict that by signaling they were going to sue Tesla for a defective product. I have to say that the family, while I understand their grief and loss, has played to the press during all this. If you read the thread I just posted (in this forum section) on industry lawyers and the recent American Bar Association conference where liability for semi-autonomous and autonomous driving systems was discussed, driver negligence and product defect are the two bases for liability. Given that the driver claimed he had encountered an AP issue there on 7-10 occasions and then continued to drive that section with it on, and (we can only assume, since the accident occurred) without paying attention to the road or taking control when it deviated, we can see driver negligence being raised. No rational person places himself in that situation, knowing the likely outcome, and fails to take control of the car.

The fact that he drove by that spot regularly and it wasn't a consistent problem per his wife (he worked at Apple for a number of months, likely taking that route every workday) makes me feel it wasn't a defect, but perhaps sunlight on the sensor, or some other issue like how the lane markings were being read.
 
...Tesla breached those rules, and no amount of tweeting by Elon decrying those rules is going to make the NTSB change them...

There is no question that Tesla violated the NTSB's silence rule.

I agree that if Tesla had just abided by the NTSB's participating-party rules, it wouldn't have been kicked out.

But I still think Tesla's leaking of information is saving lives because:

1) Some owners didn't believe the driver was using Autopilot.

2) And if the driver did use Autopilot, they believed there was no need to keep hold of the steering wheel.

3) And if the driver didn't hold the steering wheel, there was no way Autopilot would cross a single solid white line.

4) And if Autopilot did cross a single solid white line, there was no way it would drive into a gore point.

5) And if Autopilot did drive into a gore point, there was no way it would fail to stop on its own, because of AEB...

Now that Tesla has leaked the information, owners are more aware of Autopilot's limitations.

6) In addition, the freeway impact attenuator had gone unrepaired for 11 days before this accident, and even afterward it still was not fixed!

Only after Tesla leaked that information publicly did Caltrans treat fixing it as urgent.
 
Agree with that. Tesla's sales process also actively encourages owners to believe this is true autopilot.

I don't understand how people can say this. Did you not take the car out for a test drive with a salesperson? I understood within my first visit to Tesla that Autopilot is not autopilot. The other big clue is that you have to pay an additional $4,000 to upgrade to FSD (when available). If it were already full autopilot, what would be the need for an additional upgrade?
 
The fact that he drove by that spot regularly and it wasn't a consistent problem per his wife (he worked at Apple for a number of months, likely taking that route every workday) makes me feel it wasn't a defect, but perhaps sunlight on the sensor, or some other issue like how the lane markings were being read.

Unfounded pet theory: if there is a car close in front, it tracks the left gore line because of the limited view and the missing right-line paint. If there is a long field of view, it ignores the missing paint and follows the right gore line.
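
To make that theory concrete, here is an entirely speculative toy sketch of the kind of line-selection heuristic being imagined; every name and threshold in it is invented:

```python
# Speculative sketch of the pet theory above: with a lead car blocking
# the view and the right-side paint faded, a lane keeper might fall
# back to hugging the left (gore) line. Not based on Tesla's code.

def choose_reference_line(lead_car_close: bool,
                          right_line_confidence: float,
                          left_line_confidence: float) -> str:
    if lead_car_close and right_line_confidence < 0.3:
        # Short field of view + missing right paint: only the left
        # gore line is visible, so the car tracks it -- into the gore.
        return "left"
    if right_line_confidence >= left_line_confidence:
        # Long field of view: extrapolate through the paint gap and
        # keep following the right line.
        return "right"
    return "left"
```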
 
Tesla themselves changed their mind on disclosing details of this accident. In their initial blog post on the matter, they said:

"Out of respect for the privacy of our customer and his family, we do not plan to share any additional details until we conclude the investigation."

What they're doing now is very hypocritical. It seems they changed their mind when it became convenient, when they found a way to point the finger at the driver instead of Autopilot. I don't buy Tesla's argument that they're sharing this for the safety of the consumer. They certainly didn't care about privacy, because most of the second blog post made public embarrassing information about how the driver was using the car. They shared the information because it was good PR. Instead, I'd be interested in knowing what information Tesla has not yet shared.
 
Tesla themselves changed their mind on disclosing details of this accident. In their initial blog post on the matter, they said:

"Out of respect for the privacy of our customer and his family, we do not plan to share any additional details until we conclude the investigation."

What they're doing now is very hypocritical. It seems they changed their mind when it became convenient, when they found a way to point the finger at the driver instead of Autopilot. I don't buy Tesla's argument that they're sharing this for the safety of the consumer. They certainly didn't care about privacy, because most of the second blog post made public embarrassing information about how the driver was using the car. They shared the information because it was good PR. Instead, I'd be interested in knowing what information Tesla has not yet shared.

Tesla got the data recorder, read the data recorder, interpreted the data, and released the results. I think their investigation was over.

Message: Consumers are safer with AP, but it may not save them from themselves if they don't pay attention.
 
Tesla got the data recorder, read the data recorder, interpreted the data, and released the results. I think their investigation was over.

... some of the results.

I for one would like to know the steering wheel angle for the 5 seconds preceding the impact. But then again Tesla’s not going to come out and say “yes, AP drove him right into the barrier”... the preferred narrative (which I totally agree with) is that the driver was inattentive.
 
...we do not plan to share any additional details until we conclude the investigation...

There was great speculation over whether Autopilot was involved.

Some were saying that because the driver was aware of Autopilot repeatedly swerving toward that gore point, it was not believable that he was using it this time.

Tesla could neither confirm nor deny that speculation, so it was fair that it needed to access the vehicle log and finish reading it before it could conclude whether Autopilot was involved.

Thus, Tesla kept its promise! There's no contradiction in its promise!
 
I think it just means that Tesla decided, because of statements coming from the victim's family and now their lawyer, that it could no longer let itself be accused of a defective product as the cause of the accident and had to speak out. With the victim's family not a party to the NTSB investigation (I'm guessing they are not, and that the NTSB simply interviewed them and got information from them), they can say what they will before the NTSB report is issued, without consequences on their part. Not a very level playing field.

When a company decides to be a party to an NTSB investigation (as opposed to just a source of information), the company is making a choice. It is opting to have direct access to investigative discussions as they occur, and in exchange it agrees not to make statements about the topic of the investigation until after the investigation is completed.

There's an important reason for this rule. In general there will be a number of parties to an investigation, each of which may have contributed in some way to the mishap. Think, for example, of an airliner accident, where (at a minimum) the airline, the airplane manufacturer, and the pilots' union will all be parties. These parties all provide frank information to the NTSB during the investigation, and have access to the information provided by other parties. Thus, if they speak publicly before the report comes out, they are in a position to cherry-pick preliminary information provided within the investigation by the other parties. If parties are allowed to make such "leaks," no one will be willing to speak frankly during the investigation.

Tesla accepted a position where it would have access to this non-public preliminary investigation information, and agreed to the ground rule about no public statements. Then, while remaining a participant, it started making exactly the kind of statements that are the reason for the rule: statements that Tesla had no fault and the fault was entirely that of the driver (and Caltrans). The statements cherry-picked the available evidence. That's exactly what Tesla is not allowed to do: get access to the non-public investigative information as a party, and then start talking about the accident. If it had initially chosen just to provide information to the NTSB, but not to be a party receiving preliminary information, then the NTSB wouldn't be having a sh!t-fit about Tesla's statements. By putting itself in a position where it was receiving non-public information, Tesla took advantage of a non-level playing field when it spoke in public.
 
I keep thinking through all the possibilities for why Mr. Huang got himself into such trouble. Time after time, I come up with only one plausible scenario.
Assumptions:
1. Everything Tesla reported regarding Mr. Huang's failure to take over from Autopilot is true
2. Mr. Huang was a very intelligent Apple engineer
3. Mr. Huang had a good driving record according to his wife
4. Mr. Huang had just received the Autopilot software update 2018.10.4
5. Mr. Huang was aware of the trouble spot from the previous Autopilot software he had experienced
6. Mrs. Huang said in her interview that her husband attempted to show her the behavior he encountered during his morning commute, but Autopilot did not always veer toward the barrier while she was with him (possibly due to different sunlight in the afternoon, for example)

Now, based on the above assumptions: Mr. Huang was very curious about his "NEW" Autopilot update, just as we have all witnessed on several YouTube channels how curious people are about new updates. All he hoped was that the new software would finally, once and for all, resolve this trouble spot. You can imagine that, like many Tesla owners, Mr. Huang loved his Model X, and you can imagine what he may have done at that moment in time. This was (by my reasoning) a very costly experiment for him and his family. My heart goes out to the Huang family.

Disclaimer: the above conjecture does not represent what actually happened to Mr. Huang; it is speculation on the part of the author.
 
There was great speculation over whether Autopilot was involved.

Tesla accepted a position where it would have access to this non-public preliminary investigation information, and agreed to the ground rule about no public statements. Then, while remaining a participant, it started making exactly the kind of statements that are the reason for the rule: statements that Tesla had no fault and the fault was entirely that of the driver (and Caltrans). The statements cherry-picked the available evidence. That's exactly what Tesla is not allowed to do: get access to the non-public investigative information as a party, and then start talking about the accident. If it had initially chosen just to provide information to the NTSB, but not to be a party receiving preliminary information, then the NTSB wouldn't be having a sh!t-fit about Tesla's statements. By putting itself in a position where it was receiving non-public information, Tesla took advantage of a non-level playing field when it spoke in public.

There's still a lot of speculation, and even more unanswered questions, even after Tesla's leak. Hence 115 pages in this thread.

Cherry-picking is a good description of what Tesla did. The phrasing of Tesla's statement was very one-sided and self-serving, and maximally embarrassing to the victim and his family (i.e., he was reckless). Granted, the family is doing the same thing now to Tesla, but I think Tesla made its statement first? 90% of Tesla's blog post on the matter was about what the driver failed to do (not holding on to the wheel, not responding in the 5 seconds available, etc.). But there was not a single direct statement on what AP failed to do; specifically, that "Autopilot failed to maintain the car in its proper lane or brake in time before the collision," even though that is the only reasonable conclusion you can infer. Yes, I know that AP is not perfect and will make mistakes, but that's not the problem with the statement. The issue is that Tesla purposely glossed over this fact while putting the blame on the driver.
 
Suggestion:
Tesla should conduct a thorough test of the road markings at the crash site under various conditions, such as different times of day and night, rain, etc.

Cadillac should also get Super Cruise checked out.

Burning question:
Did Tesla's Autopilot fail to take action to avoid the imminent collision with the barrier? Collision avoidance is a safety feature Tesla owners deserve regardless of whether the driver failed to respond to Autopilot's warnings. What if the driver was incapacitated at the time (a heart attack, for example)?
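
For context on what that safety feature amounts to, here is the textbook time-to-collision (TTC) check that AEB-style systems are generally built around; the 1.5 s threshold is invented for illustration, and this is not Tesla's implementation:

```python
# Generic time-to-collision (TTC) trigger of the kind AEB systems use.
# Threshold is invented; this is not Tesla's implementation.

def ttc_seconds(distance_m: float, closing_speed_mps: float) -> float:
    if closing_speed_mps <= 0:
        return float("inf")      # not closing: no collision course
    return distance_m / closing_speed_mps

def aeb_should_brake(distance_m: float, closing_speed_mps: float) -> bool:
    # Brake hard when impact is ~1.5 s away (illustrative threshold).
    return ttc_seconds(distance_m, closing_speed_mps) < 1.5

# A fixed barrier 40 m ahead at ~70 mph (31 m/s) gives a TTC of ~1.3 s,
# already inside the window -- which is why even a working AEB can at
# best shave off some impact speed, not avoid the crash outright.
print(aeb_should_brake(40.0, 31.0))   # True
```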
 
... some of the results.

I for one would like to know the steering wheel angle for the 5 seconds preceding the impact. But then again Tesla’s not going to come out and say “yes, AP drove him right into the barrier”... the preferred narrative (which I totally agree with) is that the driver was inattentive.
If the driver is inattentive, that does not make it right for Autopilot to drive toward the wall, right?