
Model X Crash on US-101 (Mountain View, CA)

A picture or video is (more or less) raw evidence. A description of the picture or video is an interpretation of the raw evidence, and not itself raw evidence. A statement like "X would have been able to see Y" is clearly drawing a conclusion, not providing raw evidence.

Gonna have to disagree. I mean, they could say the camera showed a clear image of the barrier from 150 meters away; that would be more factual and mean the same thing. It's not like they're saying the runner stepped out of bounds or that the pitch was a strike. There is either something else in the frame or not. (Or are you allowing for temporary blindness on the part of the driver?)

Furthermore, while I strongly disagree with you that what Tesla has been providing is merely "raw evidence", Tesla would still be breaking the party-silence rule even if it were only disclosing raw evidence. And Tesla's public statements are clearly not a complete data dump (which would still be impermissible); it is choosing which bits of data to release. That in itself is a clear form of spin. And remember, the data that Tesla is releasing doesn't really belong to Tesla. It is taken from the car's data recorder (which is owned by the driver's estate and is in the possession of the NTSB). But Tesla has made it so that the owner's family can't see or interpret the data collected by the driver's own property. And then Tesla is going off and releasing its own cherry-picked interpretations of that data.

What bits of data are you looking for, or think are missing?

First of all, as you admit, this is an interpretation. Therefore it is not raw data.

Yah, and that was from the news report, not the data I was referencing from the blog.

Furthermore, while it may be an accurate conclusion, it is by no means the only accurate conclusion that can be drawn from the data. Someone could just as accurately say: "the only way for this accident to have occurred is if AP failed to stay in lane, despite the activation of autosteer, and then Mr. Huang failed to notice that AP had crossed the lane markings." These sorts of conclusions always have a spin. That's why, in its final reports, NTSB identifies all factors involved in a mishap.

No, there are other scenarios which are not very nice to suggest. However, your longer example does not contradict any of the information from Tesla. AP on: yes, driver failed to act: yes. Your example does make assumptions: "failed to stay in lane" assumes the lack of clear lane marking interpretation is the fault of autosteer vs Caltrans maintenance.

Tesla likes to stop its analysis at either "the driver deactivated AP before the collision, therefore AP was not at fault" (ignoring the fact that AP steered the car into the dangerous situation, and the driver only "deactivated" AP by panic braking too close to the point of impact) or "the driver failed to intervene when he had the opportunity to do so" (ignoring the fact that it was AP's steering mistake, or failure to steer, that put the car on course to the collision). AP clearly (at the very least) contributes to accidents that occur while AP is activated or that involve a driver panic braking/swerving to override AP. That doesn't necessarily mean that Tesla should be legally liable (though I think quite often it should be) or that AP is defective/unsafe (though I suspect it is), but it does mean that AP's involvement in such mishaps needs to be impartially studied.

Which accidents and suggested quotes/meanings by Tesla are you referring to here?
 
What bits of data are you looking for, or think are missing?

That doesn't really matter. I don't know what I don't know. That's the whole point of making all of the raw data available to an independent investigator and letting them sort it all out and poke at all assumptions regarding proper interpretation, rather than having interested parties spin cherry-picked data, interpretations, and conclusions to the public based on non-public information (like the information in the proprietary data recorder).
 
No, there are other scenarios which are not very nice to suggest. However, your longer example does not contradict any of the information from Tesla. AP on: yes, driver failed to act: yes. Your example does make assumptions: "failed to stay in lane" assumes the lack of clear lane marking interpretation is the fault of autosteer vs Caltrans maintenance.

And this is why the NTSB doesn't like folks releasing any data or interpretations prior to the final, complete report. Release of a handful of facts (even if they actually are facts, rather than interpretations) lacks the context to give a complete description of what happened and why. Such incomplete statements are not helpful towards the goal of improving transportation safety.
 
Gonna have to disagree. I mean, they could say the camera showed a clear image of the barrier from 150 meters away; that would be more factual and mean the same thing. It's not like they're saying the runner stepped out of bounds or that the pitch was a strike. There is either something else in the frame or not. (Or are you allowing for temporary blindness on the part of the driver?)

It is very rare for anything on a highway to be in 100% clear view; most things come in and out of being partially or fully obstructed over time. Furthermore, to say that there is a 100% clear view (even if true) doesn't really tell the full story if there are other things in the field of vision that might distract attention from the thing that is in clear view.

This is why there is a big difference between a statement like what Tesla made and a release of actual photos/videos. Tesla might be correct that there was a clear view from 150 meters away. But that doesn't really say what else is in the photo/video.

As an example (and I am not saying this is what happened, or even is likely to be what happened): what if the Tesla was following a car that, at 150 meters from the crash point, suddenly realized it wasn't in the right place, swerved into the lane that would stay on the main highway, and nearly caused an accident in that lane? In that situation, Tesla's description would be factually correct, but would not recognize that there was a reason why the Tesla driver was distracted.
 
It is very rare for anything on a highway to be in 100% clear view; most things come in and out of being partially or fully obstructed over time. Furthermore, to say that there is a 100% clear view (even if true) doesn't really tell the full story if there are other things in the field of vision that might distract attention from the thing that is in clear view.

This is why there is a big difference between a statement like what Tesla made and a release of actual photos/videos. Tesla might be correct that there was a clear view from 150 meters away. But that doesn't really say what else is in the photo/video.

As an example (and I am not saying this is what happened, or even is likely to be what happened): what if the Tesla was following a car that, at 150 meters from the crash point, suddenly realized it wasn't in the right place, swerved into the lane that would stay on the main highway, and nearly caused an accident in that lane? In that situation, Tesla's description would be factually correct, but would not recognize that there was a reason why the Tesla driver was distracted.

I think I get your drift. Full data disclosure to allow analysis of confounding, mitigating, or exacerbating factors, versus a factual but limited release of some data? I can dig that.
 
Can't speak for Honda, but the Ford lane assist only nudges you if you drift. If you set the car up on a straight road with good markings, it will almost keep you out of the ditch on its own. The adaptive cruise is nice when traffic is moving, but does not seem to fully stop the car.
We have it on our Honda Odyssey and it definitely steers the car when it can sense road markings. It also has a lane departure system that works similarly to the Ford system you mention, which will nudge the car back into its lane. What's amusing is seeing the lane departure system correct the auto-steer system, which does happen from time to time. Of course, my hands are on the steering wheel all the time, since I don't trust these systems at all.
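To make the distinction concrete, here's a toy Python sketch of the two behaviors described above: a lane departure system stays silent until the car nears the lane edge and then nudges it back, while an auto-steer/lane-centering system corrects continuously. All thresholds and gains are invented for illustration; this is not any manufacturer's actual control logic.

```python
# Toy contrast between lane *departure* assist (nudge near the edge) and
# lane *centering* / auto-steer (continuous correction). All numbers are
# made up for illustration.

LANE_HALF_WIDTH_M = 1.8  # hypothetical half-width of the lane


def lane_departure_nudge(offset_m: float) -> float:
    """Do nothing until the car nears the lane edge, then nudge it back."""
    if abs(offset_m) < 0.8 * LANE_HALF_WIDTH_M:
        return 0.0  # silent while comfortably inside the lane
    return -0.5 * offset_m  # one-off corrective steering impulse


def lane_centering_steer(offset_m: float) -> float:
    """Continuously steer toward the lane center whenever markings are sensed."""
    return -0.3 * offset_m  # proportional correction, always active


for offset in (0.1, 0.5, 1.5):  # meters from lane center
    print(f"offset={offset:+.1f} m  "
          f"nudge={lane_departure_nudge(offset):+.2f}  "
          f"centering={lane_centering_steer(offset):+.2f}")
```

Run that and you can see why the two systems sometimes "argue": the centering controller acts on every small offset, while the departure system only wakes up near the edge.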
 
Such incomplete statements are not helpful towards the goal of improving transportation safety.

Tesla said what it said to protect current and future Tesla owners' lives by using this occasion to remind them, at a very teachable moment, that when you engage AP you had better keep your hands on the wheel, pay attention, and be ready to take over at any time, because if you don't, you could die.

The consequences of not being invited to the months long accident reconstruction party is a small price to pay for potentially saving lives.

Maybe their drawing attention to Caltrans' negligence will save some lives too, if it brings more scrutiny of Caltrans and gets them to reset barriers faster than whenever they feel like it.
 
I don't understand how people can say this. Did you not take the car out for a test drive with a salesperson? I understood within my first visit to Tesla that Autopilot is not autopilot. The other big clue is that you have to pay an additional $4,000 to upgrade to FSD (when available). If it were already full self-driving, what would be the need for an additional upgrade?


Actually, I did pay the extra money to get FSD (when available). I walked away with the impression that it was all soon to happen (within a year). Now, I know better.
 
Actually, I did pay the extra money to get FSD (when available). I walked away with the impression that it was all soon to happen (within a year). Now, I know better.

That's not what the original quote was referencing. The original post stated that Tesla was selling people the idea that EAP was self-driving (i.e., FSD). I don't know how anyone could get that impression if they took a test drive. A few minutes in the car and it is evident you must keep your hands on the wheel and that EAP isn't going to do a lot of things for you.
 
Tesla said what it said to protect current and future Tesla owners' lives by using this occasion to remind them, at a very teachable moment, that when you engage AP you had better keep your hands on the wheel, pay attention, and be ready to take over at any time, because if you don't, you could die.

That's a strange interpretation of the thrust of Tesla's statements. Tesla didn't use its statements to caution/remind Tesla owners that they should take special care using AP in areas with shifting and contrastingly colored road surfaces, since AP might follow the road transition rather than the lane line. And they didn't caution/remind owners that AP will not steer to avoid collisions with a fixed object in its path. Nor did they explain circumstances where a follow distance of one might be inappropriate. There was no educating at all about how AP works or why it might set a course for a barrier. Just a statement that the driver would have been able to avoid the accident if he were paying attention.

Indeed, the overall thrust of the message was not to encourage folks to be more cautious/judicious about using AP. Instead, Tesla explicitly stated that AP is so good that drivers reduce their safety by not using it more.

The statements were clearly an attempt to cast all blame on the driver and cameras, rather than educate drivers about the need to be cautious and judicious about using AP. Tesla continues to give drivers no real guidance about when or how AP should be used, aside from the guidance that it can't be trusted to work reliably, so the driver needs to always be ready to step in.
 
That's a strange interpretation of the thrust of Tesla's statements. Tesla didn't use its statements to caution/remind Tesla owners that they should take special care using AP in areas with shifting and contrastingly colored road surfaces, since AP might follow the road transition rather than the lane line. And they didn't caution/remind owners that AP will not steer to avoid collisions with a fixed object in its path. Nor did they explain circumstances where a follow distance of one might be inappropriate. There was no educating at all about how AP works or why it might set a course for a barrier. Just a statement that the driver would have been able to avoid the accident if he were paying attention.

Indeed, the overall thrust of the message was not to encourage folks to be more cautious/judicious about using AP. Instead, Tesla explicitly stated that AP is so good that drivers reduce their safety by not using it more.

The statements were clearly an attempt to cast all blame on the driver and cameras, rather than educate drivers about the need to be cautious and judicious about using AP. Tesla continues to give drivers no real guidance about when or how AP should be used, aside from the guidance that it can't be trusted to work reliably, so the driver needs to always be ready to step in.

That is an interesting interpretation. That isn't what I got from the release. What I took from it was a reinforcement of the manual: the driver of a Tesla vehicle needs to be paying attention.
 
That's not what the original quote was referencing. The original post stated that Tesla was selling people the idea that EAP was self-driving (i.e., FSD). I don't know how anyone could get that impression if they took a test drive. A few minutes in the car and it is evident you must keep your hands on the wheel and that EAP isn't going to do a lot of things for you.

Perhaps you should watch the videos that Elon was peddling at the time Tesla began offering FSD as a purchase option. No mention of it being pure vaporware. Just waiting on the government bureaucrats to bless it.
 
it can't be trusted to work reliably, so the driver needs to always be ready to step in.

Yes. Exactly: the driver needs to be ready to step in.

Except revise that to say, in your case, "it can't be trusted to work as I imagine it to work, so the driver needs to always be ready to step in."

Revise it to say, for others, "it works as designed, so the driver needs to always be ready to step in" -- as stated in the message from Tesla to the driver every time AP is engaged!

And it's funny that you seem to treat "so the driver needs to always be ready to step in" as the mic-drop point of your argument.

The most persuasive point you are making is that Tesla AP is just not right for people unable or unwilling to understand how to use tools within their specified functionality.

I guess some people are destined to never find utility in Tesla AP, and certainly not in any other sub-L4/L5 driver-assistance tool.

The simple concept of car-following and lane-keeping assistance (continued monitoring of road and traffic conditions, while being relieved of the need to make minor steering adjustments to stay between the marked lane lines and minor accelerator and brake adjustments to stay behind the moving car in front) is clearly just confounding for some people.
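For anyone who does find the concept fuzzy, here is a minimal Python sketch of that car-following-plus-lane-keeping loop: small proportional corrections for lateral offset and for the gap to the car ahead, with everything else left to the human. All gains, setpoints, and sensor values are hypothetical and invented for illustration; this is not Tesla's (or anyone's) actual controller.

```python
# Minimal sketch of a driver-assist step: lane keeping handles the minor
# steering corrections; car following handles the minor accel/brake
# corrections. The human still monitors everything. All numbers invented.

from dataclasses import dataclass


@dataclass
class SensorFrame:
    lane_offset_m: float   # lateral distance from lane center (+ = right)
    gap_to_lead_m: float   # distance to the car ahead
    lead_speed_mps: float  # lead car's speed
    own_speed_mps: float   # our speed


def assist_step(s: SensorFrame, desired_gap_m: float = 30.0):
    # Minor steering adjustment: nudge back toward the lane center.
    steer_cmd = -0.3 * s.lane_offset_m

    # Minor accel/brake adjustment: hold the following gap and match speed.
    accel_cmd = (0.05 * (s.gap_to_lead_m - desired_gap_m)
                 + 0.2 * (s.lead_speed_mps - s.own_speed_mps))

    return steer_cmd, accel_cmd


frame = SensorFrame(lane_offset_m=0.4, gap_to_lead_m=25.0,
                    lead_speed_mps=28.0, own_speed_mps=30.0)
print(assist_step(frame))  # small corrections only; the driver watches the road
```

Nothing in that loop detects a stationary barrier or decides whether the "lane" it is centering in is a real lane, which is exactly why the monitoring stays with the driver.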
 
Tesla stated facts.

Everyone else is applying interpretation.

I am grateful Tesla released these facts as promptly as they did, as I now factor this into my use of my vehicle.
In this regard Tesla have made my vehicle safer in a very timely manner.
As I have said before, the complete analysis can properly come in the fullness of time.

For the life of me I have no idea why NTSB threw their toys out of the pram over this.
But I can certainly see why Tesla might be aggrieved at their actions and perceive an agenda.
I would much rather they both patched up their differences and moved forward constructively together, which would ultimately end up with the best result for all.
 
...The statements were clearly an attempt to cast all blame on the driver...

I am fine with that, if that is all owners got from the message: if owners use Autopilot and something goes wrong, the owners are at fault.

That is how it is currently designed. If owners like it, buy it. If not, save your money, don't buy it and save your life as well as the honor of your name!

Some people read owners manual, some don't.

Some people know how to follow instructions, and some are just not that good at being told.

It's similar to aviation autopilot: whenever there's a plane crash while flying on autopilot, the government system doesn't leave the dead human pilots alone.

In aviation, investigation after investigation has so far concluded that the human pilots were at fault even when the chain of events was clearly initiated by an autopilot hardware component. The chain of deadly events for Air France Flight 447 in 2009 began with failed airspeed indicators, or pitot (pronounced "pea toh") tubes.

And if people keep wondering why the name "autopilot"? That's what it means: whenever hardware/software fails in aviation autopilot, human operators have historically been blamed, right up to today.
 
Revise it to say, for others, "it works as designed, so the driver needs to always be ready to step in" -- as stated in the message from Tesla to the driver every time AP is engaged!

Are you saying AP worked as designed in this case? AP is designed to keep you in your lane. In this situation, it failed. In fact, it seems that it actively took the car out of its lane and drove it into a barrier.

Now, are there limitations to how the system works? Obviously. Sun blocking the cameras, poor lane markings, other cars, etc., will affect performance.
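One way to picture how "poor lane markings" can become an active steering mistake rather than a passive failure: a naive tracker that simply follows whichever candidate line it detects most confidently will follow the wrong one when the real line fades. A toy Python sketch with made-up confidence numbers; no claim that this is how Tesla's system actually works.

```python
# Toy model of a naive lane-line tracker choosing among detected candidates.
# Confidence scores are invented; this is not Tesla's actual logic.

def pick_line_to_follow(candidates: dict) -> str:
    """Follow whichever candidate line was detected most confidently."""
    return max(candidates, key=candidates.get)


# Fresh paint everywhere: the true lane line wins.
print(pick_line_to_follow({"lane line": 0.9, "gore-area edge line": 0.6}))

# The lane line has faded while the gore edge is still crisp: the same
# rule now tracks the wrong line, steering the car out of its lane.
print(pick_line_to_follow({"lane line": 0.3, "gore-area edge line": 0.6}))
```

Same code, same "design", opposite outcomes, which is why "worked as designed" and "kept the car in its lane" are not the same claim.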
 
Yes. Exactly: the driver needs to be ready to step in.

Except revise that to say, in your case, "it can't be trusted to work as I imagine it to work, so the driver needs to always be ready to step in."

Revise it to say, for others, "it works as designed, so the driver needs to always be ready to step in" -- as stated in the message from Tesla to the driver every time AP is engaged!

And it's funny that you seem to treat "so the driver needs to always be ready to step in" as the mic-drop point of your argument.

The most persuasive point you are making is that Tesla AP is just not right for people unable or unwilling to understand how to use tools within their specified functionality.

I guess some people are destined to never find utility in Tesla AP, and certainly not in any other sub-L4/L5 driver-assistance tool.

I'm willing to cut Tesla a little more slack after the 10.4 release, but suggesting that "be[ing] ready to step in" was sufficient attention in prior releases doesn't begin to describe the situation that most AP2 drivers experienced. I likened it to having a toddler riding in your lap holding the steering wheel. And by that I mean it wasn't normal alertness that was required. It was a much higher level of attention with the assumption that AP2 or a toddler could do something unexpected at any moment that could actually kill you.
 
...Are you saying AP worked as designed in this case?...

Yes!

The answer is: "The software and hardware performed within specs and as designed."

The design and the goal are two different things.

The goal is perfection, while the design is whatever engineers are capable of delivering with the human hours they put into the work, incrementally achieving one competency after another.

Remember when I bought my Autopilot and it only worked up to 45 MPH? What if I had been killed for driving on Autopilot too slowly in a 70 MPH zone?

Yes, it was designed to be capped at 45 MPH, and there's no question about it!

But is that the goal? No! The goal is not to let me be run over by speeding cars and killed! The goal was to raise the cap later on, but that wasn't achievable at the time!

Those who bought Autopilot in October 2016 got no Autopilot function at all! Nothing! Was that the design?

Yes! That was the maximum human engineers could deliver at the time for the money owners spent on Autopilot: NOTHING!

But was that the goal? No. The goal was to let owners actually enjoy the Autopilot function, but that had to come later; it wasn't achievable then.

The same applies to this case. The goal is to keep the driver alive. But the design needs constant updates on the way to perfection and to getting out of beta quality in the future.
 
It's similar to aviation autopilot: whenever there's a plane crash while flying on autopilot, the government system doesn't leave the dead human pilots alone.

There are a couple of glaring differences from the aviation situation. First, pilots are actually trained on autopilot. Second, most pilots using autopilot would qualify as professional drivers who are being paid. Both of those legally place a much higher standard of care on the pilot. Then we have Tesla owners who thought they bought a safe car and instead get a new "Autopilot surprise" or two every time a software update is released, with zero documentation, much less training.