Welcome to Tesla Motors Club

Another Model X crash, driver says autopilot was engaged

Maybe not.

The EU has certainly perked up after this latest accident: German authority would not have approved beta-phase Tesla autopilot

That's complicated because of this part of it.

"A beta version generally describes a product that has moved from mere functional readiness but still requires improvements for full usability."

The lane-steering component of AP is really what's in beta. But as it stands now, that component works better than any German system on the market.

At the same time, the fleet-learning aspect of AP is receiving over-the-air updates, not just through the official firmware releases but independently of those through tile updates.

So what do you call a system that is continually improved over time?

We're also entering a time when even cars other than Teslas have over-the-air updates. It's not a matter of if, but a matter of when, other car manufacturers will start to release software that is continually updated. Outside of the automotive market we already live in a time of perpetual beta, where manufacturers release something before the software has its entire feature set.

Also keep in mind that a long time passed between Autopilot hardware first appearing on cars and the first version of Autopilot actually shipping. It took almost a year, because Tesla was busy testing the feature with its beta testers. So it's not as if Tesla failed to do a heck of a lot of testing before releasing it to the masses.
 
When it's all said and done, I would argue that for a lane-steering system to be effective it has to be in continual beta throughout its entire lifespan. That's just the nature of the beast.

The roads change constantly, and there is no way to independently test and validate every update, especially when the updates touch any kind of deep-learning system.

It would be like saying a human being isn't always in beta. I don't know about anyone else, but I'm still in beta.
 
After 6 months with AutoPilot, I can live without it when it acts up. I just take over and drive normally to keep the vehicle centered in the lane. I look forward to the improvements that Model 3 will bring, hopefully without being a beta tester.
Thanks for being an example of an adult in this forum of kids.

Kids... did you hear that? You don't have to use it if it's acting up, or if you don't want to use it.
 
What model X are you referring to? I didn't see one.
As I said, this video is about a VW Jetta in a situation similar to the one the Model X of this thread was in (hitting a concrete divider), according to the description of the accident. Many were pointing out how difficult it is to flip the X, so I thought it might help to see how it may have happened.

Obviously in the case of the X accident that started this thread there was no road-rage involved.
 
The former one quoted is the most recent, from today.
Yes, and it should be noted that the Fortune article does include a link to a story reporting the same lack of an investigation. Correct me if I'm wrong, though: the 7/10 Fortune article does eventually seem to grudgingly note that the KBA just doesn't like the term "beta". In light of reports that other (German) products currently on the market perform even worse, I don't know that the article actually adds anything (except to Fortune's page count).

And no - I don't have AP (see my sig, below), and yes - I'm venting because of the frustration of seeing so much FUD here on TMC of all places.

Feh. Makes me want to tell all you kids to get off my lawn....
 
How come you are not scared off by the poorly implemented system from Mercedes?

Oh, I know. It is so crappy that no one is using it.

All the systems should be under scrutiny. I'm not arguing one system is better than the rest. They are all a distraction for the driver and a potential hazard to others on the road. GM has the right approach: don't put your customers, others on the road, and shareholders at risk with a beta test.

NHTSA, the NTSB, and the DOT should be all over this to determine whether any of these systems enhance safety or increase distraction.
 
We don't have enough data to make any statistically valid statements on the safety of AutoPilot just yet.

I don't think that's true. It's hard to define the exact number of trials, but if you place a trial at every n miles, you have a lot of trials, so the law of large numbers rapidly tightens the margins of error for both the autopilot and non-autopilot sample sets.
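To make the law-of-large-numbers point concrete, here is a back-of-the-envelope sketch (my own, not from the thread), assuming a Poisson model for fatalities and a hypothetical rate of 1 per 90 million miles. It shows how the relative margin of error on an estimated fatality rate narrows as fleet mileage accumulates:

```python
import math

# Hypothetical fatality rate: 1 per 90 million miles (roughly the US average).
rate = 1 / 90e6

# Under a Poisson model, the relative margin of error on the estimated
# rate shrinks like 1/sqrt(expected fatalities), i.e. 1/sqrt(miles).
for miles in (100e6, 1e9, 10e9):
    expected = rate * miles                 # expected fatalities (Poisson mean)
    rel_moe = 1.96 / math.sqrt(expected)    # approx. 95% relative margin of error
    print(f"{miles/1e6:>6.0f}M miles: ~{expected:.1f} expected fatalities, "
          f"relative 95% margin of error ~{rel_moe:.0%}")
```

Note that at around 100 million miles the margin is still very wide; it takes billions of miles before the rate estimate becomes tight.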
 
We don't have enough data to make any statistically valid statements on the safety of AutoPilot just yet.

I don't think that's true. It's hard to define the exact number of trials, but if you place a trial at every n miles, you have a lot of trials, so the law of large numbers rapidly tightens the margins of error for both the autopilot and non-autopilot sample sets.

I think vandacca was saying that WE don't have the data. Maybe Tesla does, but it isn't sharing data granular enough for a proper statistical comparison. For example, Elon actually used worldwide traffic-accident statistics versus (mostly US) Tesla AP driving death rates to make a point, and I would not call that statistically valid.

To do a proper comparison, you have to know when AP was last on before a Tesla crash; if AP was turned off one second before a crash, that really should count as an AP crash, not as a non-AP crash.

You also have to know what kinds of roads AP crashes occurred on, since, per mile driven, freeway fatalities in the US occur at almost half the rate of fatalities on other roads. Since AP is used much more often on freeways, that alone would badly skew a comparison.
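The two adjustments described above (attributing crashes where AP disengaged moments before impact, and stratifying by road type) could be sketched like this. The record fields and the one-second cutoff are hypothetical, since Tesla's actual logs are not public:

```python
from dataclasses import dataclass
from typing import Optional

AP_CUTOFF_S = 1.0  # disengagement within 1 s of impact still counts as AP

@dataclass
class Crash:
    seconds_since_ap_off: Optional[float]  # None = AP never engaged that trip
    road_type: str                         # "freeway" or "other"

def is_ap_crash(c: Crash) -> bool:
    """Attribute a crash to AP if it was on at, or just before, impact."""
    return (c.seconds_since_ap_off is not None
            and c.seconds_since_ap_off <= AP_CUTOFF_S)

# Stratify a toy sample by road type, so that (mostly freeway) AP miles
# are not compared against a mix dominated by riskier non-freeway miles.
crashes = [Crash(0.0, "freeway"), Crash(0.8, "freeway"),
           Crash(30.0, "other"), Crash(None, "other")]
ap_freeway = sum(1 for c in crashes
                 if is_ap_crash(c) and c.road_type == "freeway")
print(ap_freeway)  # 2
```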

There may be other things to consider as well. For someone as smart as Elon, I am very disappointed in his "analysis" sound bites that he has offered so far. It isn't rigorous, and it is as misleading as any other car maker's marketing department has been, which is a very low bar.
 
I don't think that's true. It's hard to define the exact number of trials, but if you place a trial at every n miles, you have a lot of trials, so the law of large numbers rapidly tightens the margins of error for both the autopilot and non-autopilot sample sets.

To elaborate on this, even though "one data point" is not enough to pin down the exact objective risk of AP (fatality-wise), it can still be used to establish upper and lower bounds on that risk. For instance, if the objective risk of a technology were one fatality per 10 million miles, then the probability of making it 130 million miles with just one fatality would be vanishingly small (roughly 0.003% under a simple Poisson model). So one can reasonably conclude that AP is significantly safer than that. On the flip side, suppose Elon claimed that the objective risk of AP was one fatality per 10 billion miles. The chance that the first fatality would happen this soon would then be only about 1%, indicating that this is also likely an inaccurate estimate. So even one data point provides meaningful upper and lower bounds. But is the objective risk less than one fatality per 90 million miles (the US average)? Too soon to tell. Once AP has a billion miles under its belt, we should be able to say one way or the other with considerably higher confidence.
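The bounds above can be reproduced with a simple Poisson model (my assumption; the post does not state how its figures were computed):

```python
from math import exp

def p_at_most_one_fatality(risk_per_mile, miles):
    """Poisson probability of seeing at most one fatality over `miles`."""
    lam = risk_per_mile * miles      # expected number of fatalities
    return exp(-lam) * (1 + lam)     # P(X=0) + P(X=1)

miles = 130e6  # AP miles driven before the first reported fatality

# If the true risk were 1 per 10 million miles (expected ~13 deaths),
# seeing at most one would be vanishingly unlikely:
print(p_at_most_one_fatality(1 / 10e6, miles))  # ~3.2e-05, i.e. ~0.003%

# If the true risk were 1 per 10 billion miles, the chance of the first
# fatality arriving this soon is also small:
print(1 - exp(-(1 / 10e9) * miles))             # ~0.013, i.e. ~1.3%
```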
 