
A Public Letter to Mr. Musk and Tesla For The Sake Of All Tesla Drivers' Safety

I find it odd that some folks somehow expect AP to "not crash" when it is telling you to take the wheel.
It displays the warnings because it cannot drive for you - therefore if there is a crash it is still your fault for not taking control.
Even if AP is handling steering and TACC, you can still override without specifically turning it off just by turning the wheel or pressing the brake.
At no point do you relinquish responsibility for the safe operation of your car.
So all of these "AP did this to me" stories are just admitting that the driver wasn't paying attention or taking responsibility for their own actions.
 
I hope Tesla can provide a list of AP's weak spots so owners can be more proactive during those situations.
While that may be a valid point in general, in this specific case there was no need for the owner to be proactive, because the car was repeatedly and with increasing urgency warning him to take control. How far do you want to go to compensate for drivers not driving their damned car?
 
The wheels are direct drive through a single-gear-ratio transmission.

There is no way for the motors to be running without the wheels turning.

It isn't like a gas car, where the engine can be revving while the car is in park or neutral.
The motor certainly can be running without the wheels turning. It only requires that something is broken in the reduction gear, or that the motor shaft is broken. Something that is plausible in an accident.
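
Just to put rough numbers on how tight that mechanical coupling is when nothing is broken (the ~9.7:1 reduction and ~0.70 m tire diameter below are ballpark assumptions, not official specs):

import math

reduction_ratio = 9.73      # assumed motor revs per wheel rev (single-speed gearbox)
tire_diameter_m = 0.70      # assumed rolling diameter

def wheel_speed_kph(motor_rpm):
    # With an intact reduction gear, wheel speed is locked to motor RPM
    wheel_rpm = motor_rpm / reduction_ratio
    return wheel_rpm * math.pi * tire_diameter_m * 60 / 1000

print(round(wheel_speed_kph(10000)))   # ~136 km/h at 10,000 motor RPM

So with the drivetrain intact there is no state where the motor spins and the wheels don't; something in that chain has to be physically broken for the two to decouple, which is exactly the disagreement here.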
 
Tesla doesn't do enough to educate people on the limits of AP; all they tell the public is what AP can do, with fine print saying it's a beta version and you are 100% responsible for taking control of your car at any time.

It's not "buried" in the fine print--it's clearly displayed on the IC every time you engage AP

I hope Tesla can provide a list of AP's weak spots so owners can be more proactive during those situations.

I hope they do not--drivers are supposed to be active participants at all times. When AP encounters something it can't handle, it already tells you, by asking you to take control of the car.
 
The motor certainly can be running without the wheels turning. It only requires that something is broken in the reduction gear, or that the motor shaft is broken. Something that is plausible in an accident.

Agree, but I would be surprised in this case--the impact was on the front corner on the passenger side. The front passenger wheel was broken off, so perhaps the front driver's-side driveshaft was also damaged, but I don't see how the rear drive unit and driveshafts were damaged in this manner.
 
Looks like the driver didn't take control of the steering wheel or hit the brake even after the car hit the posts, and he ran away with the motor still running? Considering it was dark in the middle of the night, the driver probably wasn't even aware of what happened after the accident. The driver was so inattentive that the instinct to grab the steering wheel didn't kick in. By now it's quite clear what happened: Autopilot has limitations on hilly roads, so the driver has to pay attention. I do notice that Tesla does not seem to do much post-crash. I'm not even sure what Tesla should do in those unfortunate cases where the driver is unable to react.
 
... ...
4. CNN's later article about the accident quoted us out of context regarding whether either Tesla or I should be responsible for the accident. I might consider buying another Tesla only if they can iron out the instability problems of their system.
As a survivor of such a bad accident, and a past fan of Tesla technology, I now realize that life is the most precious fortune in this world. Any advance in technology should be based on the prerequisite of protecting life to the maximum extent. In the face of life and death, no technology has the right to ignore life; any pursuit of or dream about technology should first show respect for life.

For the sake of the safety of all Tesla drivers and passengers, and all other people sharing the road, Mr. Musk should stand up as a man, face up to the challenge of thoroughly investigating the cause of the accident, and take responsibility for the mistakes of the Tesla product. We are willing to talk to you publicly, face to face, at any time, to give you all the details of what happened. Mr. Musk, you should immediately stop trying to cover up the problems of the Tesla Autopilot system and stop blaming the consumers.


Let's see here, you'll buy another Tesla if they do what? Make it so that you can use it in a manner and in situations for which it wasn't intended or designed...
I hope you don't buy another Tesla or a chainsaw or a chop saw. Sounds like the only thing you'd be safe with is a plastic butter knife.

Yes, life is precious; you should be thankful you were driving the safest car on the planet, despite the reckless way you were doing so.
 
Some of these AP experience stories sound so reckless. When I first tried out AP, I was overly cautious about what I would trust the car to do. After a week, I saw that it was good on Interstate roads. Week 2, I tried curvy rural roads with hills and immediately realized we are not there yet by a long shot, after it completely disabled Autopilot with loud beeps over a really typical hill. It's definitely not ready for prime time in those conditions. I don't see how anyone who experiences that even for one mile could possibly engage AP on that kind of road.

The OP's description of 600 miles with no problem on interstates and then a crash on a curvy road is exactly what I expect in this generation of car. I didn't need Tesla to tell me that; my own self-preservation instinct did. Not everyone should be a beta tester.
 
Not everyone should be a beta tester.

This is a sentiment I share to an extent. Beta testers, however you define 'beta', should have an elementary understanding of how the system works; otherwise the system could learn from bad input data.

Or maybe Tesla could build an infinite number of cars and give them to the same number of monkeys. We'd have infallible Autopilot in no time...
 
Let me posit a thought experiment: assume, for a moment, that Autopilot *is* beta software (it is), and that it has bugs (I guarantee you, as a software engineering veteran, that it has hundreds, if not thousands, of bugs): is it not possible that the OP is stating exactly what happened?

If the autopilot got stuck in a loop, or had a memory error (say, from a random alpha particle strike that clobbered just a few more bits than the ECC can handle), or any other of a practically infinite set of failure modes the brilliant programmers at Tesla didn't handle, then it is entirely plausible that his car jerked to the side, and entirely plausible that it happened without warnings and too quickly to react to. It's even plausible that his motor kept running after the accident. It's a software-driven car; all bets are off when you hit a major bug.
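
Purely as a toy illustration of the "clobbered bits" point (I have no idea how Tesla actually encodes steering commands; the value and bit positions below are made up), a single flipped bit in a floating-point command is enough to reverse it or turn it into garbage:

import struct

def flip_bit(value, bit):
    # Flip one bit in the IEEE-754 double representation of a float
    (bits,) = struct.unpack("<Q", struct.pack("<d", value))
    bits ^= 1 << bit
    (flipped,) = struct.unpack("<d", struct.pack("<Q", bits))
    return flipped

steer_deg = 1.5                  # hypothetical gentle steering command
print(flip_bit(steer_deg, 63))   # -1.5 -- sign bit flipped, now steering the other way
print(flip_bit(steer_deg, 62))   # nan  -- exponent bit flipped, command is garbage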

I'm a fan, and a believer: autonomous driving will come, sooner than we expect. But it will have bugs, especially as Tesla (and society in general) moves towards stochastic AI-based programming models (meaning: (a) we don't know how the code actually works, and (b) it's *stochastic*, which is a fancy word for probabilistic, which means it usually works -- you can get close to working 100% of the time, but you can't quite get there).
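
To make the "close to 100% but not quite" point concrete with made-up numbers (these rates are purely illustrative, not anything measured from Autopilot):

# Suppose a perception model is right 99.99% of the time per decision,
# and the car makes ~30 decisions per second.
per_decision_accuracy = 0.9999
decisions_per_hour = 30 * 3600                # 108,000

expected_errors = decisions_per_hour * (1 - per_decision_accuracy)
p_at_least_one = 1 - per_decision_accuracy ** decisions_per_hour

print(round(expected_errors, 1))   # ~10.8 wrong decisions per hour of driving
print(round(p_at_least_one, 5))    # ~0.99998 -- essentially certain to be wrong at least once

The rest of the system has to be built to tolerate individual wrong decisions; the open question is what happens when it isn't.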

I really worry about Tesla's current approach to handling these situations: "we looked at the telemetry, and it proves X". If there is a bug, then the telemetry can't be trusted. When Tesla says "look, the telemetry shows the driver wasn't steering", that just means that's what the car thought the driver was doing, not what was actually happening.

When we, as a community, jump to defend Tesla, it's understandable (we love that company!), but sometimes we should pause and consider that maybe there *is* something there.
 
So, all of us with Autopilot know that sometimes it swerves to the left or right. It is getting better, by a LOT, but still sometimes you see it swerve. I think the obvious responsibility rests on us, the drivers: knowing that such swerves can happen, keep your damn hands on the wheel. Simple as that. And don't facetweet while driving either.

Personally speaking, I sometimes disable autopilot and drive manually precisely because on certain roads, I don't have 100% confidence in AP's abilities. But then on some roads, especially well marked freeways, it's really awesome too.
 
Let me posit a thought experiment: assume, for a moment, that Autopilot *is* beta software (it is), and that it has bugs ... sometimes we should pause and consider that maybe there *is* something there.
The question isn't "could Tesla software contain a bug?" -- of course it could, and, as you suggest, almost certainly does. The question is: what is the probability of all of the numerous, separate bugs that would be required to explain the reported behavior manifesting simultaneously, and only in this one driver's vehicle? It'd be one thing if you were saying "We know that Bug A has been observed under circumstance X, and Bug B under circumstance Y, so couldn't it be that this was some interaction of Bugs A and B in the unfortunate combination of circumstances X and Y?" But there are no such known "building blocks" from which to construct the necessary complex failure.
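
Rough arithmetic behind that argument, using made-up per-drive rates purely to show how quickly independent failures compound:

# Hypothetical rates, not measurements:
p_steering_bug  = 1e-5   # a swerve-inducing bug fires on a given drive
p_warning_bug   = 1e-5   # the hands-on warnings also fail on that same drive
p_telemetry_bug = 1e-5   # the logs also record the wrong thing on that same drive

# If the failures are independent, the chance of all three hitting one drive:
p_all_three = p_steering_bug * p_warning_bug * p_telemetry_bug
print(p_all_three)   # ~1e-15 -- vastly less likely than any single bug alone

The counterpoint, of course, is that independence is itself an assumption: a single shared fault (say, corrupted memory feeding all three subsystems) would correlate the failures and make the combined event far more likely than this multiplication suggests.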
 
I am curious about the timing between the car telling him to hold the steering wheel and the moment of impact. Tesla was a bit ambiguous on this; they could perfectly well have been truthful in what they said while, at the same time, there was only a second or two between that warning and the impact.

This is why we keep our hands on the wheel of course.
 
I am curious about the timing between the car telling him to hold the steering wheel and the moment of impact. Tesla was a bit ambiguous on this; they could perfectly well have been truthful in what they said while, at the same time, there was only a second or two between that warning and the impact.

This is why we keep our hands on the wheel of course.

Agree, timing is important. Even if your hands are on the wheel, and the car swerves, you may not have enough time at freeway speeds for your brain to instruct your muscles to take control.
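
Some rough numbers on that (70 mph and a ~1.5 s reaction time are generic textbook-ish figures, not data from this incident):

speed_mph = 70
reaction_time_s = 1.5

speed_m_per_s = speed_mph * 1609.344 / 3600           # ~31.3 m/s
distance_before_reacting_m = speed_m_per_s * reaction_time_s

print(round(speed_m_per_s, 1))                # 31.3
print(round(distance_before_reacting_m, 1))   # ~46.9 m travelled before you even begin to correct

If the warning really came only a second or two before impact, even an attentive driver with hands on the wheel would have had very little margin.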