"AP is restricted to speed limit + 5 mph."

AP is restricted to 50 mph max on those roads, from what I can tell.
"I hope Tesla can provide a list of AP's weak spots so owners can be more proactive during those situations."

While that may be a valid point in general, in this specific case there was no need for the owner to be proactive because the car was ...
The wheels are direct drive through a single gear ratio transmission.
There is no way for the motors to be running without the wheels turning.
It isn't like a gas car, where the engine can be revving while the car is in park or neutral.
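A quick back-of-the-envelope in Python shows how rigid that coupling is; the ~9.73:1 single-speed reduction ratio and 0.35 m effective tire radius below are ballpark assumptions on my part, not official specs:

```python
import math

# Ballpark assumptions, not official specs:
REDUCTION_RATIO = 9.73   # single-speed gear ratio (motor revs per wheel rev)
WHEEL_RADIUS_M = 0.35    # effective tire radius in meters

def road_speed_kmh(motor_rpm: float) -> float:
    """Road speed implied by a given motor RPM through the fixed gear."""
    wheel_rpm = motor_rpm / REDUCTION_RATIO
    circumference_m = 2 * math.pi * WHEEL_RADIUS_M
    return wheel_rpm * circumference_m * 60 / 1000

# With a single fixed ratio, a spinning motor implies spinning wheels:
print(f"{road_speed_kmh(5000):.0f} km/h")  # ~68 km/h at 5000 motor RPM
```

Unless something in that gear train physically breaks, motor speed and wheel speed are locked together.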
Tesla's lack of education on the limits of AP is the problem: all they tell the public is what AP can do, with fine print saying it's a beta version and you are 100% responsible for taking control of your car at any time.
I hope Tesla can provide a list of AP's weak spots so owners can be more proactive during those situations.
...Not sure exactly where this happened, but it seems the below is a likely spot given the clues above...
The motor certainly can be running without the wheels turning. It only requires that something is broken in the reduction gear, or that the motor shaft is broken -- something that is plausible in an accident.
4. CNN's later article about the accident was quoting out of context of our ...

Either Tesla or I should be responsible for the accident. I might consider buying another Tesla only if they can iron out the instability problems of their system.
As a survivor of such a bad accident, and a past fan of Tesla technology, I now realize that life is the most precious fortune in this world. Any advance in technology should be based on the prerequisite of protecting life to the maximum extent. In the face of life and death, no technology has the right to ignore life; any pursuit of or dream about technology should first show respect for life. For the sake of the safety of all Tesla drivers and passengers, and all other people sharing the road, Mr. Musk should stand up as a man, face up to the challenge of thoroughly investigating the cause of the accident, and take responsibility for the mistakes of the Tesla product. We are willing to talk to you publicly, face to face, anytime, to give you all the details of what happened. Mr. Musk, you should immediately stop trying to cover up the problems of the Tesla Autopilot system and blaming the consumers.
Not everyone should be a beta tester.
The question isn't "could Tesla software contain a bug?" -- of course it could, and, as you suggest, almost certainly does. The ...

Let me posit a thought experiment: assume, for a moment, that Autopilot *is* beta software (it is), and that it has bugs (I guarantee you, as a software engineering veteran, that it has hundreds, if not thousands, of bugs): is it not possible that the OP is stating exactly what happened?
If the autopilot got stuck in a loop, or had a memory error (say, from a random alpha particle strike that clobbered just a few more bits than the ECC can handle), or any other of a practically infinite set of failure modes the brilliant programmers at Tesla didn't handle, then it is entirely plausible that his car jerked to the side, and entirely plausible that it happened without warnings and too quickly to react to. It's even plausible that his motor kept running after the accident. It's a software-driven car; all bets are off when you hit a major bug.
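To make the bit-flip scenario concrete, here's a toy sketch (purely illustrative; this is not Tesla's data layout or code) of how one corrupted bit in a 32-bit float can turn a gentle steering value into garbage:

```python
import struct

def flip_bit(value: float, bit: int) -> float:
    """Return `value` with one bit of its 32-bit float encoding flipped."""
    (as_int,) = struct.unpack("<I", struct.pack("<f", value))
    (corrupted,) = struct.unpack("<f", struct.pack("<I", as_int ^ (1 << bit)))
    return corrupted

steering_deg = 0.5                    # intended gentle correction
garbage = flip_bit(steering_deg, 30)  # one flipped bit in the exponent field
print(steering_deg, "->", garbage)    # 0.5 -> ~1.7e+38
```

If nothing downstream sanity-checks a value like that, all bets really are off.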
I'm a fan, and a believer: autonomous driving will come, sooner than we expect. But it will have bugs, especially as Tesla (and society in general) moves toward stochastic AI-based programming models (meaning: (a) we don't know how the code actually works, and (b) it's *stochastic*, which is a fancy word for probabilistic, which means it usually works -- you can get close to working 100% of the time, but you can't quite get there).
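Some toy arithmetic shows why "usually works" matters at driving timescales; the per-decision accuracy and decision rate here are made-up numbers for illustration:

```python
# Made-up numbers for illustration only.
per_decision_accuracy = 0.99999   # "usually works"
decisions_per_second = 30         # assumed perception/control loop rate

decisions_per_hour = decisions_per_second * 3600
p_error_free_hour = per_decision_accuracy ** decisions_per_hour
print(f"P(at least one wrong decision in an hour) = {1 - p_error_free_hour:.0%}")
# ~66% with these assumptions: close to 100%, but not quite there
```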
I really worry about Tesla's current approach to handling these situations: "we looked at the telemetry, and it proves X". If there is a bug, then the telemetry can't be trusted. When Tesla says "look, the telemetry shows the driver wasn't steering", that just means that's what the car *thought* the driver was doing, not what was actually happening.
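A minimal sketch of that problem, with entirely hypothetical names (this is not Tesla's telemetry schema): if the logger records the commanded value rather than reading back the actuator, the telemetry looks benign even when the hardware misbehaved:

```python
# Hypothetical names throughout; not Tesla's actual telemetry schema.

class SteeringActuator:
    """Stand-in for the hardware the software thinks it controls."""
    def __init__(self) -> None:
        self.actual_angle_deg = 0.0

    def apply(self, commanded_deg: float) -> None:
        # Hypothetical fault: the hardware does not do what it was told,
        # and nothing reads the real position back.
        self.actual_angle_deg = commanded_deg + 15.0

actuator = SteeringActuator()
telemetry = []

commanded = 0.5
actuator.apply(commanded)
telemetry.append({"steering_cmd_deg": commanded})  # records intent, not reality

print(telemetry)                  # [{'steering_cmd_deg': 0.5}] -- looks benign
print(actuator.actual_angle_deg)  # 15.5 -- what actually happened
```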
When we, as a community, jump to defend Tesla, it's understandable (we love that company!), but sometimes we should pause and consider that maybe there *is* something there.
I am curious about the timing between the car telling him to hold the steering wheel and the time of impact. Tesla was a bit ambiguous on this, and they could have been perfectly truthful in what they said while there was still only a second or two between that warning and the impact.
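Rough arithmetic on what a one-to-two-second gap means; the 60 mph speed and 1-1.5 s reaction time are typical assumed values, not data from this case:

```python
# Assumed typical values, not data from this case.
speed_mph = 60
speed_mps = speed_mph * 1609.344 / 3600   # ~26.8 m/s

for gap_s in (1.0, 2.0):
    print(f"{gap_s:.0f} s warning-to-impact = {speed_mps * gap_s:.0f} m of travel")
# Against a typical 1-1.5 s perception-reaction time, a 1-2 s warning
# leaves little or no margin to perceive, decide, and act.
```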
This is why we keep our hands on the wheel of course.