I find it odd that some folks somehow expect AP to "not crash" when it is telling you to take the wheel. It displays the warnings because it cannot drive for you; therefore if there is a crash it is still your fault for not taking control. Even if AP is handling steering and TACC, you can still override it without specifically turning it off, just by turning the wheel or pressing the brake. At no point do you relinquish responsibility for the safe operation of your car. So all of these "AP did this to me" stories are just admissions that the driver wasn't paying attention or taking responsibility for their own actions.
While that may be a valid point in general, in this specific case there was no need for the owner to even be proactive, because the car was repeatedly, and with increasing urgency, warning him to take control. How far do you want to go to compensate for drivers not driving their damned car?
Pang, you should not have used AP on a narrow and curvy mountain road like that. That's not what it's currently designed to do. In such situations you should drive manually. This is pretty clearly conveyed in various warnings.
The motor certainly can be running without the wheels turning. It only requires that something be broken in the reduction gear, or that the motor shaft be broken, both of which are plausible in an accident.
Its not "buried" in the fine print--its clearly displayed on the IC every time you engage AP I hope they do not--drivers are supposed to be active participants at all time. When AP encounters something it can't handle, it tells you now, by asking to you take control of the car.
I think these are near the same place, for anyone who wants to look at Google Street View: Google Maps. Notice the arrow marker on the right warning of a leftward curve ahead. And a bit further down the road: Google Maps.
Agree, but I would be surprised in this case: the impact was on the front corner on the passenger side. The front passenger wheel was broken off, so perhaps the front driver's-side driveshaft was also damaged, but I don't see how the rear drive unit and drive shafts were damaged in this manner.
Looks like the driver didn't take control of the steering wheel or hit the brake even after the car hit the posts, and he ran away with the motor still running? Considering it was dark in the middle of the night, the driver probably wasn't even aware of what happened after the accident. The driver was so inattentive that the instinct to grab the steering wheel didn't kick in. By now it's quite clear what happened: Autopilot has limitations on hilly roads, so the driver has to pay attention. I do notice that Tesla does not seem to do much post-crash. I'm not even sure what Tesla should do in those unfortunate cases when the driver is unable to react.
Let's see here, you'll buy another Tesla if they do what? Make it so that you can use it in a manner and situation for which it wasn't intended or designed... I hope you don't buy another Tesla, or a chainsaw, or a chop saw. Sounds like the only thing you'd be safe with is a plastic butter knife. Yes, life is precious; you should be thankful you were driving the safest car on the planet, albeit in the reckless way you were.
Some of these AP experience stories sound so reckless. When I first tried out AP, I was overly cautious about what I would trust the car to do. After a week, I saw that it was good on Interstate roads. Week 2: I tried curvy rural roads with hills and immediately realized that we are not there yet by a long shot, after it completely disabled Autopilot with loud beeps over a really typical hill. It's definitely not ready for prime time in those conditions. I don't see how anyone who experiences that for even one mile could possibly engage AP on that kind of road. The OP's description of 600 miles with no problem on interstates and then a crash on a curvy road is exactly what I'd expect from this generation of car. I didn't need Tesla to tell me that; my own self-preservation instinct did. Not everyone should be a beta tester.
Sorry Pang. When I see that graded, curved road with a tight lane, I cannot see how anyone would trust AP, especially at night! Like Amallah says: not everyone should be a beta tester.
This is a sentiment I share, to an extent. Beta testers, however you define 'beta', should have an elementary understanding of how the system works; otherwise the system could learn from bad input data. Or maybe Tesla could build an infinite number of cars and give them to the same number of monkeys. We'd have infallible Autopilot in no time...
Let me posit a thought experiment: assume, for a moment, that Autopilot *is* beta software (it is), and it has bugs (I guarantee you, as a software engineering veteran, that it has hundreds, if not thousands, of bugs): is it not possible that the OP is stating exactly what happened? If the Autopilot got stuck in a loop, or had a memory error (say, from a random alpha particle strike that clobbered just a few more bits than the ECC can handle), or hit any other of a practically infinite set of failure modes the brilliant programmers at Tesla didn't handle, then it is entirely plausible that his car jerked to the side, and entirely plausible that it happened without warnings and too quickly to react to. It's even plausible that his motor kept running after the accident. It's a software-driven car; all bets are off when you hit a major bug.

I'm a fan, and a believer: autonomous driving will come, sooner than we expect. But it will have bugs, especially as Tesla (and society in general) moves towards stochastic AI-based programming models (meaning: a) we don't know how the code actually works, and b) it's *stochastic*, which is a fancy word for probabilistic, which means it usually works; you can get close to working 100% of the time, but you can't quite get there).

I really worry about Tesla's current approach to handling these situations: "we looked at the telemetry, and it proves X". If there is a bug, then the telemetry can't be trusted. When Tesla says "look, the telemetry shows the driver wasn't steering", that just means that's what the car thought the driver was doing, not what was actually happening. When we, as a community, jump to defend Tesla, it's understandable (we love that company!), but sometimes we should pause and consider that maybe there *is* something there.
So, all of us with Autopilot know that sometimes it swerves to the left or right. It is getting better, by a LOT, but you still sometimes see it swerve. I think the obvious responsibility rests on us, the drivers: knowing that such swerves can happen, keep your damn hands on the wheel. Simple as that. And don't facetweet while driving either. Personally speaking, I sometimes disable Autopilot and drive manually precisely because on certain roads I don't have 100% confidence in AP's abilities. But on some roads, especially well-marked freeways, it's really awesome too.
The question isn't "could Tesla software contain a bug?" -- of course it could, and, as you suggest, almost certainly does. The question is: what is the probability of all of the numerous, separate bugs that would be required to explain the reported behavior manifesting simultaneously and only in this one driver's vehicle? It'd be one thing if you were saying "We know that Bug A has been observed under circumstance X, and Bug B under circumstance Y, so couldn't it be that this was some interaction of Bugs A and B in the unfortunate combination of circumstances X and Y?" But there are no such known "building blocks" from which to construct the necessary complex failure.
I am curious about the timing between the car telling him to hold the steering wheel and the time of impact. Tesla was a bit ambiguous on this; they could have been perfectly truthful in what they said while, at the same time, there was only a second or two between that warning and the impact. This is why we keep our hands on the wheel, of course.
Agree, timing is important. Even if your hands are on the wheel, and the car swerves, you may not have enough time at freeway speeds for your brain to instruct your muscles to take control.
In reading the OP's letter, it didn't sound like the driver was even remotely ready to take any control of the vehicle. The whole "I can't believe it didn't try to stop once it started hitting the wooden posts" line says it all. If he had been ready to take control at any moment, he wouldn't have gotten to the point of finding out whether Autopilot would have stopped or not.