That's what I'm saying.... time to C-M-O! You should definitely cancel. Or, you know, you could just not use the autopilot feature if you don't trust it... nah, cancel.
Huh?
A newborn baby has all its hardware, but isn't very functional yet.
A new PC with a fresh hard drive has all its hardware, but doesn't do anything but POST.
Processors are only as useful as the code running on them. If the AP2 HW needs more power, Tesla will upgrade the G/CPUs.
The AP software is a vehicle follower and a line follower. If the vehicle ahead does something wrong, the only check is the lines. If the lines are also wrong, there is no fallback. That is the way it works. People can process more (when paying attention); that's why AP is currently only an assist function.
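Here's a minimal sketch of that two-cue logic in Python, just to make the failure mode concrete. This is not Tesla's code; every name and threshold is invented for illustration:

```python
# Hypothetical sketch of the two-cue logic described above. Not Tesla's
# code; names and the threshold are invented. Paths are simplified to
# lateral offsets (floats) from the car's current heading.

DIVERGENCE_LIMIT_M = 0.5  # made-up disagreement threshold, meters

def diverges(path_a, path_b):
    """True if the two candidate paths disagree by more than the limit."""
    return abs(path_a - path_b) > DIVERGENCE_LIMIT_M

def pick_path(lead_car_path=None, lane_line_path=None):
    """Choose what to steer toward, given the only two cues available."""
    if lead_car_path is not None and lane_line_path is not None:
        # The lines are the only sanity check on the lead car. If the
        # paint itself is wrong, the "check" steers you wrong too.
        return lane_line_path if diverges(lead_car_path, lane_line_path) else lead_car_path
    if lead_car_path is not None:
        return lead_car_path
    if lane_line_path is not None:
        return lane_line_path
    return None  # no cue at all: alert the driver to take over
```

Note what's missing: nothing in there knows where the lane actually is. When the lead car leaves and the remaining line is the wrong one, it happily returns the wrong line. The driver is the only check left.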
"Tesla sped up before hitting road barrier"
"A navigation mistake by Autopilot..."
"the Tesla began a left steering movement."
The NTSB report and the coverage following it are really just awful investigating and awful reporting.
Because there is no mention at all of why these things happened.
Poorly maintained and missing lines caused the car to follow a line that led away from the lane.
All traffic-aware cruise controls speed up to the desired speed when the car in front is no longer in the way (see the sketch after this post).
There was no "mistake by Autopilot"; there was only a mistake by the Driver.
I mean, NTSB, at least try to inform the public as to what is going on, and why these things are happening. There are reasons.
Ridiculous.
But regardless.... DRIVER NOT PAYING ATTENTION... should be the only headline.
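For the "sped up" point specifically, the behavior is generic to traffic-aware cruise control, not something unique to Tesla. A rough Python sketch, with invented names:

```python
# Generic TACC speed selection, sketched for illustration. Invented
# names; this is common cruise-control behavior, not Tesla's code.

def target_speed(set_speed, lead_car_speed=None):
    """Speed the system aims for (units are whatever you feed in)."""
    if lead_car_speed is None:
        # No car detected ahead (it changed lanes, or the lane diverged
        # out from under the system): resume the driver's set speed.
        return set_speed
    # Car ahead: match it, but never exceed the driver's set speed.
    return min(set_speed, lead_car_speed)

print(target_speed(set_speed=75, lead_car_speed=62))  # -> 62, following
print(target_speed(set_speed=75))                     # -> 75, "sped up"
```

"The Tesla sped up" is just the first branch firing: the car in front was no longer in the way, so the system resumed the set speed.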
Like @Ugliest1 said, there are no warning chevrons. The Tesla is now driving over cross-hatches indicating 'do not drive here' to humans, but not to radar, and finally a crushed barrier.
With no time for a radar warning beep and no time to apply automatic brakes, there is no pre-crash braking.
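Some back-of-envelope numbers on the "no time" point. The 4-second figure comes up later in this thread; the speed, reaction time, and braking values below are assumptions for illustration, not figures from the report:

```python
# Rough check on whether 4 seconds leaves room for a warning plus a stop.
# All values except the 4 s window are assumed for illustration.

MPH_TO_MS = 0.44704

speed = 70 * MPH_TO_MS                        # ~31.3 m/s, assumed speed
time_to_barrier = 4.0                         # seconds, per the thread
distance_available = speed * time_to_barrier  # ~125 m

reaction_time = 1.5   # typical human perception-reaction time, seconds
decel = 7.0           # hard braking on dry pavement, m/s^2 (assumed)
distance_to_stop = speed * reaction_time + speed**2 / (2 * decel)

print(f"available: {distance_available:.0f} m, "
      f"needed to stop: {distance_to_stop:.0f} m")
# ~125 m available vs ~117 m needed: essentially no margin for a late
# warning beep, let alone automatic braking after the swerve began.
```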
All of the statements in the report are factual. Your logic, however, is poor.
The fact that the driver made a mistake, or that the roads were poorly maintained or designed, does not excuse the mistake that AP made. Whether or not the driver was paying attention does not make the Autopilot mistake any more or less acceptable. The reason is not very important here. All mistakes happen for a reason, sometimes good reasons, but they're still mistakes.
The driver died because he had not corrected AP's mistake. If the driver had corrected the AP mistake, he would have lived. But AP still did make the mistake, and the driver's actions do not change this. See the logic?
And I should note: AP is designed for use on real roads with real-world conditions, real weather, real cars and real drivers, right? All of the excuses you came up with to explain AP's problem are all too common in the real world, including distracted drivers. There is no road ahead (pun intended) for EAP/FSD/whatever if Tesla does not eventually solve these problems. And it needs to do so fast, before more people get impatient or die.
I am not surprised that the secondary media started to jump to a bunch of speculative conclusions.
Exactly.
That is why a report like this has to be very specific. You just can't say "the car sped up into the wall." It makes it seem horrible when really it is mundane behavior that all traffic-aware cruise control systems share. You can't just say "the car turned left" when really it was just the car following the line it had, because the real one was horribly maintained and gone.
Just simply a very misleading report, because they left out the truly important parts: the normal reasons why these mechanical things were happening.
I wonder if he froze up. I have a hard time understanding him not looking up when the car accelerated or when it made the “left steering input”. He hadn’t owned the car long enough to be THAT complacent with AP, I would think.
This makes me actually want to cancel my Model X reservation. They should disable Autopilot on all cars until they can find out the definitive cause.
By the time the car started accelerating he was only 4 seconds from the barrier. The sun was also in his eyes, which may have been a factor. Maybe he saw the barrier too late to not hit it, but froze up instead of acting. Some people do freeze in situations like that, and you don’t always know if you are one of them until you are in that situation.
Yes, people on this forum are all too quick to point fingers at the victims of these AP-related crashes.
But in pretty much all cases so far, there is nothing else to point to.
They are responsible for driving.
When they are not responsibly driving (paying attention), they are responsible for the outcome.
Anything beyond this is just pointing fingers at the machine for the human error.
It's really simple. You are driving. Pay attention and drive, no matter what assistance tech you use.
That said, FCW (it warns, it does not brake) could be cranked up; however, too many false warnings would lead to people disabling it, resulting in less net safety...
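A toy model of that tradeoff, since "just make it more sensitive" comes up a lot. Every number here is invented; only the shape of the result matters:

```python
# Toy model of FCW tuning. All numbers invented; the point is that net
# benefit rises, peaks, then falls as the warning threshold gets earlier.

def hazards_caught(ttc_threshold_s):
    # Warning earlier (bigger time-to-collision threshold) catches more
    # real hazards, capped at all of them.
    return min(1.0, ttc_threshold_s / 4.0)

def false_alarms_per_week(ttc_threshold_s):
    # Earlier warnings also fire on benign cut-ins, curves, overpasses.
    return ttc_threshold_s ** 2

def still_enabled(false_alarms):
    # Crude stand-in for drivers disabling a system that cries wolf.
    return max(0.0, 1.0 - false_alarms / 20.0)

for ttc in (1.0, 2.0, 3.0, 4.0):
    fa = false_alarms_per_week(ttc)
    net = hazards_caught(ttc) * still_enabled(fa)
    print(f"warn at TTC {ttc:.0f}s: {fa:4.0f} false alarms/wk, net benefit {net:.2f}")
# Prints 0.24, 0.40, 0.41, 0.20: past a point, the extra sensitivity
# costs more (in disabled systems) than it gains (in earlier warnings).
```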
The issue is that the driver is still the only one responsible. Accidents are tragic, especially those that result in injury or fatality. If your attention is on the road and you're being attentive to what Autopilot is doing, you can override the mistakes that Autopilot makes. It doesn't fight you for control, although the longer you spend with your attention off the task at hand, the smaller the window you have to correct.

Yes, people on this forum are all too quick to point fingers at the victims of these AP-related crashes. I'm not trying to say that these drivers weren't in some ways culpable, or that they shouldn't take responsibility for managing AP correctly. But in the real world, the circumstances are rarely black and white. They may have had good reasons for their problems, like you wrote. AP can be very unforgiving when it makes its own mistakes and bad things happen as a result.
People here could learn some empathy for their fellow drivers and understand that to err is human. Have the most vocal critics of drivers never been in an accident themselves? A system that is "user-friendly," especially an assistive system that requires human interaction, needs to be designed for the average driver, with all their flaws in mind. A system that invites user error, or is unforgiving when it makes its own mistake (not handling the gore point correctly), is a deficient design. Distracted drivers are not just a problem for those drivers; they're a problem for Tesla as well! People should realize that and not try to find excuses to explain away every flaw in Tesla cars.
There is human error here, no doubt. But you're ignoring the machine error. People should rightfully be pointing fingers at the machine for the error too.
Don't just wave away the machine error by saying it's part of the design specs. If an obvious mistake is "designed," then it's a bad design. This is the difference between a bad product and a good product. Both might get the job done, but why would anyone buy the bad one?
Don't confuse what you want AP to be with what AP currently is.
I have a truck with a feature that will keep it at the same speed regardless of road conditions or obstacles. Is that defective? Or is it designed to only handle control of speed? Is that limited scope a defect?
AP will try to avoid leaving the lane (better than my truck) and to not rear end cars ahead (better than my truck). That is what it is designed to do. It is not designed to be autonomous, but it's a darn sight better than my truck.
If you feel AP is defective, then you should also feel all cars with only cruise control are defective, because they are designed with no regard at all for the ability to ram objects/cars/pedestrians...
Further, you should feel all non-AEB cars are defective due to no design consideration for collision avoidance.
AP does not cover the entire use-case environment, but it covers more than most. The fact that it does not have 100% coverage does not mean it is worse than cars with less coverage...