brkaus
Well-Known Member
I wonder how the Mercedes E class would have behaved
Probably would have frequently nagged.
Or they would have found a coke can taped to the steering wheel.
Inflammatory to whom? I'm not here to defend Tesla or to say things that are going to make you and others feel good. I'm here to express my opinion and to level some well deserved criticism at Tesla over its beta release methodology. I don't remember the last time Mercedes released a beta software of a safety or even convenience system. Do you? Other cars have auto steering, I haven't heard of any of them causing a crash or a death. Maybe that's because those manufacturers spend hundreds of millions of dollars in development and testing of such systems before they are put in the hands of consumers. I don't know, I'm just spitballing here.
Of course this was a failure of the truck driver, which is exactly WHY you buy a car with advanced safety features to protect you against something that you would never see coming. Model S has emergency braking which clearly did not work in this scenario. That's a problem. A defect. A design flaw. A boo boo. A deferred action item on some programmer's to-do list. Call it whatever you want, but it doesn't work like it should. If it did this crash could have been avoided and the driver might still be alive. The reason it did not function in this situation was due to an intentional decision on Tesla's part to ignore radar data points from taller objects due to false positives. Seems like that was clearly a mistake that the NHTSA will need to look at. Instead of ignoring potentially relevant data points, perhaps Tesla should have figured out how to properly deal with those data points before releasing an emergency safety feature that ignores data that it finds inconvenient.
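The tradeoff described above can be sketched in a few lines. To be clear, this is a purely illustrative toy, not Tesla's actual code; the field names and the cutoff value are invented. It just shows how a filter meant to suppress false positives from overhead structures (signs, bridges) can also discard returns from a high-riding trailer:

```python
# Illustrative sketch only -- NOT Tesla's implementation. All names and
# thresholds here are hypothetical, chosen to show the tradeoff discussed
# in the thread: filtering out tall/overhead radar returns to avoid
# phantom braking can also filter out a trailer crossing the road.

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float   # distance to the object, meters
    height_m: float  # estimated height of the return above the road, meters

# Hypothetical cutoff: returns above this height are assumed to be
# overhead structures the car can safely pass under.
OVERHEAD_CUTOFF_M = 1.5

def filter_returns(returns):
    """Keep only returns low enough to be treated as in-path obstacles."""
    return [r for r in returns if r.height_m < OVERHEAD_CUTOFF_M]

# A trailer body riding well above the pavement reflects from high up,
# so its return lands in the same "ignore" bucket as a genuine overpass:
targets = [
    RadarReturn(range_m=60.0, height_m=0.5),  # ordinary car -> kept
    RadarReturn(range_m=40.0, height_m=4.5),  # overpass -> discarded (correct)
    RadarReturn(range_m=50.0, height_m=1.6),  # trailer body -> discarded (the failure mode)
]
kept = filter_returns(targets)
```

With these made-up numbers, only the ordinary car survives the filter; both the harmless overpass and the dangerous trailer are thrown away together, which is exactly the design problem being criticized.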
You and I are probably not going to agree here. I've cut Tesla a ton of slack in the three years I've owned this car, but I'm on record several times saying that when it comes to safety issues, I cut Tesla ZERO slack. Somebody died here and Tesla's blog post attempts to absolve itself of any culpability by saying it is "beta" software and that the driver must be in control at all times. How convenient.
What you described above sounds like a design flaw, which points back to Tesla. A forward collision system should not depend on the ride height of the vehicle in front. It should be designed in such a way as to detect all vehicles and objects of a certain size, period.
Congrats. You get the award for the first insensitive d-bag in this tragic thread, more concerned about protecting the "brand".
...If NHTSA finds an issue, they can mandate a change. But legally, fault will ultimately lie with the driver of the truck/car...
Unfortunately this is an accident that was probably unavoidable by the best of drivers and one that would have otherwise been forgotten in the stream of thousands of other accidents that happen every day, except that it was a Tesla involved, and Autopilot happened to be on.
This is another reason I have no interest whatsoever in AutoPilot. I think as someone suggested, it provides a false sense of security for drivers who take it too seriously or tend to be inattentive. Without Autopilot that can't happen.
Were all of Joshua Brown's posts removed from TMC? I can't find anything of his, and I know I read the post where he talked about Autopilot saving his life. Now I can't find it no matter how I search. Does anybody know his TMC handle, or have a link to the post I'm talking about?
I'm on record about AP. I hope some good comes of this tragedy, as far as Tesla and how they treat us (basically guinea pigs). I think Brown's family will get a lot of money from Tesla, as letting a civil suit go to court and losing (which I believe they surely would) would be a disaster. Tesla will settle it out of court (if they haven't already).
RIP Joshua Brown.
This is another reason I have no interest whatsoever in AutoPilot. I think as someone suggested, it provides a false sense of security for drivers who take it too seriously or tend to be inattentive. Without Autopilot that can't happen.
I'm on record about AP.
I hope some good comes of this tragedy, as far as Tesla and how they treat us (basically guinea pigs). I think Brown's family will get a lot of money from Tesla, as letting a civil suit go to court and losing (which I believe they surely would) would be a disaster.
It's a double-edged sword. There certainly is that side of it--relying too heavily on it and becoming inattentive. On the other hand, there are obviously many cases (several stories of which are on this forum) where the system AVOIDED serious accidents.
So when you say that can't happen, there are many cases where people not on autopilot go to fiddle with the radio, touchscreen, whatever--and ram into the car in front of them.
The big difference is that you don't always hear about the cases where the system saved lives--only where it took them. I for one am grateful to have "another pair of eyes" keeping track of the traffic in front of me.
On record about reading a book while using it and completely misunderstanding the system and its intended use? Yes, you are.
I hear this "what about the lives AP saves" argument a lot, but it's misleading, because you also have to consider how many times owners grab the wheel and take over because AP is about to cause an accident, as mine has done more than once.