How many fatal crashes involved non-FSD cars? How many fatal crashes per million miles driven in each category? Consider the source.
The number of accidents alone tells you nothing. You always have to compare the number of accidents to the total miles driven in order to measure how frequently accidents occur. Tesla cars actually log billions of AP miles per year; 8 fatal crashes over billions of miles is actually a pretty safe track record. But I believe the allegation is that the crashes occurred when AP was used improperly. If that is true, then maybe a case could be made that Tesla should do a better job of limiting where, and under what circumstances, users are allowed to engage AP in the first place.
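The per-exposure comparison described above can be sketched in a few lines. Note that every figure below is a hypothetical placeholder chosen only to illustrate the arithmetic, not real crash or mileage data:

```python
# Sketch of normalizing crash counts by exposure (miles driven).
# ALL numbers here are hypothetical placeholders, NOT real statistics.

def fatalities_per_million_miles(fatal_crashes: int, total_miles: float) -> float:
    """Return the fatal-crash rate per one million miles driven."""
    return fatal_crashes / (total_miles / 1_000_000)

# Hypothetical: 8 fatal crashes over 5 billion Autopilot miles.
ap_rate = fatalities_per_million_miles(8, 5_000_000_000)

# Hypothetical baseline: 1 fatality per 100 million vehicle miles.
baseline_rate = fatalities_per_million_miles(1, 100_000_000)

print(f"AP rate:       {ap_rate:.4f} fatal crashes per million miles")
print(f"Baseline rate: {baseline_rate:.4f} fatal crashes per million miles")
```

The point of the comparison is that a raw count (8 crashes) is meaningless until it is divided by exposure; with these placeholder inputs the two rates differ by nearly an order of magnitude.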
@swedge Impressive analysis, laying bare the (frequent) bias present here and in many topics. May I plagiarize your text and post it in their comment section?

Please do! You might want to double-check my logic and math first. WaPo has been in the anti-Musk/Tesla camp, publishing pseudoscientific misinformation. Calling them out is a good thing for the editors and their readers.
True. But help me with the various versions of Autopilot. If I recall, Advanced Autopilot is an extra-cost option that includes stopping at stop signs and stoplights. If so, I'm guessing those "blew through the stop" cases did not include this feature.

NOT FSD! Nothing in the article mentions FSD, only Autopilot. Autopilot has a warning that it is "intended for use on controlled-access highways" with "a center divider, clear lane markings, and no cross traffic." If it was being used in situations it was not intended for, it's not surprising there were accidents.
Personally, I always wondered why it wasn't disabled in situations where it was not supposed to be used.
Done. Thank you.
The driver later testified in the litigation that he knew Autopilot didn't make the car self-driving and that he was the driver, contrary to the Post and Angulo claims that he was misled, over-reliant, or complacent. He readily and repeatedly admitted:
a. “I was highly aware that it was still my responsibility to operate the vehicle safely.”
The Post also failed to disclose that Autopilot restricted the vehicle's speed to 45 mph (the speed limit) based on the road type, but the driver was pressing the accelerator to maintain 60 mph when he ran the stop sign and caused the crash. The car displayed an alert to the driver that, because he was overriding Autopilot with the accelerator, "Cruise control will not brake."
The article said FSD was absent in some of the accidents and unknown in the others. They were suggesting that the Tesla software failed because some of the cars might have had FSD, but they didn't know. The suggestion was that Autopilot should be geofenced to be unavailable where the owner's manual says it is not designed to operate. The mention of FSD was to confuse the reader, a common misinformation tactic.

It sounds like TSLA cherry-picks data by only using AP, not FSD.
WRONG. It's talking about Autopilot. The article doesn't say FSD caused 8 fatal crashes.