1. Disagree: Tesla has not had mixed messages on current capability. The media has.
This implies you would agree that Tesla has not been the most clear and explicit on its AP capabilities and inherent dangers in previous hw/sw iterations?
Historically, and even today, I think they have done, and continue to do, a comparatively poor job of this. E.g. if I lend out my car, nothing in it will actively warn the unfamiliar driver of the several specific treacheries of AP while he is using it, easily lulling him into a potentially fatal overestimation of its capabilities. This lacuna contributed to how the fatality in China occurred, and it is reported that Tesla's Chinese sales language describing the car as "self-driving" was, subsequent to the ensuing lawsuit, severely toned down:
https://jalopnik.com/two-years-on-a-father-is-still-fighting-tesla-over-aut-1823189786
AP nag times have also been continually shortened, and not for no reason.
2. Disagree: first fatality was not a firetruck
Correct, it was into a slow-moving street-sweeper in the left lane of a Chinese motorway, some 300 m after a leading vehicle had moved aside, but that case was for all practical purposes identical to Tesla's user manual description of its radar inadequacy: "Traffic-Aware Cruise Control can not detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead." This is what I, for shorthand, call Firetruck Super-Destruction mode, in mocking reference to the much-vaunted FSD [Full Self-Driving] this same fatally inadequate hardware is somehow eventually supposed to support.
3. Disagree: no such mode
Yes, that's more sarcasm, I'm afraid, to accentuate the fact that Tesla has effectively declared, by its now three-year-old refusal to fix the problem, that Firetruck Super-Destruction mode "is not a bug, it's a feature!". Neither chiding the bereaved/maimed with tweets of "Yeah, you're holding it wrong!" nor slathering them with a mess of unverifiable statistics about how tremendously safe AP actually is, is an effective solution.
4. Disagree that the reason the "stationary object after occluding vehicle changes lane" issue has not been solved is the lack of a lawsuit.
Why then do you think Musk/Tesla has not been supremely motivated to solve [or even much discuss] this pressing embarrassment in the past 3 years?
I for one can't avoid thinking that the lack of any appreciable consequences, legal or in sales, is the major factor. Hence it has been allowed to slide in the hopes that the magic of AI will, at some point in the glowing promised future, cure all ills.
5. Disagree that it is fundamentally unsafe. If it is, cars with zero lane keeping are super-duper-fundamentally unsafe.
Lane-keeping is not really the issue and whattaboutery on the failings of other manufacturers' systems is no consolation either.
The major problem is that Tesla vehicles on Autopilot can fail to slow down, or even squeak an audible warning, when pile-driving you at 80 mph into a stationary traffic jam on the highway. The system is fundamentally unsafe because at >50 mph the current radar hardware apparently cannot distinguish between the arse end of an artic stopped in your lane and the iron guardrails along the roadside, so this data is simply ignored, while processing of the camera data to detect/prevent the hazard has yet to be implemented or made operational. Hence a literal couple of seconds' inattention at precisely the wrong moment can equal an instant death sentence.
How can safe highway L3 AP (never mind FSD) be supported on this hardware? IMHO it is impossible.
Even ardent Tesla fans have no problem admitting there is a severe disconnect between the CEO's hype and reality:
Elon Promises Safety Upgrades After Model 3 Suffers Severe Crash Using Autopilot | CleanTechnica
6. Disagree that the Atari features are lipstick on a pig.
They are at the very least a distraction from much more important outstanding work which has not been done in the last three years, as outlined above.
7. Disagree: The lawsuit is frivolous. The driver was not paying attention, not following the owner's manual, and speeding by 10-15 mph. Not the car's fault, not Tesla's fault.
While I agree this driver committed numerous faults, Tesla cannot simply dismiss its responsibility to recognise human nature and eliminate design defects which tend to lull drivers into over-reliance, before these contribute to further unnecessary deaths.
To that end, here are a couple of recent scientific studies on the difficulties of maintaining driver engagement while using partially automated driver-assistance systems such as Tesla's AP:
https://www.researchgate.net/public..._bad_idea_Observations_from_an_on-road_study?
The Challenges of Partially Automated Driving
It arguably becomes at least partially Tesla's fault when they continue to ignore valid empirical evidence that their AP system, combined with the average human, is in certain circumstances less safe than the same driver operating the vehicle without its alleged "assistance". The time for skating around liability by the skin of Musk's teeth is, I feel, soon coming to an end, and that will actually be a good thing for us all.
However, if prudently managed, Tesla would pre-empt this looming development by designing a safer system for the interim, and probably prolonged, period it will realistically require to reach higher levels of autonomy [e.g. by implementing active driver-focus monitoring via facial analysis, like Cadillac's Super Cruise] and retrofitting that to its current vehicles before being forced to do so by regulators/law.