Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Still Waiting for Elon's Blog Post on Autopilot Update...

I'd disagree... if they want to avoid future lawsuits, they absolutely had to do it. You don't get to call something Auto Pilot (short for AUTOMATIC Pilot) and then have it act in a way that could not be described as "automatic". Especially when the consequences involve damage to property and death.

As they said in the call, they aren't expecting to avoid lawsuits. They can't avoid lawsuits and still push this technology forward. I think that's a cynical view of what's driving technological innovation.

I don't see any of the other automakers, some of which have been far more blatant in their "self-driving car" marketing, bending over backward to update their (far less capable in some instances) systems.
 
I'd disagree... if they want to avoid future lawsuits, they absolutely had to do it. You don't get to call something Auto Pilot (short for AUTOMATIC Pilot) and then have it act in a way that could not be described as "automatic". Especially when the consequences involve damage to property and death. Now if they had just called it "Intelligent Driver Assist", I'd buy your hypothesis that they could leave it unimproved and be in the clear.

We all know what automatic means, but what does pilot mean when operating a car? Just read the manual to find out.
 
Here's the part of the transcript where one of the journalists asks Musk about the AP issue with holding the steering wheel... and Musk completely ignores the issue....

David Shepardson – Reuters

Thank you. You mentioned some of the changes you are making in terms of preventing – requiring people to place their hands on the steering wheel. Can you tell us how it is changing in terms of how many seconds you will be allowed hands off the wheel versus the current one, and will this satisfy NHTSA, or do you think you will have to do this upgrade as part of a formal recall?

Elon Musk – Tesla CEO

The word “recall” doesn’t make sense in this context since this is an over-the-air update. “Recall” is for companies where the cars require plugging something into the car. It’s not a term that really makes sense in this situation. We have done most of these changes with NHTSA and I don’t want to speak for them, but they appear to be pretty happy with the changes, and the reactions from them are quite positive.
 
Part 6 transcript up:
Transcript: Elon Musk’s press conference about Tesla Autopilot under v8.0 update [Part 6]

I found this bit interesting. I had a theory that 8.0 might end up being a mandatory update, if Supercharging idle fees ended up getting implemented, to get everyone onboard at once. This comment about this update satisfying NHTSA (yet not being a recall) makes me wonder if the update will be optional.

David Shepardson – Reuters
Thank you. You mentioned some of the changes you are making in terms of preventing – requiring people to place their hands on the steering wheel. Can you tell us how it is changing in terms of how many seconds you will be allowed hands off the wheel versus the current one, and will this satisfy NHTSA, or do you think you will have to do this upgrade as part of a formal recall?

Elon Musk – Tesla CEO
The word “recall” doesn’t make sense in this context since this is an over-the-air update. “Recall” is for companies where the cars require plugging something into the car. It’s not a term that really makes sense in this situation. We have done most of these changes with NHTSA and I don’t want to speak for them, but they appear to be pretty happy with the changes, and the reactions from them are quite positive.

NHTSA is very serious about this, and you are correct to have noticed that. It is my opinion there will be a recall of Tesla's TACC and AEB systems. These changes are likely an attempt to avoid an official recall. We will see what the folks at NHTSA do, but they are well aware of the problems with Tesla's TACC and AEB implementation, as well as Tesla's predilection for blaming their consumers for any and all accidents which occur during the operation of their systems with known defects/limitations.
 
I had a theory that 8.0 might end up being a mandatory update, if Supercharging idle fees ended up getting implemented, to get everyone onboard at once. This comment about this update satisfying NHTSA (yet not being a recall) makes me wonder if the update will be optional.
I think you and I have been on the same page about this (and other things) for a while now. This would be an interesting way to push it, though I tend to believe that Tesla will want to encourage adoption via disincentive (disabling Supercharging for < 8.0, as discussed). I think that provides enough encouragement to get nearly full fleet adoption at this stage. But this is definitely in the realm of wild speculation.
 
  • Disagree
  • Like
Reactions: green1 and Az_Rael
Today, with my pre-AP P85, I routinely knee-drive with coffee in one hand and a bagel in the other when conditions permit. It would absolutely be safer to do this with AP than without, but sounds like 8.0 will force me to do the less safe thing. For safety.

Absolutely terrifying. Obviously, you haven't died, or killed anybody by doing this, so far, but, wow. I can't imagine which conditions these would be, outside of "my driveway at 5mph".

Sorry for being off topic.
 
  • Like
Reactions: 1208
What does ignoring mean in this context? Not that I expect anyone to know or have a definite answer, but these are the two scenarios I can think of that make sense to me:

1. The car beeps (after the visual warning), the user doesn't nudge the wheel, a second beep comes up --> counter = counter + 1?

2. The car beeps (after the visual warning) --> counter = counter + 1?
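For concreteness, the two interpretations above could be sketched like this. This is pure speculation: the function name, the episode encoding, and the rule numbers are all invented here, since Tesla's actual counting logic isn't public.

```python
# Pure speculation: a toy encoding of the two "ignored warning" rules above.
# Each episode records how many audible beeps fired before the driver
# nudged the wheel (0 = responded to the visual warning alone).

def ignored_count(episodes, rule):
    if rule == 1:
        # Scenario 1: counts only if a second beep fires with no response
        return sum(1 for beeps in episodes if beeps >= 2)
    if rule == 2:
        # Scenario 2: any audible beep at all increments the counter
        return sum(1 for beeps in episodes if beeps >= 1)
    raise ValueError("rule must be 1 or 2")

# The same drive produces two very different strike counts:
drive = [0, 1, 2]                     # three warning episodes
print(ignored_count(drive, rule=1))   # 1
print(ignored_count(drive, rule=2))   # 2
```

The practical difference matters: under rule 2 even a driver who reacts promptly to the first beep accumulates strikes.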

What they could do is have a "countdown" or "paying-attention-o-meter" type of display. Starts at 0, and if it detects you holding the wheel in a way it likes, it starts going positive, into green, and builds up a cushion. As you let go of the wheel (by the software's definition of "letting go"), the bar starts getting shorter, gets to zero, starts going negative, turns yellow, orange, gives you a warning, red, etc.

So there is a bit of a countdown, and it also trains you in "Tesla's autopilot wheel-holding technique".
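The meter suggested above can be sketched as a tiny state machine. The rates, thresholds, and color bands below are made-up numbers purely to illustrate the suggestion, not anything Tesla has shipped.

```python
# Toy "paying-attention-o-meter": made-up rates and thresholds, purely to
# illustrate the suggestion above.

class AttentionMeter:
    FULL, EMPTY = 10.0, -10.0   # cushion limits

    def __init__(self):
        self.level = 0.0        # starts at zero, per the suggestion

    def tick(self, hands_detected, dt=1.0):
        """Advance one time step and return the display color."""
        rate = 1.0 if hands_detected else -1.0
        self.level = max(self.EMPTY, min(self.FULL, self.level + rate * dt))
        if self.level > 0:
            return "green"      # cushion built up
        if self.level > -3:
            return "yellow"
        if self.level > -6:
            return "orange"     # warning territory
        return "red"            # escalate to audible warnings

m = AttentionMeter()
print(m.tick(True))     # "green": holding the wheel builds cushion
for _ in range(8):
    state = m.tick(False)
print(state)            # "red": cushion exhausted after letting go
```

The cushion is the key design point: brief hand-offs only drain the bar a little, while sustained holding banks headroom before any warning fires.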
 
  • Disagree
Reactions: green1
What they could do is have a "countdown" or "paying-attention-o-meter" type of display. Starts at 0, and if it detects you holding the wheel in a way it likes, it starts going positive, into green, and builds up a cushion. As you let go of the wheel (by the software's definition of "letting go"), the bar starts getting shorter, gets to zero, starts going negative, turns yellow, orange, gives you a warning, red, etc.

So there is a bit of a countdown, and it also trains you in "Tesla's autopilot wheel-holding technique".
One more thing to pay attention to other than the road! ;)
 
  • Like
Reactions: green1
Citation please?

Any and all systems have limitations. Disclosing such and expecting that people act accordingly seems the responsible thing to do?

Actually, we should be fair here. Tesla has made no reasonable effort to educate the current owners of the specific limitations and known defects in their TACC, AEB, and auto steer systems.

I was given no direction or education upon delivery of the autosteer, TACC, or AEB systems other than to accept the limitation of liability statements regarding the beta software. More time and effort was spent explaining the trunk to me than the autopilot system. Yet, there is a random list buried in the middle of the manual which details substantial limitations in the system.

Tesla has no problem sending a large number of sales emails, adding a button to the app for referrals, and a number of other communications regarding sales and marketing. There has not been a single communication despite an NHTSA investigation, a fatal crash, multiple episodes of Autopilot running people off the road, people believing they were in Autopilot when they were not and being injured in an accident, as well as minor crashes involving the self-parking system.

From a very serious 'minimal effort required' perspective, it would really be a terribly small burden on Tesla to make more of an effort than burying legal disclaimers in their manual to educate their users on a system which is at worst the actual or proximate cause of death and disability and at best a contributing factor. They knew or should have known that MobilEye's system had a limitation in which it could not identify laterally crossing traffic, yet made no effort to educate its consumers about that particular limitation.

In multiple news reports and open letters Tesla has claimed they are contacting people after accidents, but these claims have been refuted by everyone but Tesla. Additionally, Tesla has publicly stated they are "always looking for ways to encourage their customers to properly use their semi-autonomous features", except that they have made no such effort to survey, poll, or question their consumers on what information would be helpful to them and have, in fact, ignored repeated requests for better education on the system's best practices and limitations. While there are currently hours and hours of introductory video explaining other aspects of Tesla's systems, there is no comparably in-depth or serious effort made to properly educate on its TACC, AEB, or autosteer systems.

This, in addition to the fact that Tesla is self-insured and has no money, is incredibly concerning for someone like myself who owns a Tesla that has had multiple problems.

Before I get attacked by everyone on the planet I would like to make clear that my intent here is to share my experience and push Tesla into better business and safety practices. I know the pain and trauma associated with vehicle accidents. Too many lives are lost too early and I am sick and tired of witnessing this pain, experiencing this pain, and doing nothing about it.

Tesla is in a unique position to solve this problem and I believe has the talent to do it. They need to do it carefully, transparently, and with the consent and education of their consumers to avoid major setbacks and disasters. I love Tesla and what it stands for. It needs to do this the right way because it has the ability to cause as drastic a change in the world as modern medicine has. The number of 'quality life years lost' in auto accidents is staggering. Almost all of these are completely preventable.
 
Last edited:
Actually, we should be fair here. Tesla has made no reasonable effort to educate the current owners of the specific limitations and known defects in their TACC, AEB, and auto steer systems.

I was given no direction or education upon delivery of the autosteer, TACC, or AEB systems other than to accept the limitation of liability statements regarding the beta software. More time and effort was spent explaining the trunk to me than the autopilot system. Yet, there is a random list buried in the middle of the manual which details substantial limitations in the system.

It appears one problem may be that you are expecting a user training session, rather than actually reading the manual you were given.

The Tesla Model S Manual contains no less than 26 pages devoted to Driver Assistance. It is not buried, but rather a main section, with the same prominence as Driving, Charging, The Touchscreen, etc...

In that section I count 59 warning icons with associated specifics, and 9 sub-sections entitled "Limitations", each with one-to-several paragraphs outlining the system limitations.

I find your assertion that "Tesla has made no reasonable effort to educate the current owners of the specific limitations..." to be demonstrably false.

Tesla has no problem sending a large number of sales emails, adding a button to the app for referrals, and a number of other communications regarding sales and marketing. There has not been a single communication despite an NHTSA investigation, a fatal crash, multiple episodes of Autopilot running people off the road, people believing they were in Autopilot when they were not and being injured in an accident, as well as minor crashes involving the self-parking system.

There is in fact a post on Tesla's official update page regarding the fatal accident.

I find your assertion that "There has not been a single communication despite an NHTSA investigation, a fatal crash, ..." to be demonstrably false.

From a very serious 'minimal effort required' perspective, it would really be a terribly small burden on Tesla to make more of an effort than burying legal disclaimers in their manual to educate their users on a system which is at worst the actual or proximate cause of death and disability and at best a contributing factor. They knew or should have known that MobilEye's system had a limitation in which it could not identify laterally crossing traffic, yet made no effort to educate its consumers about that particular limitation.

Your characterization of the manual only containing "legal disclaimers" is also incorrect. They are actual usage instructions and specify the limitations of the system, instructing the driver to be prepared to take over. (see my references above)

I'd be interested if you have an example of either another vehicle manufacturer employing MobilEye technology, or MobilEye themselves, outlining every possible specific failure scenario.


In multiple news reports and open letters Tesla has claimed they are contacting people after accidents but these claims have been refuted by everyone but Tesla.

Citation, please.


Additionally, Tesla has ... in fact, ignored repeated requests for better education on the system's best practices and limitations.

Citation please.
 
It appears one problem may be that you are expecting a user training session, rather than actually reading the manual you were given.

The Tesla Model S Manual contains no less than 26 pages devoted to Driver Assistance. It is not buried, but rather a main section, with the same prominence as Driving, Charging, The Touchscreen, etc...

In that section I count 59 warning icons with associated specifics, and 9 sub-sections entitled "Limitations", each with one-to-several paragraphs outlining the system limitations.

I find your assertion that "Tesla has made no reasonable effort to educate the current owners of the specific limitations..." to be demonstrably false.



There is in fact a post on Tesla's official update page regarding the fatal accident.

I find your assertion that "There has not been a single communication despite an NHTSA investigation, a fatal crash, ..." to be demonstrably false.



Your characterization of the manual only containing "legal disclaimers" is also incorrect. They are actual usage instructions and specify the limitations of the system, instructing the driver to be prepared to take over. (see my references above)

I'd be interested if you have an example of either another vehicle manufacturer employing MobilEye technology, or MobilEye themselves, outlining every possible specific failure scenario.




Citation, please.




Citation please.

I think we both know that both of us have a background here. ;-) I am not on this forum to litigate. I am sharing my opinion. I appreciate your requests and your opinion. I respectfully decline to do a substantial amount of research to build a list of citations and news articles to prove my point. I could, and it would be interesting, but it would be a waste of my time.

But I would like to sincerely say that I understand what you are requesting here, and you have every right to ask. I don't begrudge that, and you are arguing the 'other' side. I think that is great, and you should share your opinion as well!
 
  • Disagree
Reactions: efusco
I'm not reading the Tesla manual to define the word PILOT?!?! The DICTIONARY is the source of truth for the definition of the word PILOT. The MANUAL is an instruction manual for a car.

Look, I'm a fan too. I get it. But at the end of the day even Tesla is accountable to shareholders and investors. EM doesn't get to redefine words to suit him.

He's not redefining anything. The way that it works IS remarkably like an aircraft autopilot. In aircraft, they're pretty dumb devices - fly straight and level, turn XX degrees at a specific position. They don't automatically select and fly routes or avoid hazards - other aircraft, mountains, or bad weather. And pilots cannot turn it on and take a snooze. The #1 thing that you're taught about autopilots, as a new pilot, is not to trust them. They can and will fail. I've had it happen to me. Trusting them too much can absolutely kill you.

The problem is that there is widespread ignorance of what an autopilot really is. People imagine K.I.T.T. from Knight Rider.
 
Last edited:
He's not redefining anything. The way that it works IS remarkably like an aircraft autopilot. In aircraft, they're pretty dumb devices - fly straight and level, turn XX degrees at a specific position. They don't automatically select and fly routes or avoid hazards - other aircraft, mountains, or bad weather. And pilots cannot turn it on and take a snooze. The #1 thing that you're taught as a new pilot is: don't trust them. They can and will fail. And I've had it happen to me. Trusting them too much can absolutely kill you.

The problem is that there is widespread ignorance of what an autopilot really is. People imagine K.I.T.T. from Knight Rider.

I agree that is what Autopilot does generally, but I would point to the behavior over crests as very different from what the autopilot analogy would lead you to expect. The vehicle loses the lane-marking indicators over the crest... the aircraft analogy would dictate that it continue straight, perhaps with some warning that it can no longer see the lane or steer accurately. Unfortunately, what the car does most often is steer desperately for the shoulder or the oncoming lane while trying to find the lane markers. It's not clear what the defect is, but it certainly doesn't match the behavior that would be expected if you told someone "it's just like autopilot on an aircraft." If I were explaining the system to a new user, I would put that special case front and center, because it could be deadly if misunderstood.
 
  • Like
Reactions: Soolim