
Still Waiting for Elon's Blog Post on Autopilot Update...

The problem is that there is widespread ignorance of what an autopilot really is. People imagine K.I.T.T. from Knight Rider.
Correct, and that's the arguable point. Their choice of that term was poor because of the general misunderstanding of the word. I don't think @BobinBoulder approached the argument from a defensible position, but pivot a little and the argument becomes reasonable. At least in my opinion.
 
I agree that is what Autopilot does generally, but I would point to the behavior over crests as very different from what the autopilot analogy would lead you to expect. The vehicle loses the lane markings over the crest... the aircraft analogy would dictate that it continue straight, perhaps with a warning that it can no longer see or steer accurately. Unfortunately, what the car most often does is steer desperately for the shoulder or the oncoming lane while trying to find the lane markers. It's not clear what the defect is, but it certainly doesn't match the behavior you would expect if you told someone "it's just like autopilot on an aircraft." If I am explaining the system to a new user, I put that special case front and center, because it could be deadly if misunderstood.
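If it helps to make the contrast concrete, here's a toy sketch (purely illustrative - my own names, certainly not Tesla's actual control code) of what the aircraft-autopilot analogy would lead you to expect when the markings vanish over a crest:

[CODE]
# Toy illustration only -- NOT Tesla's actual control logic.
# The aircraft-analogy expectation when lane markings are lost:
# hold the last known heading and warn, rather than hunting for
# whatever line the camera finds next.

def steer(lane_confidence, lane_center_offset, last_heading,
          threshold=0.5):
    """Return (steering_target, warning) for one control tick."""
    if lane_confidence >= threshold:
        # Normal case: track the detected lane center.
        return lane_center_offset, None
    # Degraded case: markings lost over the crest. Continue straight
    # on the last heading and alert the driver -- don't chase the
    # shoulder line or drift toward the oncoming lane.
    return last_heading, "TAKE OVER: lane markings lost"
[/CODE]

What the car actually does in that degraded case is the part that doesn't match the analogy.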

That's not a behaviour I've seen much of on my car, but I generally only use AP on highways, and the crests tend to be pretty gentle. In any case, aircraft autopilots do strange, wonderful and unexpected things as well. As with the Tesla Autopilot, you have to understand the limitations of the hardware and use it appropriately. And rule #1: don't trust it. As you get more accustomed to the technology, you start to realize where and why it fails and learn to compensate for it.

If Tesla failed in anything, it's that they've leaned toward giving people too much credit.
 
He's not redefining anything. The way that it works IS remarkably like an aircraft autopilot. In aircraft, they're pretty dumb devices - fly straight and level, turn XX degrees at a specific position. They don't automatically select and fly routes or avoid hazards - other aircraft, mountains or bad weather. And pilots cannot turn one on and take a snooze. The #1 thing that you're taught about autopilots, as a new pilot, is not to trust them. They can and will fail. I've had it happen to me. Trusting them too much can absolutely kill you.

The problem is that there is widespread ignorance of what an autopilot really is. People imagine K.I.T.T. from Knight Rider.


But you touched on a key point in your response... "The way that it works IS remarkably like an aircraft autopilot" and "The #1 thing that you're taught about autopilots, as a new pilot...". Who "trained" you to use Autopilot in your Tesla? Who explained the difference between what pilots are taught as standard SOPs in the aviation field and the layman's vision of airplane pilots engaging the autopilot, then eating dinner, flirting with the purser, and taking naps?

My point is this: whether true or not, the phrase "Auto Pilot" for 99% of the planet means "this thing drives itself," and it is irresponsible to think otherwise. Blaming someone for not dissecting the meaning of the word "Pilot" in the context of the Tesla manual, or for not understanding the nuances of "Auto Pilot" in the aviation world, is just plain dumb. If one has to make up this many excuses for misinterpretation and misunderstanding of the term, then the TERM ISN'T CLEAR.

And again, I'm a fan. I'm not blaming anyone. But Tesla is a for-profit company. They are not creating anything for the good of humanity (although that may be a noble and intentional byproduct). And my original assertion was that this update was about improving a system that badly needed to be improved. It wasn't a "gift from Elon". And by the way... I still want a new UI. :)
 
="BobinBoulder, post: 1727330, member: 50026 "<snip>
Who "trained" you to use autopilot in your Tesla? <snip>
I don't recall getting trained to use standard, old-fashioned cruise control that just goes one speed and would bump into others if I let it.
The Autopilot text I agreed to when enabling it said to keep my hands on the wheel and that I was responsible for driving it.
It also spelled out the types of roads and conditions it was intended for. Then whenever I enable it, it reminds me to keep my hands on the steering wheel.
 
But Tesla is a for-profit company. They are not creating anything for the good of humanity (although that may be a noble and intentional byproduct).

I think you are wrong on both counts.

But, your point that the limits of the system were not thoroughly explained is acknowledged. Honestly, like a lot of things Tesla, I would bet the operation was checked very well in Elon's neighborhood and commute, and how it worked in some locale 1000 miles away was going to be learned by the beta testers (us). I appreciate the opportunity.
 
I think you are wrong on both counts.

But, your point that the limits of the system were not thoroughly explained is acknowledged. Honestly, like a lot of things Tesla, I would bet the operation was checked very well in Elon's neighborhood and commute, and how it worked in some locale 1000 miles away was going to be learned by the beta testers (us). I appreciate the opportunity.

Wait... I'm confused.... you think that Tesla is a non-profit, and they are only interested in manufacturing the Tesla in order to save humanity? Really?
 
But you touched on a key point in your response... "The way that it works IS remarkably like an aircraft autopilot" and "The #1 thing that you're taught about autopilots, as a new pilot...". Who "trained" you to use Autopilot in your Tesla? Who explained the difference between what pilots are taught as standard SOPs in the aviation field and the layman's vision of airplane pilots engaging the autopilot, then eating dinner, flirting with the purser, and taking naps?

My point is this: whether true or not, the phrase "Auto Pilot" for 99% of the planet means "this thing drives itself," and it is irresponsible to think otherwise. Blaming someone for not dissecting the meaning of the word "Pilot" in the context of the Tesla manual, or for not understanding the nuances of "Auto Pilot" in the aviation world, is just plain dumb. If one has to make up this many excuses for misinterpretation and misunderstanding of the term, then the TERM ISN'T CLEAR.

And again, I'm a fan. I'm not blaming anyone. But Tesla is a for-profit company. They are not creating anything for the good of humanity (although that may be a noble and intentional byproduct). And my original assertion was that this update was about improving a system that badly needed to be improved. It wasn't a "gift from Elon". And by the way... I still want a new UI. :)

I'll concede that a lot of people don't understand what autopilots actually do. And I'll also concede that, despite being entirely correct in applying the name, Tesla should have considered that people don't know. With that said, Tesla has been very clear on the product limitations in their manual, in the disclaimer that you agree to on activation and in various other publications. Heck they even call the product "beta". I don't know what more they can do.

Driving ANY vehicle is about understanding the operation of the product, its capabilities and its limitations. If somebody is so wilfully ignorant that they don't know to check the oil, despite it being in the owner's manual, should they be able to blame the manufacturer when they cook the engine? Tesla decided to treat people like adults. Sadly, this turned out to be a bad decision.
 
[QUOTE="BobinBoulder, post: 1727330, member: 50026"<snip>
Who "trained" you to use autopilot in your Tesla? <snip>
I don't recall getting trained to use standard/old_fashion cruise control the just goes one speed and would bump into others if I let it.
Autopilot text I agreed to when enable said for me to keep my hands on the wheel and that I was responsible for driving it.
And the types of roads and conditions it was intended for. Then when I enable it it reminds me to keep my hands on my steering wheel.[/QUOTE]
False equivalence. Cruise control is a VERY old technology. That's like claiming no one ever trained you to use a pencil; you just figured it out yourself. This is AUTO-Pilot (again, stressing AUTOMATIC piloting of a PERSONAL VEHICLE). Even the argument that aircraft autopilot isn't fully automatic falls apart when you consider that most Tesla drivers are not pilots and most pilots are not Tesla drivers (please, no one ask me to prove that assertion... that's just basic statistics).

And if the disclaimer shown when enabling the system should be considered "training", then so is the disclaimer that I click through to use my iPhone and every other piece of software sold in the past 20 years. Again... a very false equivalence.
 
I don't recall getting trained to use standard, old-fashioned cruise control that just goes one speed and would bump into others if I let it.
The Autopilot text I agreed to when enabling it said to keep my hands on the wheel and that I was responsible for driving it.
It also spelled out the types of roads and conditions it was intended for. Then whenever I enable it, it reminds me to keep my hands on the steering wheel.

I agree. "I wasn't trained!" is too often just a way to slink out of liability for one's own actions.
 
And if the disclaimer shown when enabling the system should be considered "training", then so is the disclaimer that I click through to use my iPhone and every other piece of software sold in the past 20 years. Again... a very false equivalence.

I would insert an eye-rolling emoticon but I haven't been trained in how to use this website and therefore can't. :rolleyes: Oh wait... there it is. I figured it out on my own.
 
Part 7 transcript is out
Transcript: Elon Musk’s press conference about Tesla Autopilot under v8.0 update [Part 7]

More info on the warnings:

Joe White – Reuters

Good, thanks. I'm sorry, I have to retrace steps for my question and see if I can clarify this with you. If I am in a Model X, how many minutes or seconds may I take my hands off the wheel, and how many times an hour may I do that? I mean, how long are you now going to let me take my hands off the wheel versus the previous generation of the system?

Elon Musk – Tesla CEO

It actually depends on how fast you are going. If you are going in very slow stop-and-go traffic - I believe the threshold is about 8 miles an hour - you can actually take your hands off the steering wheel for an indefinite period of time. This is at times when you are basically at walking speed on average on the freeway. There's no limit on that, and I don't think there should be. That's also the regulatory limit for automatic parallel parking.

And this is a complicated answer. It's not as simple as "2 minutes" or something like that. If you are below 45 mph, in theory, the longest you could go is about 5 minutes, but there is actually a hands-on-wheel requirement that kicks in when the car detects lateral acceleration above a certain threshold. So you would have to be on a very straight road - and below 45 mph - to last 5 minutes.

And then if you are above 45 mph - and again, this is a complicated answer and I don't know how much of this you can put in an article [laughing] - it's one minute if you don't have a car to follow. It's 3 minutes if you do have a car to follow, because the accuracy is greater if you follow than if you don't.

I think that the thing that will probably be most effective is the limit for expert users, which is where we tend to see actually the biggest issue. It’s not with the new users. The new users of Autopilot are incredibly attentive. They pay attention very closely. Intermediate users, same thing. It’s actually the people who know it best, ironically, where we see some of the biggest challenges.

The limit is only 3 audible warnings per hour [laughing], which is a fair number of warnings. But we see people engaging in reflex actions, where they will hear a warning every 3 minutes and just touch the steering wheel without actually paying attention to the road. I think that will be most effective in addressing the instinctive "I want the beep to go away" touch of the steering wheel; it will only allow people to do that 3 times in an hour.

We are also going to provide a visual indicator where the perimeter of the instrument panel lights up with an increasing pulse rate before giving you the audible warning, so that the visual warning is a reminder to pay attention to the road before you get the audible warning. I beta tested - true beta test, alpha test really - the software personally. I feel strongly about using it myself and making sure it's good before anyone else uses it. I used it on an alpha basis to confirm it's good.

I really feel like we’ve struck a great balance between both improving the safety and the usefulness – and the comfort level of the system, and it’s very difficult to do both.

Obviously, you could hamstring the whole system and thereby reduce the accidents on Autopilot, but it becomes useless and painful to use. Or you can loosen all those limits and have more accidents. So it's a very difficult thing to both improve the safety and improve the utility of the system, which I think we have achieved.
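Since the numbers are scattered through that answer, here's my attempt to boil the new limits down into a quick sketch (all names and structure are mine - purely illustrative, not Tesla's code):

[CODE]
# Illustrative summary of the v8.0 limits as described in the press
# conference -- hypothetical names, not Tesla's implementation.

def hands_off_limit_seconds(speed_mph, has_lead_car):
    """Approximate hands-off time allowed before a warning."""
    if speed_mph <= 8:
        # Stop-and-go traffic at walking speed: no limit.
        return float("inf")
    if speed_mph < 45:
        # Up to ~5 minutes on a straight road; lateral acceleration
        # above a threshold triggers the hands-on requirement sooner.
        return 5 * 60
    # Above 45 mph: 1 minute alone, 3 minutes with a car to follow
    # (tracking accuracy is greater behind a lead car).
    return 3 * 60 if has_lead_car else 60

def on_hands_off_timeout(audible_times, now):
    """Escalation: visual pulse first, then an audible warning;
    only 3 audible warnings are allowed per rolling hour."""
    recent = [t for t in audible_times if now - t < 3600]
    pulse_instrument_panel()        # increasing-rate perimeter pulse
    if len(recent) >= 3:
        return recent, "LOCKOUT"    # the reflexive wheel touch no
                                    # longer resets the system
    recent.append(now)
    return recent, "AUDIBLE_WARNING"

def pulse_instrument_panel():
    pass  # stand-in for the instrument panel light pulse
[/CODE]

The real logic obviously has more state than this, but it matches what Elon describes: generous at walking speed, tighter at highway speed, and a hard cap on the "touch the wheel to make the beep go away" reflex.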
 
OK, this is annoying, but I will do it in the interest of public information. Here we go...

It appears one problem may be that you are expecting a user training session, rather than actually reading the manual you were given.

I am not claiming that 'I want' anything of the sort. I am claiming that, from a safety standpoint, Tesla should put forth at least the same effort to educate customers on new technologies with known limitations as they do on the operation of the key.


The Tesla Model S Manual contains no less than 26 pages devoted to Driver Assistance. It is not buried, but rather a main section, with the same prominence as Driving, Charging, The Touchscreen, etc...

In that section I count 59 warning icons with associated specifics, and 9 sub-sections entitled "Limitations", each with one-to-several paragraphs outlining the system limitations.

I find your assertion that "Tesla has made no reasonable effort to educate the current owners of the specific limitations..." to be demonstrably false.

Although the manual is a very reasonable educational method, it poses no substantial burden on Tesla to expect that they provide more than what is in the manual.

There is in fact a post on Tesla's official update page regarding the fatal accident.

I find your assertion that ". There has not been a single communication despite a NHTSA investigation, a fatal crash,..." to be demonstrably false.

This post is not, in good faith, a message to Tesla owners or operators. It is obviously a message to investors. This post was also not disseminated to Tesla owners. It is nothing more than a small headline in a section called "Updates." They could easily have sent a notice to their customers with a brief list of recommendations. They did not.

Additionally, there is this quote:
"Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving."

I want to know, why haven't they shared this data?


Your characterization of the manual only containing "legal disclaimers" is also incorrect. They are actual usage instructions and specify the limitations of the system, instructing the driver to be prepared to take over. (see my references above)

I'd be interested if you have an example of either another vehicle manufacturer employing MobilEye technology, or MobilEye themselves, outlining every possible specific failure scenario.

'"We have read the account of what happened in this case. Today's collision avoidance technology, or Automatic Emergency Braking (AEB) is defined as rear-end collision avoidance, and is designed specifically for that. This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020."' - Dan Galves, CEO of MobilEye

Mobileye (MBLY) Issues Statement on Fatal Tesla (TSLA) Model S Autopilot Crash
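For what it's worth, the gating Mobileye describes boils down to something like this (toy sketch, my own names - not Mobileye's actual API):

[CODE]
# Toy sketch of the scenario gating in Mobileye's statement.
# Hypothetical names; not Mobileye or Tesla code.

def aeb_should_actuate(scenario, ltap_capable=False):
    """Current-generation AEB handles rear-end scenarios only;
    Lateral Turn Across Path (LTAP) detection arrives in 2018."""
    if scenario == "rear_end":
        return True
    if scenario == "lateral_crossing":
        return ltap_capable  # False for 2016-era systems
    return False
[/CODE]

Which is exactly why a truck crossing the highway laterally didn't trigger braking.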

"In multiple news reports and open letters Tesla has claimed they are contacting people after accidents but these claims have been refuted by everyone but Tesla."

Citation, please.

The Montana crash we all know about with Mr. Pang:
Driver of Model X crash in Montana pens open letter to Musk, calls Tesla drivers "lab rats" [Updated]

The Pennsylvania case, where a man had Autopilot activated, it then aborted, and he crashed as a result. He also said Tesla did not contact him like they claimed:
Tesla says autopilot was not on during Pennsylvania crash

The Consumer Reports 'experts' report detailing the same issue I have described:
Tesla's Autopilot: Too Much Autonomy Too Soon

The Texas case: the guy also said Tesla did not contact him like they claimed:
Chubb Division May Sue Over Tesla Autopilot Crash - Carrier Management

An open letter regarding brake issues:
OPEN LETTER TO MR MUSK | EV And More

[Attached: six screenshots, captured 2016-09-13.]


In summary, I take full responsibility for everything that happens in my car. That is why I am trying to get rid of it. I have approximately 2,500-3,000 miles on Autopilot, and it has almost caused an accident multiple times despite my hands being on the wheel.

I have personally read the manual and obviously know plenty about the limitations and issues with the system and thus am 100% entirely and completely responsible. No argument there.

However, Tesla is also responsible for making a minimal effort to educate all of its customers about possible problems.

With regards to other automobile makers and issuing statements and recalls for potentially dangerous flaws here is one example:

2015 Jeep Grand Cherokee Recalls | JeepProblems.com

"Drivers thinking that their vehicle's transmission is in the PARK position may be struck by the vehicle and injured if they attempt to get out of the vehicle while the engine is running and the parking brake is not engaged."

in stark contrast to:

Elon Musk: ‘The Word ‘Recall’ Needs to Be Recalled’

Also, Autopilot is the wrong term for what this is. And I know you will want to see this so here it is....

[Attached image: IMG_6215.jpg]
 
It appears one problem may be that you are expecting a user training session, rather than actually reading the manual you were given.

Citation please.

The Tesla Model S Manual contains no less than 26 pages devoted to Driver Assistance. It is not buried, but rather a main section, with the same prominence as Driving, Charging, The Touchscreen, etc...

In that section I count 59 warning icons with associated specifics, and 9 sub-sections entitled "Limitations", each with one-to-several paragraphs outlining the system limitations.

I find your assertion that "Tesla has made no reasonable effort to educate the current owners of the specific limitations..." to be demonstrably false.

Has Tesla made any attempt to provide formal or informal education on autopilot equal to or surpassing the education provided for non-safety critical functionality?

Would it be reasonable, as the owner of a Tesla to expect that Tesla would disclose within 30 days any known limitations to their systems after discovering such a limitation existed and caused a fatality?

Did such a notification occur?

Would it be reasonable for a Tesla owner to expect that, when the company has adjusted its software to prevent certain accidents to which it deemed the software a contributing factor, they would be informed of the circumstances which led to such a change and whether or not their car has the software update containing that change?


There is in fact a post on Tesla's official update page regarding the fatal accident.

I find your assertion that ". There has not been a single communication despite a NHTSA investigation, a fatal crash,..." to be demonstrably false.

Please indicate where in the above-referenced post recommendations for safety, directed at current owner-operators, were provided. Also, please share any information you have regarding the timing of this post in relation to the timing of the fatal accident. Finally, please provide any emails or other communication attempts made by Tesla at any point since the fatal crash to inform users of the scenario which led to the incident and how to avoid it.

Your characterization of the manual only containing "legal disclaimers" is also incorrect. They are actual usage instructions and specify the limitations of the system, instructing the driver to be prepared to take over. (see my references above)

I'd be interested if you have an example of either another vehicle manufacturer employing MobilEye technology, or MobilEye themselves, outlining every possible specific failure scenario.

Please provide any reference to where I actually requested that Tesla provide a list of every possible failure scenario.

In the transcribed notes Elon Musk stated that he believes the new updates "would have prevented the fatal accident."

At what time was he made aware of the possibility of making this change? When did testing of that change begin? Are any cars on the road currently operating with that change?

If the changes were made in conjunction with NHTSA is that because Tesla knew there was a safety defect which needed to be addressed?
 
I have personally read the manual and obviously know plenty about the limitations and issues with the system and thus am 100% entirely and completely responsible. No argument there.

So am I understanding you correctly that the manual helped you to understand the system limitations?

However, Tesla is also responsible for making a minimal effort to educate all of its customers about possible problems.

With regards to other automobile makers and issuing statements and recalls for potentially dangerous flaws here is one example:

2015 Jeep Grand Cherokee Recalls | JeepProblems.com

"Drivers thinking that their vehicle's transmission is in the PARK position may be struck by the vehicle and injured if they attempt to get out of the vehicle while the engine is running and the parking brake is not engaged."

Tesla has indeed proactively issued recalls for items. The rear seat latch plate was one. The seatbelt attachment point was another. These are indeed "problems". They've even contacted me based on a forum post to ask me to bring my car in, in case there was a brake problem.

However, I don't believe that a disclosed system limitation necessarily equates to a "problem" (or "defect" as you used in an earlier post).
in stark contrast to:

Elon Musk: ‘The Word ‘Recall’ Needs to Be Recalled’

However applicable Elon thinks the word was in that situation, they clearly have communicated issues/problems (see my examples above).

It seems the issue is that you think Autopilot isn't working as designed, and thus is defective.


Also, Autopilot is the wrong term for what this is. And I know you will want to see this so here it is....

That's cool and all, but is Autopilot for airplanes autonomous operation wherein the pilot isn't required to pay attention to it?
 
So am I understanding you correctly that the manual helped you to understand the system limitations?

Are you claiming that manuals alone are sufficient education for operation of dangerous machinery?

Why do we require driver's and pilot's licenses?

Are there any states which place restrictions on autonomous systems and, as part of those restrictions, require the user to be trained in supervising systems similar to Tesla's Autopilot?

Tesla has indeed proactively issued recalls for items. The rear seat latch plate was one. The seatbelt attachment point was another. These are indeed "problems". They've even contacted me based on a forum post to ask me to bring my car in, in case there was a brake problem.

Are you claiming that Tesla monitors the forum and responds to concerns on the forum?

If so, should concerns posted on the forum be considered to have been seen by Tesla?

Those are physical pieces of hardware. Can you please give me your opinion on whether or not the fact that Tesla's software updates occur over the air negates the need for issuing a recall, as Elon Musk suggested? [citation pending]

However applicable Elon thinks the word was in that situation, they clearly have communicated issues/problems (see my examples above).

How did they communicate those issues?

Did they provide any information outside of the user manual? Would the user manual have been sufficient education on those issues?

It seems the issue is that you think Autopilot isn't working as designed, and thus is defective.

You are asking me a lot of questions. I am taking the time to answer them because I think your questions are fair. However, when you aren't asking a question, you are making claims about what I want or perceive without any substantial reasoning behind those statements. Please consider justifying why you think this is my position.

Also, please elaborate on the difference between the two following concepts:

"isn't working as designed" and "defective"


That's cool and all, but is Autopilot for airplanes autonomous operation wherein the pilot isn't required to pay attention to it?

No, but autopilot for airplanes is highly regulated and does function as intended. I have never set a course or altitude on my autopilot and had it fly me in a different direction or suddenly drop 4000 feet. If I did, that would be a defect and the FAA would investigate and a recall would be issued.

Do you feel that Tesla's autopilot should be held to the same standards as airplane autopilot?
 
Citation please.


Sure, your post: "I was given no direction or education upon delivery of the autosteer, TACC, or AEB systems other than to accept the limitation of liability statements regarding the beta software. "

Has Tesla made any attempt to provide formal or informal education on autopilot equal to or surpassing the education provided for non-safety critical functionality?

I've already answered this. The Autopilot section of the manual is just as prominent as, and similar in scope to, the other features described in the manual. An owner's manual is a formal method of documenting the features of a device or system.


Would it be reasonable, as the owner of a Tesla to expect that Tesla would disclose within 30 days any known limitations to their systems after discovering such a limitation existed and caused a fatality?

Did such a notification occur?

If the fatality resulted from an already acknowledged limitation of the system, no.

Would it be reasonable for a Tesla owner to expect that, when the company has adjusted its software to prevent certain accidents to which it deemed the software a contributing factor, they would be informed of the circumstances which led to such a change and whether or not their car has the software update containing that change?

I'm not sure the system was a "contributing factor" if the guy wasn't paying attention and instead watching movies on his phone. I haven't seen that Tesla has "deemed" that either.

Please indicate where in the above-referenced post recommendations for safety, directed at current owner-operators, were provided. Also, please share any information you have regarding the timing of this post in relation to the timing of the fatal accident. Finally, please provide any emails or other communication attempts made by Tesla at any point since the fatal crash to inform users of the scenario which led to the incident and how to avoid it.

I pointed you at the update section on the website already.

Please provide any reference to where I actually requested that Tesla provide a list of every possible failure scenario.
In the transcribed notes Elon Musk stated that he believes the new updates "would have prevented the fatal accident."

At what time was he made aware of the possibility of making this change? When did testing of that change begin? Are any cars on the road currently operating with that change?

If the changes were made in conjunction with NHTSA is that because Tesla knew there was a safety defect which needed to be addressed?

I don't suggest you requested a list, etc. My point is that there are a bunch of ways things can be missed by the system. Hence the overall limitations are disclosed, and they include overall guidance such as "monitor the situation and be prepared to take control of the vehicle at all times".

Attempts to subsequently improve the system don't negate the need to heed the overall guidance.

Finally, limitation != defect.