Still Waiting for Elon's Blog Post on Autopilot Update...

I can't keep up with these speedy goalposts. I'm out.

I hope you are able to sell your car.

I know because, even though we have never met, I knew enough from the way you were asking me questions that as soon as you were no longer the aggressor, and were faced with the references and facts you requested as well as pertinent questions, you would also find that you don't have time for such nonsense.

Next time, just have a conversation and if you disagree say so.

And you did not provide any references for your response that limitations != defect, and that is not the question I asked anyway.

You should hold yourself to the same standard you hold others. I would love an education on the question I asked.

What is the difference between "not functioning as intended" and "defect?"
 
I know because, even though we have never met, I knew enough from the way you were asking me questions that as soon as you were no longer the aggressor, and were faced with the references and facts you requested as well as pertinent questions, you would also find that you don't have time for such nonsense.

I wasn't the one asserting untruths such as "Tesla hasn't disclosed limitations," etc., when in fact they were in the manual.

Now if you want to say you don't LIKE them, that's fine... but that wasn't your original premise this stemmed from.
 

Thank you, and in that post I was specifically referencing the traffic turning left across the lane. Can you please provide me a reference to where in the manual it specifically mentions that limitation or any other communications from Tesla mentioning that limitation?

Also, back to my original point...

How much effort would it take for Tesla to produce something similar, for autopilot and its limitations, to what they have produced for their key and every other system in their computer?

I would imagine that if they have time to provide education outside of the manual for these things (keys, trunk, screen, etc.), they have the time to provide it for the thing that is running people off the road and into trucks.

Also, they spend a considerable amount of time educating new owners about the car at delivery, but it is not their policy to have the owner drive the car with a Tesla educator present and have the autopilot functionality, and the proper way to use it, demonstrated.

However, they DO provide similar education for the sunroof, trunk, key, frunk, etc.

Why don't they provide that as a service to their owners and in an attempt to decrease these accidents?

I want to remain clear here... I do not think Tesla is responsible for any of the accidents or the fatality.

What I do believe is that, as a result of these accidents, Tesla should recognize they have a problem and make a reasonable effort to address it for current and future owners by providing the same level of education for autopilot that they provide for other components.

There is no reasonable argument against it, IMO. The only thing you can say is "they shouldn't have to." Well, that is really irrelevant; the world is the world and people are the way they are. I "shouldn't have to" deal with the drunk and high people who attack me at work, but I know better than to think I can change the world, and instead I focus my energies on doing the best I can to help them.

We should all stop talking about how "everyone else" is stupid or irresponsible and consider the fact that "everyone else" are the drivers we share the road with. We should make reasonable efforts to improve safety by accepting the world and its inhabitants the way they are, and by pragmatically and responsibly working within the limits of reality to improve our circumstances.
 
Page 69:

Warning: Traffic-Aware Cruise Control can not detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object, bicycle, or pedestrian is in front of you instead. Always pay attention to the road ahead and stay prepared to take immediate corrective action. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death. In addition, Traffic-Aware Cruise Control may react to vehicles or objects that either do not exist or are not in the lane of travel, causing Model S to slow down unnecessarily or inappropriately.
 
I don't have a clue what the "average" Tesla buyer is like. I have test-driven AP, and I have watched the AP commute video, and I have read all the accident stories I find on this forum and the Tesla forum. I watched the Chinese scrape video. I read the 8.0 release blog post and all 7 web pages of the press conference transcript at Electrek. I've read most of the Owner's Manual twice. I've ridden across Europe with Llewellyn (in video) and I think I am about as well-educated as I could be. I think the enforced wait built into buying a Tesla encourages soaking up facts (in lieu of a real car).
 
I guess, yes! Reading through this is interesting, because I could not understand how someone could not understand AP. I've never had any trouble at all, but maybe that's because I never flew an airplane with AP. Everybody is different.

The problem with this kind of thing is there is always going to be a certain percentage of the population that can't be taught. It doesn't matter if it's ABS, or Autopilot.

If we catered to stupid we would never have nice things.

Could they do a better job at training? Would it prevent some occurrences? Probably, but likely not as many as people think. As Elon Musk said, it was typically the experienced users who were problematic. It's just like how statistics say crashes are more likely close to home, where you feel comfortable driving, than in some other city far from home where you're being really careful.

We also live in a world where people try to skirt responsibility for their actions. So no matter what, they're going to blame autopilot just because it's there; there is something else for the driver to blame. For good and for bad, any Tesla accident gets scrutinized. Everyone is watching, and likely coming to the wrong conclusions because they're not seeing the bigger picture. We don't know what problems other lane-keeping/ACC cars have. No media outlet is going to pay attention to a MB crash even though MB has drivepilot (which is very similar to AP).

I'm not a huge Tesla fanboi. I've been pretty critical of them, and remain pretty critical of some aspects of Tesla. But, I feel like the limitations in 8.0 were necessary for where AP is currently at. That maybe they jumped the gun a bit and this is a bit of a correction.
 
I guess, yes! Reading through this is interesting, because I could not understand how someone could not understand AP. I've never had any trouble at all, but maybe that's because I never flew an airplane with AP. Everybody is different.

Language and cultural barriers are real issues and probably contributed to the crash in Montana. Additionally, there is a lot of misinformation out there on autopilot. LastGas is actually a great example here: the video he watched, "Autopilot commute," is not what he should expect his autopilot experience to be like. Mine has been nothing like that.
 
Language and cultural barriers are real issues and probably contributed to the crash in Montana. Additionally, there is a lot of misinformation out there on autopilot. LastGas is actually a great example here: the video he watched, "Autopilot commute," is not what he should expect his autopilot experience to be like. Mine has been nothing like that.

What I took away from the Autopilot commute video is that Autopilot will sometimes get confused topping a hill, that it works better when following another car, that it may have problems with curves particularly on secondary roads, that it stops more abruptly than a human driver would, and that it gets better over time. And I would add that the driver in that video kept his hands near the wheel and was consistently engaged and attentive to driving throughout.

So that I won't get killed by the car, tell me what I missed?
 
No, but autopilot for airplanes is highly regulated and does function as intended. I have never set a course or altitude on my autopilot and had it fly me in a different direction or suddenly drop 4000 feet. If I did, that would be a defect and the FAA would investigate and a recall would be issued.

Then you're fortunate. These things are machines and they will fail, irrespective of whether they're certified or not. Twice I've had an AP initiate a sharp roll left that, unchecked, would have rolled the plane over. It was the result of an electrical issue in the AP controller that pinned the aileron servo to the left. It really drove home the "don't ever trust the autopilot" message. I made absolutely certain that I could flip out the A/P breaker blind and I always fly one hand on the yoke. Sound familiar?

Beyond that, I've had an A/P fail to intercept a course, glideslope or localizer many times, for various reasons. And I'm sure you're well aware that flying A/P in icing conditions or really turbulent weather in a small plane is verboten.

My point is simply that any machine fails and has its limits. The Tesla A/P is just one more machine, and people need to learn its limits. I expect that, as with any new technology, people will rapidly get accustomed to what it can and can't do and these "idiot" incidents will go away. If not, then we'll never see any real degree of autonomous driving, because perfection won't happen.
 
FWIW, and given that I am only a "sample size of one":

I only know about the edge conditions of AP, and where it might let me down, from reading this forum. I have not read the manual (other than the slim volume that came in the car's glove compartment) - clearly I should have read it ... some future drivers won't, or will miss the significance of the description of some of the edge cases. I was not aware that there were training videos on the Tesla website (until someone here pointed me to them - but even that person said that whilst all the videos were very useful, the AP one didn't point out the potential issues). So I've learnt enough to keep me alive from this forum; my guess is that only a small portion of owners (increasingly so, as M3 rolls out) will read the forums avidly.

When I used AP the first time I said "Yes" to the T&C's screen. No other driver of the car since then has had to read that. Maybe a new owner would (e.g. if I set the car back to Default when I sell it). So we are left with the "Keep your hands on the wheel" warning each time AP is engaged and each new driver discovering that if they don't bother to hold the wheel it is 5 minutes or so before the car starts to complain - obvious conclusion: no need to keep hands on the wheel.

I never had instruction on cruise control on the first car I owned which had that feature. The trial and error I went through would have been better, and safer, replaced by someone giving me some tuition: much more so on the first car I had with TACC - that was a VW Golf and it was absolutely abysmal; it would hare up behind a car and jump on the brakes at the last moment, very unnerving for a passenger! A video instruction would be sufficient (I don't feel the need for a delivery specialist to instruct me, and a video would be available to all future drivers).

I probably would have [previously] thought that the term "AutoPilot" described a more autonomous animal than it actually is; I have learnt the more accurate definition in this forum. If I had thought about it at all, I would have assumed that all it could do was "fly straight and level" and not be capable of avoiding other traffic etc., just as in an aeroplane - I'm well educated and capable of making that deduction ... but ... the marketing hype etc. that I absorbed around the time I was buying / waiting for the car made me assume that AP was very capable. Not to the extent that I could go to sleep, but my assumption was that I could get on the highway, select AP and then just be on the lookout. I would NEVER have guessed that the car in front of me swerving for a stopped vehicle would be a problem that **I** had to react to; if I think about that now rather than then, clearly I should not expect AP to swerve (there might be something in the way) and it might be a long shot that it could stop in time once it saw the obstacle (and I now know that AP can be blind to anything stationary) ... but I never had that thought before, so I would have been a liability on the road when using AP.

I don't think the initial Delivery Specialist training needed to cover all that (plus it would not have been useful to any other driver of the vehicle in future), but a stern warning that there are edge conditions, and that I should be sure to watch the videos (as well as there being a better, more comprehensive video of "edge" conditions available), would be a huge improvement.

I use AP a lot. I have a regularly repeated long journey at night which is identical to previous ICE-days, and I am certain that AP is less tiring and leaves me more alert. I was sceptical until I had had that comparative experience a number of times using AP. Plus Elon says that AP is X-times safer than a human, so even though I'm an utterly brilliant driver (tm)!! I have the additional benefit of that extra margin of safety when driving with AP.
 
Then you're fortunate. These things are machines and they will fail, irrespective of whether they're certified or not. Twice I've had an AP initiate a sharp roll left that, unchecked, would have rolled the plane over. It was the result of an electrical issue in the AP controller that pinned the aileron servo to the left. It really drove home the "don't ever trust the autopilot" message. I made absolutely certain that I could flip out the A/P breaker blind and I always fly one hand on the yoke. Sound familiar?

Beyond that, I've had an A/P fail to intercept a course, glideslope or localizer many times, for various reasons. And I'm sure you're well aware that flying A/P in icing conditions or really turbulent weather in a small plane is verboten.

My point is simply that any machine fails and has its limits. The Tesla A/P is just one more machine, and people need to learn its limits. I expect that, as with any new technology, people will rapidly get accustomed to what it can and can't do and these "idiot" incidents will go away. If not, then we'll never see any real degree of autonomous driving, because perfection won't happen.

That's weird. The quote attributed to me wasn't from me.