Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Initial 1000 HW2 cars getting AP software 12/31/16

But each time you, as the driver, have to correct something AP does less than perfectly (or begins to do less than perfectly), the combination of mistake and correction presents to others on the road as less than perfect driving.

Because all the other drivers on the road are perfect drivers? LOL.

When I have AP1 engaged, I'm able to spend more time looking at what other drivers are doing in their cars (aka "threat assessment"). And to my great dismay, way too many of them are doing all kinds of inappropriate things (eating burritos, styling their hair, posting pics to Instagram of themselves eating burritos while styling their hair while not driving) that make them less than perfect drivers. It's fricking scary out there.

I think what a lot of people miss is that for AP to be successful, it doesn't have to be a "perfect driver". It just has to be better than humans. And we are pretty bad.
 
Lol, so true, so true...
 
It says it will take an hour but that's just the standard warning. Welcome to the latest non-AP2 firmware. The keyfob profile thing is actually nice if you have more than one driver for your car. The EQ is messed up and will forget your hard work after you leave the car.
 
And each time you are using autopilot and have to override it because it needs correction, that data goes into the AP network as a fleet learning input which can then be corrected so that the car will ultimately not require that correction in the future. Instead of looking at those cases as autopilot failures, look at them as instances where you are training an artificial intelligence in how to do things better in real-world scenarios!
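Conceptually, the "fleet learning input" described above is just a labeled event: a record of where, when, and how the driver overrode the system. A minimal sketch of what such a record might look like (purely illustrative — the class name, fields, and JSON format here are assumptions, not Tesla's actual telemetry):

```python
import json
import time
from dataclasses import dataclass, asdict


@dataclass
class DisengagementEvent:
    """Hypothetical record of a driver override while Autopilot is engaged."""
    timestamp: float   # Unix time of the override
    latitude: float    # where on the road it happened
    longitude: float
    speed_mph: float   # vehicle speed at the moment of override
    reason: str        # e.g. "steering_override" or "brake_applied"


def serialize_event(event: DisengagementEvent) -> str:
    """Package the event as JSON, as one might for upload to a fleet backend."""
    return json.dumps(asdict(event))


# Example: driver tugs the wheel where lane lines are faded
event = DisengagementEvent(
    timestamp=time.time(),
    latitude=37.3947,
    longitude=-122.1503,
    speed_mph=62.0,
    reason="steering_override",
)
payload = serialize_event(event)
```

Aggregated over many cars, events clustered at the same coordinates would flag road segments where the software consistently needs correction — which is the feedback loop the post is describing.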
 
The apple is fairly down on the list of features for EAP (extended apple projectiles) where said apples will be launched from the nearest supercharger towards a precalculated, GPS-derived, holistically determined position where your car is expected to be such that you can either (a) open your sunroof or (b) open the driver side window to receive said apple. Alas Elon put the apple projectile team on hold and devoted more resources to the FSD team....can't imagine why. :)

That said, despite the delays I did locate some never-before-seen secret test footage at the Tesla skunkworks test facility...

 
This claim is old and there is good evidence no such learning is taking place. Don't you find AP fails in the same place every time? I have a little mental list of where I can plan to catch it every day on my commute.
 
To me, that's a real problem. Beta has an understood meaning. Using it to mean something other than that meaning is confusing. And confusion is something that should be avoided with respect to the control software for a motor vehicle that travels on the public highways.

Also, the Beta label always seems to be an attempt by Tesla to act as if users should understand that AP might not actually work 100% reliably, and therefore Tesla shouldn't be held responsible if AP doesn't work as expected. That's a really problematic sort of shifting of responsibilities.

If a company sells a product, the product should work reliably and properly. That's especially true where the product has a real potential for injuring its user and (more importantly) injuring others on the road and/or damaging other vehicles.

It's not good that AP often drives only as well as a teenage driver or a sleepy driver. That puts too much burden on all the other cars on the road.
Plenty of other products have used "Beta" in the same way (see my "perpetual beta" link). Also, having the label act as a warning is actually a good thing, because it shows the reality that Tesla's system is not perfect (nor are any of the autosteer systems out there, see point below).

As for shifting responsibility, I have not seen an example where Tesla ever used the "Beta" label to shift responsibility. Rather, they use the standard disclaimer that the driver is responsible for being alert and ready to take over. This disclaimer is present in every manual for similar systems in every auto manufacturer.

I'm not talking about AEB. AEB only kicks in if you make a mistake and are about to crash. Therefore, even an imperfect implementation of AEB is an improvement over not having any AEB. Drivers don't drive differently than they normally would merely because they are relying on the presence of AEB.

The problem is with Autosteer. An essential aspect of using Autosteer is that the driver behaves differently than they would driving a car without Autosteer. Without Autosteer the driver steers for themselves. With Autosteer, the driver lets the car take care of the steering until the driver decides he or she needs to step in.

This change in driver behavior has a real potential for making a car with Autosteer more of a hazard to other cars on the road than a car without Autosteer. Experience has shown that a lot of drivers don't know (or disagree about) when they need to step in and how Autosteer should be used. This is dangerous. And Tesla doesn't help by giving vague documentation/instructions for the feature and constantly changing its functions/performance. That's why Autosteer shouldn't be released unless it's extremely reliable and its safe use cases are very well documented and explained.
If you want to talk Autosteer, then let's talk autosteer. Tesla's autosteer is the best in the industry and more "perfect" than any other system (it needs the fewest corrections and does the best job of keeping the car in the lane, even with faded lines). None of the systems, however, are perfect or 100% reliable, even in narrower use cases (for other systems, ping-ponging is a problem even in optimal conditions).
Semi-Autonomous Cars Compared! Tesla Model S vs. BMW 750i, Infiniti Q50S, and Mercedes-Benz S65 AMG - Feature
Hands off
There's been talk about this previously: Tesla's AEB is not the most capable, but its autosteer is. And in response to criticism about not enough nags to get people to keep their hands on the wheel, Tesla even made changes to address that.
 

No... Other drivers on the road aren't perfect. There are plenty of inattentive drivers out there. But if the act of repeatedly taking back control from AP is making your car seem like it is driven by a less attentive driver than if you were just driving yourself, how are you not making the problem worse?

And those drivers who already aren't paying attention while using a conventionally driven car will only be worse with an AP car. They'll use it as an excuse to pay even less attention. And that will work fine, until they need to take control of the vehicle from AP but fail to notice it is about to make a mistake. Will that cause an accident? Not every time. Most of the time the driver of the car they would have run into will react and avoid the accident. But that's also what happens now. The problem is that the AP/driver combination seems to me to be putting more burden on the rest of the cars on the road.

That's why Tesla needs to get this right. Not just release it when it has a bit of functionality kind-of-right.
 
The 2.50.185 code that Tesla released is Alpha code, not Beta code — e.g., it limits Autopilot to 35 mph. Beta code is pre-production code that is identical to what is going into production.

It will be great when we get the version 8.1 update, which is what was supposed to be delivered at the end of December. Yes, I am aware of the words "expected" and "regulatory" that are in the design center.
 

Active learning off of consumer drivers correcting AP mistakes on the road shouldn't be a common way of "teaching" AP to do basic tasks. Tesla should only be relying on this sort of training once the system is nearly perfect and once all of the other methods of "teaching" (in-house testing, data collection from cars with AP hardware but without functionality, normal debugging) have been absolutely exhausted. This sort of on-the-road fleet learning with untrained consumer drivers really isn't fair to the safety of other drivers on the road. And we don't really know how much of this fleet learning is going on. It feels like Tesla (and only Tesla; none of the other manufacturers is doing this) is using "fleet learning" as an excuse for releasing products before they are ready and for cutting the amount of time and money put into testing.
 
I find the psychology of this fascinating...the need for owners and fans to apologize and develop excuses and defenses for a maker of a consumer product. Are you seriously defining words for them?

It's like Stockholm Syndrome for expensive products.

"expect" verb
-
regard (something) as likely to happen.

However, your referral to Stockholm Syndrome does seem to have some applicability. However more likely simply

"rationalize" verb
-
attempt to explain or justify (one's own or another's behavior or attitude) with logical, plausible reasons, even if these are not true or appropriate.
 

I don't like the notion of perpetual beta in any context. It's a relatively new thing and confuses the meaning of "beta."

But aside from that, there's a big difference between a free-to-the-consumer web-based email system being in perpetual beta and a control system in an expensive car being in perpetual beta.

Cars are way too dangerous to be released in a "we're not sure if this works" state. The consequences of a failure are way bigger than the consequences of a free email system failing.

Companies need to stand behind their products.