17.17.4

Apparently not. And yet Elon Musk says that by the end of 2017 this car will drive unattended from L.A. to New York. I think that's about as likely as it driving from London to New York.

I'm very down on AP2 performance so far (far below AP1 parity after a 7-month wait); however, the effort to develop more autonomous capability, which will find its way into the FSD feature, is probably largely independent of Tesla's effort to reach AP1 parity. They are basically focused on replacing the lost functionality of the Mobileye chipset (and the AP1 approach), while in parallel training the full sensor suite to handle more advanced tasks with more advanced machine-learning-based approaches. I'm optimistic that we will soon see major strides in "intelligent" driving behavior (EAP and pre-FSD/L4), particularly for those who purchased FSD (like me ;) ).
 
  • Like
Reactions: Helmuth
FWIW, I've been on 17.17.4 for a while, and I seriously believe things are improving. Last weekend, I noted (for the first time) a consistent pattern of correction of lane position when cars/trucks are detected in neighboring lanes (away from them). I did not see this in early 17.17.4. I may be imagining things, but I really believe that incremental improvements have been happening within the release.
 
  • Like
  • Informative
Reactions: NerdUno and croman
I don't think I've ever seen Tesla claim that our cars' behavior improves without a new release being downloaded.

Yes, but I too have noticed 17.17.4 consistently able to handle situations where it failed when I first got it. Not sure why it would "improve" when on the same roads under the same conditions, at the same time (in the same lane, travelling approximately the same speed).
 
Not sure why it would "improve" when on the same roads under the same conditions, at the same time (in the same lane, travelling approximately the same speed).

It's not really possible to re-create the same conditions. The light changes constantly in ways that can be important to a camera but not to the human eye. The traffic pattern around you changes constantly too. That would imply that half the time things would get worse, not better, but our brains may be biased toward noticing improvements more.

I think if Tesla actually had this "learning" feature they would be bragging about it, not hiding it. The car can certainly remember places: it opens my garage door when it nears the house, and raises the suspension in places where I've raised it in the past. Do they use this data to fine-tune navigation? I'm a skeptic.
 
  • Like
Reactions: Swift
FWIW, I've been on 17.17.4 for a while, and I seriously believe things are improving. Last weekend, I noted (for the first time) a consistent pattern of correction of lane position when cars/trucks are detected in neighboring lanes (away from them). I did not see this in early 17.17.4. I may be imagining things, but I really believe that incremental improvements have been happening within the release.
I think you are imagining things. There are times when it seems a little better, but also times when it seems a lot worse.
 
  • Like
Reactions: Tomnook
Curious, at what speed? I too saw my MS move left in the lane, but at 80 MPH, it was after I passed a truck to my right. Noticed it twice. Will have to try at a lower speed...

Yes, I agree that sometimes the adjustment comes a little too late, but at least there is a correction now, which I didn't see before. I'm referring to highway speeds and also in dense urban downtown multi-lane traffic (<< 25 MPH).
 
  • Love
Reactions: Snerruc
That sounds like the Side Collision Warning system got triggered... (Which you have even without paying for EAP.)

Yes, thank you @MP3Mike for finding that; that's exactly what I'm experiencing.

After you posted but before I read your post, it happened to me again, and here are the notes I took immediately for accuracy:
  1. For reference (as I stated upthread), I have none of the optional self-driving assistance features; double-pulling back on the left stalk (I forget exactly which stalk) does nothing at all on my car, ever. I did not pay for EAP or anything similar, and I don't have it. What this means is that the car should never have been driving itself at any time, and yet it did, wrongly (keep reading for details).
  2. I was once again heading northbound on Highway 17 in the hills (and of course almost all of that is in curves, so I was naturally in a curve, as usual).
  3. Ever since the Highway 17 upgrade, most of the median has been modified to have concrete barriers. Most of that barrier is immediately adjacent to, or within a foot (12") of, the traveling portion of the left (#1) lane, where you must be since the right (#2) lane is immediately to the right of that space. That is actually much more room than there used to be -- previously there was only about 1/2" to 2" of space (half an inch to two inches) between the barrier and the part of the lane you must drive in. It also used to be less bumpy and cleaner there; they recently reworked the road and left that 3" to 10" strip dirty, bumpy, and not as well lined up, so it is actually more dangerous now than when it was a clean 1/2" to 2".
  4. I was in that left lane.
  5. I was not going to hit the concrete barrier
  6. My steering through the road curve was properly directed (by me) before and up until this happened.
  7. The car took control of the steering and pushed me to the right, over the white line dividing the lanes, definitely placing part of the car into the right lane and out of my lane.
  8. I counter-steered against the car strongly enough that it let me regain control and steer back into my own lane (where I already had it). In other words, I think my input engaged a manual override of the car's steering.
  9. I still want to know if there is a physical override capability or if it is just a logical software override. Now that I know at least there is some override, I don't feel as afraid of it.
So, here are some of the negatives:
  • It was wrong both times it happened; I was not too near the edge of the concrete barrier (I was definitely at least an inch or two away, more likely about six to ten or even twelve inches away), I was not going to hit it, and I was in a smooth curve as intended by the highway both times (but what if I had been switching lanes when this happened? That would have been even more to deal with!)
  • Both times, it shoved me toward the other lane: once with a car there, and once over the white line where there could have been a vehicle (if there had been something my car could get caught against, it could have been a complete deathtrap).
  • It decided a lane with unknown traffic was safer to shove me into than either the proper lane I was already completely successfully in or a semi-smooth-ish, solid-ish concrete barrier on the left.
Here are some of the not so negatives:
  • I could override its error
  • No, that's it; I thought I could identify more positives. I can't.
Perhaps I wanted to write that it seemed to have some sense of trying to protect me, and that it seemed plausible this was something along those lines, but that doesn't actually add up to a positive.

Final comment:
  • It steered away from an actual concrete obstacle rather than some hallucination, although it might still have been hallucinating exactly where that concrete obstacle was. Come to think of it: is it possible it saw some kind of mirage? That could explain it getting the position wrong. Still, its reaction was less mature than its original error: the original error was believing there was some danger on my left, away from my lane and away from my direction of travel; its reaction was to shove me into an occupied lane on the right, even though I was already in a perfectly good lane.
I reported the bug the first time, but not the second. But, this is not a good bug!


This is the third time I can recall that my Tesla has decided to drive on its own in ways I did not tell it to drive, and all three times it could have caused an accident, the first one being the most potentially deadly of them but all of them very potentially dangerous. Luckily I've never had an exact repeat of the first time; I immediately reported that bug, and I wonder if they tried to fix it and succeeded to some degree.
 
Last edited:
  • Informative
Reactions: Sawyer8888
I wonder if they tried to fix it and succeeded to some degree.

I don't think they have the ability to fix anything on the vehicle without downloading a new firmware version.

Thanks for this very thorough description. The configuration of the lanes and barrier as described aren't all that unusual, and the phenomenon isn't being reported widely, so that suggests that your "mirage" theory may be correct, i.e. that something about that particular topology is making the computer see something that isn't there.

If the side collision avoidance system is really capable of pushing the vehicle into an occupied lane, that raises serious questions even if it had been avoiding a legitimate threat. It's a version of the Trolley Problem ethical thought experiment (Trolley problem - Wikipedia), with the wrinkle that the programmers have plenty of time to decide whom to sacrifice rather than needing to make an impulsive decision.

Emergency braking, in the face of an inevitable collision, is almost certainly the right thing to do at all times. Emergency steering is a whole different thing.
 
For the record: my notes are timestamped May 26, 2017, 07:12 AM, so that's after it happened. I looked that up in TeslaFi, and that point is way out in the flatlands, so I think I must have experienced it up to a dozen minutes earlier.

I just looked at a spot after Laurel Curve where I seem to recall it happening, but I don't know if that was the first or second time, or whether it was either time; that spot is at around 7:03 AM on that morning's drive. I entered the hills at 7:01 AM and exited them at 7:12 AM. It makes sense that I last edited my notes at the first non-hill straightaway of the trip (for safety; I don't avert my attention from the active driving portion of the road); I used Google Voice-to-Text to enter the data into Notes.
I don't think they have the ability to fix anything on the vehicle without downloading a new firmware version.
Well,
  1. They can. They probably don't often.
  2. I wasn't claiming they did; that was many versions ago (when I was crossing traffic to get onto a side road and it didn't let me accelerate across the lanes after I was already partially into oncoming highway traffic, not very coincidentally on the same highway). Here's my post: My first "automated assistence" experience almost killed me.
Thanks for this very thorough description. The configuration of the lanes and barrier as described aren't all that unusual, and the phenomenon isn't being reported widely, so that suggests that your "mirage" theory may be correct, i.e. that something about that particular topology is making the computer see something that isn't there.
I see more and more barriers like that on roads all over the state (of California). I don't know how many are in the hills on curves. Highway 17 has a plethora of them.
If the side collision avoidance system is really capable of pushing the vehicle into an occupied lane, that raises serious questions even if it had been avoiding a legitimate threat. It's a version of the Trolley Problem ethical thought experiment (Trolley problem - Wikipedia), with the wrinkle that the programmers have plenty of time to decide whom to sacrifice rather than needing to make an impulsive decision.
  1. I don't want to repeat the experience just to prove anything. I'd rather not experience it again than gather evidence.
  2. If I could go back in time and have temporary access to sufficient resources to place proper cameras to document what happened in place, and document what happened that way, I would. Primary question I'd want answered: how far both times over the line did it send me?
And the primary question I want answered can still be answered without time travel: is the override all software, or is there a hardware override? That is, is the car fully capable of ignoring me, and therefore fully capable of killing me, even when I fully understand the situation and am physically steering to prevent it?

That's not the Trolley problem at all; that's just plain murder.
Emergency braking, in the face of an inevitable collision, is almost certainly the right thing to do at all times. Emergency steering is a whole different thing.
Yet, every time someone decides to go excessively slowly in front of me and my early-warning beeping goes off, I'm the one who has to react, not the car. (I have that warning set to super early, and I find it actually pretty good, but it doesn't catch it every time.) (The "super early" setting is really the latest possible useful time to hear that warning; Tesla sure likes cutting everything too close.)
 
Last edited:
That's not the Trolley problem at all; that's just plain murder.

I understand, but it brought to mind this: What if the barrier was really, e.g., a cement truck about to hit you from the left, and there was a motorcycle alongside you on the right? Should the system leave you in your lane and let the truck kill you, or swerve to the right and kill the motorcyclist?

My point is that programmers actually need to make these decisions a priori. The current situation, where the driver makes a split-second decision, is much less ethically fraught.
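To make "a priori" concrete, here is a minimal, purely hypothetical sketch (not anything Tesla or any automaker has published) of what baking such a decision into software looks like. The Option type, the risk numbers, and the occupant-first ranking are all assumptions for illustration only.

```python
# Purely illustrative sketch -- NOT any manufacturer's actual logic. It only
# shows how an "a priori" policy forces the programmer to rank outcomes
# before the fact.
from dataclasses import dataclass
from typing import List

@dataclass
class Option:
    name: str             # e.g. "brake in lane", "swerve right"
    occupant_risk: float  # hypothetical estimated risk to the car's occupants (0..1)
    bystander_risk: float # hypothetical estimated risk to people outside the car (0..1)

def choose_maneuver(options: List[Option]) -> Option:
    """Pick a maneuver using a fixed, pre-decided ranking.

    This hypothetical policy protects the occupant first and uses bystander
    risk only to break ties. The ranking itself is the ethical decision, and
    it is made at programming time, not at the moment of the crash.
    """
    return min(options, key=lambda o: (o.occupant_risk, o.bystander_risk))

# Example: cement truck about to hit from the left, motorcycle alongside on the right.
options = [
    Option("brake in lane", occupant_risk=0.6, bystander_risk=0.0),
    Option("swerve right",  occupant_risk=0.2, bystander_risk=0.9),
]
print(choose_maneuver(options).name)  # -> "swerve right" under this ranking
```

Under that ranking the sketch swerves toward the motorcyclist; reverse the tuple order and it sacrifices the occupant instead. Either way, somebody had to choose the ordering long before the crash, which is exactly the a-priori burden I mean.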

Sorry for the OT. WRT your steering wheel question: I don't know, but I think a determined human can always overpower the decisions made by the software. We really need a statement from Tesla about that. We don't need another HAL doing what's best for us.
 
  • Love
Reactions: NerdUno
I understand, but it brought to mind this: What if the barrier was really, e.g., a cement truck about to hit you from the left, and there was a motorcycle alongside you on the right? Should the system leave you in your lane and let the truck kill you, or swerve to the right and kill the motorcyclist?

My point is that programmers actually need to make these decisions a priori. The current situation, where the driver makes a split-second decision, is much less ethically fraught.
Absolutely true. For the record, Mercedes already answered that question by saying they will always save the driver/passenger of the car making the decision. It's a really easy way to decide that ethically, and isn't fraught with confusion at all: I pay for a car to do what's right for me.

Since Elon's the sort of person to waffle on this type of thing, and may decide against the car owner, I may be forced back into Mercedes at some point.
WRT your steering wheel question: I don't know, but I think a determined human can always overpower the decisions made by the software. We really need a statement from Tesla about that. We don't need another HAL doing what's best for us.
I agree.
 
  • Helpful
Reactions: NerdUno
I want to continue to praise Forward Collision Warning as experienced on my AP2 HW Model S running 17.17.4 firmware. I have it set to EARLY.

As a probability input, it helps my reaction time.

This morning on Hwy 17 (northbound over the Santa Cruz Mountains), the Forward Collision Warning (which I have set to EARLY) went off (sounded its alarm) at the same time as I realized there was a sudden traffic slowdown in front of me, and as a parallel input I'm pretty certain it helped confirm the situation across my multiple mental pathways and let me react faster. I not only was able to stop in time, but I was also able to get on the horn quickly enough to warn the blue car behind me that I was coming to a sudden slowdown. (Yes, that's how I use my horn, and it works great, this time being no exception, with the blue car behind me diverting around mine (I was in the left lane and they shifted to the right lane). I think it would be a good idea if horns were installed in the rear of all cars to blare out the back as well as the front, since most of the people I ever need to warn are behind me.)

As before, I want to warn people not to start to depend on this feature. I've caught it failing to warn me of sudden slowdowns a number of times (while I was paying full attention and had no problem reacting, but in situations where I judged it probably should have alarmed if it were interpreting things correctly); many of those instances had visual complexity that probably confused the baby programming of our AI cars, but again, that's no excuse not to pay full attention to the driving. In more than one instance, though, it told me of things going on in front of me while I was in the middle of a quick mirror check, and got the information to me faster than I would have myself, greatly improving my net reaction time; as a cognitive aid it is great as well (as I explained above in one example).

Forward Collision Warning (set to EARLY, which does give lots of hand-holding warnings), properly used as additive assistance (not replacement assistance), is a great aid, and I recommend it to everyone. (I specifically do not recommend setting it to anything but EARLY, because even EARLY is just on the cusp of usefulness with its current timing parameters, so I can't imagine any later warning being of actual utility, except to confirm how much of a dolt you are.)

It is available on all AP2 HW cars running 17.17.4 and many earlier firmwares, regardless of optional software assistance packages; I purchased none. I presume some similar feature is available on AP1 HW cars, and I'd highly recommend looking into its usefulness as well if that is your situation, but I'm here to report the HW2 version working well as additive assistance.

The sound it makes is 3 chimes. I doubt that's of material importance; it is intuitive enough to alert you to what's going on. I guess I should caution you not to look down at the dash because of the chimes; doing so would assuredly cause a horrific crash, but that should be obvious to any logical person.
 
Last edited: