
Autopilot is already improving.

I was going to jokingly suggest the car is learning based on loud swearing AND steering wheel torque at the same time. But then I thought about it. Could Tesla ACTUALLY EMPLOY voice commands combined with GPS tagging so you could guide the car's behavior, much like you'd train a dog?

Bad Tesla. Now you go in the garage and think about what you've done. :p
 
There is another Tesla owner up here as well. It will be very interesting to see whether it eventually starts working as you get more passes.

So far, the display gives no indication that the AP knows where the edges of the road are, anywhere in the neighborhood.

Talking about behavior on curving roads on hills, there is one stretch of road near my house where AP refuses to hold the set speed, even though the curve and elevation changes are minimal, the roadway is very well marked, and the display shows that the AP knows perfectly well where the lane is. Instead, even while steering smoothly, the car lurches a bit along the longitudinal axis, constantly changing speed by one or two mph at a time. It's weird, and quite repeatable.

I should add that this is new behavior since 7.0 was installed. Previously, TACC had no trouble holding speed through that section.
 
I think we have to guess YES, since non-AP-equipped cars are said to be adding to the cloud learning data (pretty sure even EM stated this).

Scott, I think you contradict yourself here. The question was whether it occurs only when in AP mode. If non-AP-equipped cars are adding... that doesn't jibe.

And as a follow up, where did you hear that non-AP equipped cars are adding to the learning? I hadn't seen that.

My argument, based on the fact that I build machine learning models in real life, is that this is a supervised learning problem and, as such, it learns when it has a labeled set/supervisor. So I feel it learns the most when AP is OFF, or at the moment it is turned off. That said, there is some reinforcement learning that can be done while in AP mode, if that's how it's coded.
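To make that concrete, here is a minimal sketch of "human driving as the label," assuming a toy linear model and made-up feature data (nothing here is Tesla's actual pipeline): sensor-derived features are the inputs, and the human's steering angle recorded while AP is off is the supervised target.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Hypothetical logged data: each row holds camera-derived features for one frame
# (e.g. lane-edge offsets, curvature), captured while a HUMAN is driving.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))                      # features per frame
true_w = np.array([0.8, -0.3, 0.1, 0.0])            # the human's implicit policy
y = X @ true_w + rng.normal(scale=0.05, size=1000)  # human steering angle = label

# Supervised learning: fit a model mapping sensor features -> steering angle.
model = Ridge(alpha=1.0).fit(X, y)

# At inference time (AP engaged) the model predicts steering from the same
# kind of features, with no human label available.
print(model.predict(X[:3]))
```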
 
And as a follow up, where did you hear that non-AP equipped cars are adding to the learning? I hadn't seen that.

In the press call about AP, Elon said the fleet learns from all the cars in the fleet, even non-AP cars. Personally, I think that is like how we "all get" hill hold and performance improvements: by "all," Elon meant cars with the AP hardware currently being produced, even if they were ordered without the convenience software (Autopilot) enabled.
 
My argument, based on the fact that I build machine learning models in real life, is that this is a supervised learning problem and, as such, it learns when it has a labeled set/supervisor. So I feel it learns the most when AP is OFF, or at the moment it is turned off. That said, there is some reinforcement learning that can be done while in AP mode, if that's how it's coded.

Yes, this makes intuitive sense to me. AP is being trained by its human masters; Elon said exactly this in the press conference. This is "learning by example" rather than "learning by doing." AP probably does not need "muscle memory" in its effector systems; it primarily needs to learn to classify (interpret) its sensor data better, by validating its models against trained human behaviour in each situation it encounters.

I still maintain, though, that there is a place for simple surveying of roads and map-building, even by non-AP cars, simply by accumulating GPS "breadcrumbs" from the total fleet of cars (mostly) rationally driven over a million miles of roads per day. I would thus distinguish between a perceptual "model" of lane recognition and prediction, and simple high-accuracy mapping as a "backbone" to constrain the interpretation of that perceptual model.
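As a toy illustration of the breadcrumb idea (the data format and grid are invented for the example), snapping many noisy fleet GPS fixes to bins along a road and averaging makes the noise cancel, leaving a high-accuracy centerline:

```python
import numpy as np

# Hypothetical breadcrumbs: (x, y) fixes from many cars on the same road,
# each with a few meters of GPS noise around the true lane centerline.
rng = np.random.default_rng(1)
true_path = np.stack([np.linspace(0, 1, 200), np.sin(np.linspace(0, 3, 200))], axis=1)
breadcrumbs = np.concatenate(
    [true_path + rng.normal(scale=0.02, size=true_path.shape) for _ in range(500)]
)

# Snap each fix to a bin along the road and average: the noise cancels out,
# and the averaged estimate converges on the true path.
bins = np.clip((breadcrumbs[:, 0] * 100).astype(int), 0, 99)
centerline = np.array([breadcrumbs[bins == b].mean(axis=0) for b in range(100)])

print(centerline[:3])
```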
 
I wonder if we should be letting the AP "fail" as a method for Tesla to gain data on what triggered the fail event (versus deactivating AP when approaching a tricky area). I'm talking about letting the AP start to take an exit ramp instead of the intended course on the freeway, and then, at the last moment, driver-correcting. Of course, only when absolutely safe to do this. For liability reasons I doubt Tesla would ever ask us to do this. When the AP fails while being pushed to the limit, it should provide very useful data for making the future software even better.
 
I haven't seen this question in the thread (sorry if I missed it), but does the AP learning occur only when in AP mode? So no stealth learning if AP is off?

I think we have to guess YES, since non-AP-equipped cars are said to be adding to the cloud learning data (pretty sure even EM stated this).

Scott, I think you contradict yourself here. The question was whether it occurs only when in AP mode. If non-AP-equipped cars are adding... that doesn't jibe.
<snip>

I read "stealth learning if AP is off" which I correlated to non-AP equipped cars so that was the angle of my answer which I think it still valid. The follow up remark about that it may only work for current non-AP cars is quite plausible to me tho.


I wonder if we should be letting the AP "fail" as a method for Tesla to gain data on what triggered the fail event (versus deactivating AP when approaching a tricky area). I'm talking about letting the AP start to take an exit ramp instead of the intended course on the freeway, and then, at the last moment, driver-correcting. Of course, only when absolutely safe to do this. For liability reasons I doubt Tesla would ever ask us to do this. When the AP fails while being pushed to the limit, it should provide very useful data for making the future software even better.
That makes perfect sense to me. The driver overriding the AP would be a perfect automatic trigger for the car to know it needs to capture exception data.
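A hedged sketch of what that trigger could look like (the frame fields, threshold, and buffer size are all made up for illustration): keep a rolling buffer of recent frames, and when the driver's steering diverges from what AP commanded, snapshot the window as exception data.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Frame:
    ap_steering: float      # what AP commanded
    human_steering: float   # what the driver actually did
    sensor_blob: bytes      # camera/radar snapshot (placeholder)

THRESHOLD = 0.15            # hypothetical divergence threshold, radians
buffer = deque(maxlen=100)  # rolling window of the last ~100 frames

def on_frame(frame: Frame, exceptions: list) -> None:
    """Buffer every frame; capture the window when the driver overrides AP."""
    buffer.append(frame)
    if abs(frame.human_steering - frame.ap_steering) > THRESHOLD:
        # Driver override detected: capture the surrounding context for upload.
        exceptions.append(list(buffer))

# Usage: feed frames in; only a divergent one triggers a capture.
log: list = []
on_frame(Frame(0.02, 0.01, b""), log)   # agreement -> nothing captured
on_frame(Frame(0.02, 0.40, b""), log)   # override  -> window captured
print(len(log))  # 1
```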
 
I still maintain, though, that there is a place for simple surveying of roads and map-building, even by non-AP cars, simply by accumulating GPS "breadcrumbs" from the total fleet of cars (mostly) rationally driven over a million miles of roads per day. I would thus distinguish between a perceptual "model" of lane recognition and prediction, and simple high-accuracy mapping as a "backbone" to constrain the interpretation of that perceptual model.

I don't disagree with this outright, and think there are probably multiple layers. That said, the thing that defines machine learning is generalization. The goal isn't to have the machine learn very specific actions for very specific places. The benefit is to be able to generalize well to new inputs.

I wonder if we should be letting the AP "fail" as a method for Tesla to gain data on what triggered the fail event (versus deactivating AP when approaching a tricky area). I'm talking about letting the AP start to take an exit ramp instead of the intended course on the freeway, and then, at the last moment, driver-correcting. Of course, only when absolutely safe to do this. For liability reasons I doubt Tesla would ever ask us to do this. When the AP fails while being pushed to the limit, it should provide very useful data for making the future software even better.

I have seen this type of comment a number of times, so I'm not picking on you here. But not only is this concept scary and borderline unsafe, it's generally incorrect. As a simple example, imagine teaching a child how to cross the street. You teach them to stop at the curb, look both ways, then cross. You don't run them halfway into the street, scream, and run back. You want to model the correct behavior from the start.

If the model is indeed learning when humans are driving, and I feel quite confident that this is the case, the best thing to do is to drive on unrecommended AP roads (twisting two-lane highways, etc.) with AP off, and drive your absolute best each time. Think of it as setting a proper example.
 
I wonder if we should be letting the AP "fail" as a method for Tesla to gain data on what triggered the fail event (versus deactivating AP when approaching a tricky area). I'm talking about letting the AP start to take an exit ramp instead of the intended course on the freeway, and then, at the last moment, driver-correcting. Of course, only when absolutely safe to do this. For liability reasons I doubt Tesla would ever ask us to do this. When the AP fails while being pushed to the limit, it should provide very useful data for making the future software even better.

I am not sure this is true. I do not think AP needs to actually fail; it just needs to make a prediction or plan, and then compare it to what YOU actually do. If the two diverge, that is the trigger to refine and learn.

As an analogy, it is like a human teacher asking a human student "what WOULD you do in this situation?", rather than actually having them demonstrate the plan, fail, and then correct them. The teacher can just explain the failure of the PLAN, and the student learns the lesson.
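That "what WOULD you do" idea is sometimes called shadow mode, and a toy version is easy to sketch (the model, data, and learning rate are all invented here): AP predicts a steering angle each frame without acting on it, and only frames where the plan diverges from the human become training updates.

```python
import numpy as np

# Hypothetical shadow mode: AP is NOT engaged, but still plans every frame.
rng = np.random.default_rng(2)
w = np.zeros(4)                 # toy linear steering model, initially untrained
lr, threshold = 0.01, 0.1

for _ in range(2000):
    features = rng.normal(size=4)                       # this frame's sensor features
    human = features @ np.array([0.8, -0.3, 0.1, 0.0])  # what the driver did
    plan = features @ w                                 # what AP WOULD have done
    if abs(plan - human) > threshold:
        # Divergence is the learning signal: nudge the plan toward the human.
        w += lr * (human - plan) * features

# The model converges toward the human's behaviour without AP ever failing
# on the road; once plan and human agree, the updates naturally stop.
print(w)
```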
 
All I know is that I just rented a 2015 Tesla tonight for a week to see if Autopilot is really up to snuff to justify the purchase - and it's incredible. I drove in the right lane by several off-ramps and it did not even try to dart to the right. This is on a stretch of the I-10 eastbound in the Inland Empire between Ontario and Palm Springs.
 
All I know is that I just rented a 2015 Tesla tonight for a week to see if Autopilot is really up to snuff to justify the purchase - and it's incredible. I drove in the right lane by several off-ramps and it did not even try to dart to the right. This is on a stretch of the I-10 eastbound in the Inland Empire between Ontario and Palm Springs.

Another data point suggesting that fleet learning is already underway, even without a 7.01 release. Unless, of course, every exit on that stretch of highway has a dashed line defining the start of the exit lane: in that case, the correct behavior seems to have been built into the initial release of 7.0.
 
So far, I find Autopilot has a lot of difficulty with low median strips that have angled verticals (like mini ramps). We have lots of these on major through roads, and they are not painted. Many times now I have had to take control because it was steering straight for the median. Taking the middle lane of three was the most predictable of all, bounded on both sides by a continuous dashed line.
 
So far, the display gives no indication that the AP knows where the edges of the road are, anywhere in the neighborhood.

Talking about behavior on curving roads on hills, there is one stretch of road near my house where AP refuses to hold the set speed, even though the curve and elevation changes are minimal, the roadway is very well marked, and the display shows that the AP knows perfectly well where the lane is. Instead, even while steering smoothly, the car lurches a bit along the longitudinal axis, constantly changing speed by one or two mph at a time. It's weird, and quite repeatable.

I should add that this is new behavior since 7.0 was installed. Previously, TACC had no trouble holding speed through that section.

I'm a little confused by what you wrote above.

On the one hand it sounds like this reduction in speed is happening with just TACC on, and without Auto Steer Beta engaged, because you say that TACC had no trouble holding the speed, and also say that around your neighborhood the AP gives no indication that it knows where the edges are.

But on the other hand, when describing the issue of the car slowing down, you talk about the roadway near your house being well marked, and the display indicating that AP knows where the lane is.

So is this "slowing down" behavior occurring with or without Auto Steer Beta engaged?

Thanks.
 
I have an interesting 'fail' report from this morning, at the right exit on a left-hand curve pictured in post 180 of this thread. AP has been handling it fine, but this time there was a car slowed to a near stop at the end of the right-turn lane. Because of the left-hand curve into the road, the entry into the turn lane is straight ahead. My car interpreted the nearly stopped car as an obstacle in my path, hit the brakes hard, and disengaged AP. Because I guessed it might do something like that as soon as I saw the other car, I had my hands nearly touching the wheel and had already checked that there were no cars behind me. I was looking out the windshield, not at the dash, so I'm not sure if it displayed "hands on wheel" or "take control", but I heard the chime for AP disengagement the instant it hit the brakes. AP has been working well at that spot in recent days. I guess it just shows that driving is complicated from an algorithmic point of view.
 
There will always be a lot of situations where Autopilot won't be able to make out what is happening on the road. This is especially true since 99% of the cars out there are driven by humans, all thinking different thoughts! That being said, even up here in Calgary, Canada, where we don't have too many Teslas roving around, I have noticed Autopilot learn the routes I drive most, and it now drives them perfectly whenever I engage it. This is a far cry from when I first started using Autopilot, since our lane markings aren't the greatest. Because of this, I am inclined to agree that the learning is indeed happening. A few tugs on the steering wheel correcting the car are all it needs to "train" your car to do the right thing at that point on the map. The thought of training your car like this nearly blows my mind.
 
Interesting day today with AP.

I took a route I tackle weekly and is the most AP stable of my usual routes.

The car did terribly. Really, really terribly. At times no lane markers would display on my dash, where on previous trips they showed up with no problem.

At first I thought it could be glare from the sun, seeing as I was taking the route a little earlier than usual. However, the same thing happened much later in the day.

The only difference was wind: 30-40 mph wind.

Wonder if that might have caused some trouble. Will know more tomorrow when I take another trip on a different road. Maybe it's the sensors.
 
At first I thought it could be glare from the sun, seeing as I was taking the route a little earlier than usual. However, the same thing happened much later in the day.

Just throwing this out there, but did you have a dirty windshield? I wonder how much that affects the AP camera mounted behind the mirror. Remember that where we see an image of the road, a computer sees a constantly changing array of numbers indicating the intensity of each color for each pixel. Generally they're quite good at removing noise from dirt or haze, but just figured I'd ask.
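To illustrate the point (a toy example, not the actual camera pipeline): an image really is just an array of per-pixel intensities, and dirt or haze shows up as noise that even a crude averaging filter can partially remove.

```python
import numpy as np

# A grayscale "road image" is just a 2-D array of intensities in [0, 1].
rng = np.random.default_rng(3)
clean = np.tile(np.linspace(0, 1, 64), (64, 1))   # toy image: smooth gradient
hazy = np.clip(clean + rng.normal(scale=0.2, size=clean.shape), 0, 1)

# 3x3 mean filter: average each pixel with its neighbours to suppress noise.
padded = np.pad(hazy, 1, mode="edge")
denoised = sum(
    padded[i:i + 64, j:j + 64] for i in range(3) for j in range(3)
) / 9.0

# The filtered image is measurably closer to the clean one.
print(np.abs(hazy - clean).mean(), np.abs(denoised - clean).mean())
```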