
Firmware 7.0

On a different note, apparently 7.0 is going to be released Friday, Tesla time.
I sent my Autopilot observations, complete with Google Maps Street View links, to the service address suggested upthread. The response was along the lines of "we've sent it to the person responsible, thanks for your input." I'll be interested to see if they ask any follow-up questions. Good to see that they actually do accept email and respond with confirmations... I have a hard time believing other car companies would do as much!

However, with the Musk tweets you included, I'm wondering where best to send 7.1 requests. The fewer forwards needed to reach the right person, the better the odds of the message actually getting there.

I've got plenty of thoughts, believe me... from bringing back the 'summary' of what's going on in the climate control system on that auto button, to showing actual power on the power app (if it's going to stay small and somewhat useless as we've seen in 7.0).

And yeah, that whole metric vs. imperial pressure thing... ;-)
 
Let's stop with the metric vs imperial discussion and try to stay on topic please.

Good idea, thanks.

On a 7.0-related note, autosteer still hasn't learned to take the curve properly for my commute from San Francisco to Berkeley (I-80 east to I-580W/80E onramp in the carpool lane). It still goes straight through without following the lane markings, which are very clear and solid lines (yellow on left, white on right) so please be careful if you take the same route!

EDIT: here's a satellite map of the split between the carpool and non-carpool lanes I'm talking about (the red circle is where the car goes straight instead of staying in the left-most carpool lane):


[Satellite image of the carpool/non-carpool lane split]
 
On a 7.0-related note, autosteer still hasn't learned to take the curve properly for my commute from San Francisco to Berkeley (I-80 east to I-580W/80E onramp in the carpool lane). It still goes straight through without following the lane markings, which are very clear and solid lines (yellow on left, white on right) so please be careful if you take the same route!
Do you have a Google Maps link to that spot? It sounds like you're seeing exactly the opposite behavior to what I was noting upthread... in my case, the car was following the yellow line far too literally while ignoring the fog line.

Like so many technical things, it would be nice if a more detailed explanation of how the system thinks and how it interacts with crowd-sourced information was available... the release notes are skinny on the details.
 
Do you have a Google Maps link to that spot? It sounds like you're seeing exactly the opposite behavior to what I was noting upthread... in my case, the car was following the yellow line far too literally while ignoring the fog line.

Like so many technical things, it would be nice if a more detailed explanation of how the system thinks and how it interacts with crowd-sourced information was available... the release notes are skinny on the details.

Yep, just updated my post with an image of the lanes as you were writing yours!
 
Yep, just updated my post with an image of the lanes as you were writing yours!
So tell me if I'm understanding the problem correctly... you're in the lane with the diamond paint mark, heading from left to right. The yellow line above the diamond lane separates left-right and right-left HOV cars (so there are three lanes headed right and one lane headed left). Your car wants to head straight through the triangular no-man's land to carry on with the two lower lanes? I'm assuming you manually correct so you don't know where it would eventually take you if left to its own devices?
 
I've read through as much of the thread as I have time for today (got due dates on my code, you see), and wanted to give a few warnings about things I've seen trip up lanekeeping/autopilot systems (specifically, the one on my Ford, which from what I've read in this thread is like a moronic older cousin of Tesla's system). If any of these have been mentioned, please disregard; my apologies, as I don't have any more time to read!

1) Blacktop road patches seem to confuse the system at certain times of day. I'm not talking so much about the square ones as the long, narrow ones.
1.5) The water that sometimes gathers in these "gutters" before they get patched.
1.75) Those black crack-seal stripes. They get wet and look lighter than the road when you drive into the sun.
2) At night, when the dew has settled heavily or it's rained recently, it'll sometimes confuse the tracks of other cars for lines (because of the oncoming headlight reflections).
3) As people have mentioned, it loves to follow off-ramps, especially at the bottom of hills at night.


I realize that obviously Tesla will completely kick my Fusion Energi's (hey, it's what I could afford at the time, don't look at me like that!) butt at lanekeeping/Autopilot because "duh", but these are things I've seen fool the system, and on occasion even make me squint to figure out what the actual lane markings were. So I figure if I can warn ahead and save someone from shaving a few years off their life out of fright, then it's worth mentioning them.
 
So tell me if I'm understanding the problem correctly... you're in the lane with the diamond paint mark, heading from left to right. The yellow line above the diamond lane separates left-right and right-left HOV cars (so there are three lanes headed right and one lane headed left). Your car wants to head straight through the triangular no-man's land to carry on with the two lower lanes? I'm assuming you manually correct so you don't know where it would eventually take you if left to its own devices?

Almost nailed it. There are 3 lanes going in the same direction (left-to-right on the map in the top-down view), and no lanes in the other direction - those are on the other side of the median dividing the highways.

The car not only wants to head straight but actually does. Yesterday morning, I let it do just that to see what would happen (no other cars were around me) and it started turning only in recognition of the center lane, which is interesting because this meant it did read the lanes - just not the right one. I do end up correcting it since it's a fairly tight turn for highway speeds (60+ mph).

I'll try to take a video of this tomorrow morning, but I can't guarantee it will come out in a way that's meaningful enough to interpret the behavior. If anyone has tips on where to mount a GoPro to properly capture this, I would greatly appreciate it!
 
I've read through as much of the thread as I have time for today (got due dates on my code, you see), and wanted to give a few warnings about things I've seen trip up lanekeeping/autopilot systems (specifically, the one on my Ford, which from what I've read in this thread is like a moronic older cousin of Tesla's system). If any of these have been mentioned, please disregard; my apologies, as I don't have any more time to read!
Actually, I haven't seen any problems matching what you've described... in spite of having expected your scenarios to be trouble.

The camera system seems to be quite good at recognizing paint lines and not being confused by other linear markings on the pavement. I was driving at night in the rain out in the middle of nowhere, on a divided highway last weekend. Lights from the other side were reflecting off the wet road. Some rutting was evident in the lanes and held some water. The lines were not very fresh. Autopilot followed the path correctly, although in the worst areas it did pinball around a little. I actually shut it off because I didn't think I could drive as fast in those conditions as it was managing, should control be dropped back in my lap! The conditions meant I couldn't see far enough ahead to feel safe, and likely explained the pinball effect, as data was being considered from just off the front bumper.

My impression, thus far, is that the vision system is much better than anyone would likely expect. The issues I'm seeing are more related to the decision making part of the equation and understanding what a given set of conditions should trigger as a response.

- - - Updated - - -

Almost nailed it. There are 3 lanes going in the same direction (left-to-right on the map in the top-down view), and no lanes in the other direction - those are on the other side of the median dividing the highways.

The car not only wants to head straight but actually does. Yesterday morning, I let it do just that to see what would happen (no other cars were around me) and it started turning only in recognition of the center lane, which is interesting because this meant it did read the lanes - just not the right one. I do end up correcting it since it's a fairly tight turn for highway speeds (60+ mph).
Gotcha. The yellow line is actually separating you from a generous paved shoulder. My experience has been that the yellow line is respected more than the white fog line, and that centering between the white and yellow is a primary goal. So to have it leave the yellow line AND cross the white is quite remarkable. I have to wonder what the crowd-sourcing data we've heard rumored actually does... and how much input GPS location and mapping have on the system (if any). If others had kept to the right, I wonder if your car was taught to do the same?
 
Gotcha. The yellow line is actually separating you from a generous paved shoulder. My experience has been that the yellow line is respected more than the white fog line, and that centering between the white and yellow is a primary goal. So to have it leave the yellow line AND cross the white is quite remarkable. I have to wonder what the crowd-sourcing data we've heard rumored actually does... and how much input GPS location and mapping have on the system (if any). If others had kept to the right, I wonder if your car was taught to do the same?

Good question, I wonder if others were using the carpool lane to pass other cars in the two right lanes then cut over. Doubtful, since most Teslas around here have the HOV sticker that allows them to use the carpool lane. As to why it would leave the yellow line and cross the white one, it's probably more related to the fact that the road curves right at that point. Perhaps I'm going too fast for it, hmmm....


MarcG, what's the elevation there - or rather, is the road flat or does it rise slightly at that point?

I've found that when I'm cresting a hill, even slightly, it has a tendency to get confused.

It's flat at close to 0 elevation. The uphill starts further up the carpool lane when it becomes an overpass to merge on I-80E/580W, all by itself and separate from the other two lanes that will have since split off into another onramp.
 
Good question, I wonder if others were using the carpool lane to pass other cars in the two right lanes then cut over. Doubtful, since most Teslas around here have the HOV sticker that allows them to use the carpool lane. As to why it would leave the yellow line and cross the white one, it's probably more related to the fact that the road curves right at that point. Perhaps I'm going too fast for it, hmmm....




It's flat at close to 0 elevation. The uphill starts further up the carpool lane when it becomes an overpass to merge on I-80E/580W, all by itself and separate from the other two lanes that will have since split off into another onramp.

Doesn't this suggest the AP is not learning from your corrections?
 
Doesn't this suggest the AP is not learning from your corrections?

Maybe. But it could also mean that not many other Teslas are driving this route to provide more input to adjust the algorithm. Elon said they wouldn't accept just a single person's corrections, in case they were being nefarious or had an emergency or accident that caused them to behave differently. In other areas where AP is improving, it's conceivable that dozens of people passed the same route and their combined inputs are tweaking AP.
 
Maybe. But it could also mean that not many other Teslas are driving this route to provide more input to adjust the algorithm. Elon said they wouldn't accept just a single person's corrections, in case they were being nefarious or had an emergency or accident that caused them to behave differently. In other areas where AP is improving, it's conceivable that dozens of people passed the same route and their combined inputs are tweaking AP.

Seems to me that they ought to have AP learn from a single car for just that car, but not distribute that learning to the fleet without corroboration. That's probably a lot of extra complication and maybe not worth it, though.
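
To make that concrete, here's a toy sketch of the policy - completely made up on my part, not anything Tesla has described: a car applies its own correction locally right away, but nothing is pushed to the fleet until enough independent cars have reported the same thing on the same stretch of road.

```python
# Toy sketch only - the threshold, segment IDs and function names are all
# hypothetical, not Tesla's actual backend.
from collections import defaultdict

FLEET_CORROBORATION_THRESHOLD = 12      # made-up number of independent reports

# road segment ID -> set of vehicle IDs that reported a correction there
corrections = defaultdict(set)

def report_correction(segment_id: str, vehicle_id: str) -> str:
    """Record one car's manual correction and decide how far it propagates."""
    corrections[segment_id].add(vehicle_id)
    if len(corrections[segment_id]) >= FLEET_CORROBORATION_THRESHOLD:
        return "distribute to fleet"    # enough independent corroboration
    return "apply locally only"         # below threshold: the reporting car keeps it

# Example: a dozen different cars correcting the same carpool-lane split
for n in range(12):
    status = report_correction("I-80E_to_I-580W_split", f"car_{n}")
print(status)   # "distribute to fleet"
```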
 
Seems to me that they ought to have AP learn from a single car for just that car, but not distribute that learning to the fleet without corroboration. That's probably a lot of extra complication and maybe not worth it, though.

I do not disagree with you. Tesla seems to be taking a much more "big picture" approach, but I can see value in what you are saying. What that would mean, however, is that processing and revising of the AP data would be done in the car, and they likely didn't want to put that level of workload on the car while trying to diagnose other things. Not that Tesla is doing this, but God forbid a car gets in an accident, HQ has the model against which to compare the car's sensor data, or a reasonable approximation if the car is unrecoverable. If the model changes at the car to compensate for individual inputs, it is much more difficult to analyze "faults".
 
Seems to me that they ought to have AP learn from a single car for just that car, but not distribute that learning to the fleet without corroboration. That's probably a lot of extra complication and maybe not worth it, though.

I think the goal of AP should be to have a strong, generalizable model that is applicable to the entire fleet. The processing power of the vehicle is inadequate to do any real analysis of the features. For example, I trained a tiny predictive model with about 50 features and 20k cases on my MacBook Pro. It took over an hour to complete, and I could have ironed shirts with the laptop when I was done. The feature set of AP is much greater, and the cases are continuous.

The argument could be made that they could separately build/update/maintain individual models for all owners at HQ, but that argument falls apart pretty quickly.

I think it makes more sense to continue to take in the fleet data and allow all cars to teach each other. If the model learns something from a construction zone in New Jersey, there's no reason to learn it again when a similar situation presents itself in Texas.

I've said this a couple of times upthread, but a defining characteristic of machine learning is generalizability. That's kind of the whole point of it. Build a model that generalizes well to new data. Some of the posts in this thread seem to miss that point, thinking that a developer can make some "tweaks". The developers only change the learning algorithm, and the algorithm makes the tweaks from the data inputs.
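
To make the point concrete, here's a tiny sketch along the lines of my laptop experiment - the data and feature names are invented, and this has nothing to do with Tesla's actual pipeline. The developer only chooses the algorithm; the parameters come from the data, and generalization is judged on cases the model never saw during training.

```python
# Minimal sketch with invented data - roughly the scale of the toy model
# mentioned above (about 50 features, 20k cases). Requires numpy and scikit-learn.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend fleet snapshots: 50 hypothetical features per case (curvature,
# line contrast, lateral offset, ...), label = "driver corrected the car".
X = rng.normal(size=(20_000, 50))
y = (0.8 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=20_000) > 0).astype(int)

# Hold out a chunk the model never trains on; that is where generalization
# is measured, not on the training set.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"training accuracy: {model.score(X_train, y_train):.3f}")
print(f"held-out accuracy: {model.score(X_test, y_test):.3f}")
```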
 
If we could see the Tesla AP map superimposed over the Google "satellite view" map, we could give a lot of good feedback and corrections to help improve the map, or at least identify poor data.

How does this work? Some Teslas take the exit, and some do not, right? So what is the car "learning" - just that there is an exit lane there, but not whether to take the exit or not?

But simply looking at the "coarse" Google map data tells one that there is an exit there. Using some pattern recognition to examine the "satellite" view pictures (really aerial pictures, in most populated areas), one could often extract the lane structure, and sometimes even the lane markings. Sure, a computer might have to chew on the data for weeks.
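
Just to sketch what that kind of pattern recognition could look like (a rough sketch on my part, nothing official - the file name is a placeholder, and a real pipeline would need georeferencing and far more care), a classic edge-detection plus Hough-transform pass over an aerial tile pulls out straight segments that are candidates for lane markings:

```python
# Sketch only: find bright, straight-ish segments in an aerial image tile.
# Requires OpenCV (cv2) and numpy; "aerial_tile.png" is a placeholder name.
import cv2
import numpy as np

img = cv2.imread("aerial_tile.png")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Lane paint is usually brighter than pavement; smooth, then find edges.
blurred = cv2.GaussianBlur(gray, (5, 5), 0)
edges = cv2.Canny(blurred, 50, 150)

# The probabilistic Hough transform returns straight segments; curved lanes
# would have to be stitched together from many short segments afterwards.
segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=60,
                           minLineLength=40, maxLineGap=10)

overlay = img.copy()
if segments is not None:
    for x1, y1, x2, y2 in segments[:, 0]:
        cv2.line(overlay, (x1, y1), (x2, y2), (0, 0, 255), 2)
cv2.imwrite("aerial_tile_lanes.png", overlay)
```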

Perhaps Google will not give permission to use their data that way, so Tesla is trying to create another world map - a lane map?

So far, speed reduction for curves seems to be the big problem, since auto-steering while you control the speed must be done by using the AP control lever to drop the TACC set speed while maintaining normal freeway speed with the accelerator pedal. Not impossible, but a new, distracting skill to learn.

Observing the diamond-shaped, yellow (in California) slow-warning signs would seem to be helpful, similar to the normal speed limit signs.

Reacting to Speed Zone Ahead signs, to reduce speed BEFORE entering a little town's 35 or 25 mph zone, would help reduce income-producing ticketing just after the Speed Zone sign.
 
Update from this morning: took the same route as usual in the HOV (carpool) lane I mentioned above and paid a lot more attention to where Autosteer has been failing.

Turns out I was wrong about the exact location, and the issue happens a little before the split I was showing in the satellite image. It actually occurs when the bend initially starts, after a straightaway, and is sharper at the entry point:

[Top-down satellite image of the start of the bend]



It's a little hard to see from this top-down satellite image as the bend is in the shade, so here's an aerial view from the north (looking south) to perhaps appreciate the bend a little better:

[Aerial view from the north, looking south]



Anyway, I engaged Autosteer and took the carpool lane as usual to see what would happen this time.

Even though it still spilled into the lane to the right of mine, it actually rejoined the carpool lane by itself without asking me to touch the steering wheel (although it did beep for a second without warnings on the IC).

AND this time I managed to capture it on video :biggrin:

 
How does this work? Some Teslas take the exit, and some do not, right? So what is the car "learning" - just that there is an exit lane there, but not whether to take the exit or not?

This is the problem I see as well. If AP is not really intended to make diversions from the main highway, then the problem becomes one of identifying where the trouble spots ON the highway are. In my examples, the issue was with AP deciding to go somewhere the highway laning didn't intend. I would think that some sort of inertial reading taken when AP is manually disengaged by a steering movement would indicate a swerve. That should trigger a computer analysis of the few hundred metres before and after to determine whether the driver was simply attempting to stay on the road, swerving around an obstacle, or doing something else (nefarious or otherwise). That should be enough to allow even a single car to add improvements to the mapping.
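
Something like that check could be fairly simple. Here's a rough sketch of the idea - the field names, units and thresholds are entirely my own invention, not anything from Tesla: compare the yaw rate just before the driver takes over with the correction right after, and flag the disengagement if the correction is sharp.

```python
# Hypothetical sketch of flagging a "corrective swerve" at AP disengagement.
from dataclasses import dataclass
from typing import List

@dataclass
class Sample:
    t: float          # seconds relative to the disengagement (negative = before)
    yaw_rate: float   # deg/s, positive = turning left

def looks_like_corrective_swerve(samples: List[Sample],
                                 window_s: float = 1.5,
                                 yaw_jump_deg_s: float = 8.0) -> bool:
    """True if yaw rate jumps sharply right after the driver takes over."""
    before = [s.yaw_rate for s in samples if -window_s <= s.t < 0]
    after = [s.yaw_rate for s in samples if 0 <= s.t <= window_s]
    if not before or not after:
        return False
    # A large gap between the pre-takeover trend and the driver's correction
    # suggests AP was headed somewhere the driver didn't want to go.
    baseline = sum(before) / len(before)
    peak_correction = max(abs(a - baseline) for a in after)
    return peak_correction > yaw_jump_deg_s

# Example: AP was tracking straight (~0 deg/s) and the driver yanked it back.
log = [Sample(-1.0, 0.2), Sample(-0.5, 0.1), Sample(0.2, 9.5), Sample(0.6, 4.0)]
print(looks_like_corrective_swerve(log))   # True
```

Anything flagged this way could then be queued for the fuller before/after analysis described above.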

Reacting to Speed Zone Ahead signs, to reduce speed BEFORE entering a little town's 35 or 25 mph zone, would help reduce income-producing ticketing just after the Speed Zone sign.
I've pondered exactly this problem myself. I don't like beginning the deceleration after entering the new, lower speed zone. Ticket waiting to happen...

- - - Updated - - -

Even though it still spilled into the lane to the right of mine, it actually rejoined the carpool lane by itself without asking me to touch the steering wheel (although it did beep for a second without warnings on the IC).
That's very interesting... not sure why it would deviate from the lane as it doesn't look that sharp to me. I wonder if it will improve more the next time you travel this route?

I'm about to head off on the same road I've had issues with and will see if the trouble spots are still problematic. I have to wonder if it learned from the last time, when I tried a second low-speed pass and it behaved properly... If so, the AI component of the software is really quite remarkable and might be the real jewel in the system. It would get better with time, something the other guys haven't even begun to code...!