
60 mph; excellent road markings and AP2 tried to throw the car at the median

@Moi? That's ok, I can use a good poking every so often.
Haha, yea, it's all in good fun! :p
I will say, thanks to your post, I did a bunch of research on Audi's driver-assist features, and was quite impressed! (At least on paper, as I've never driven with any of them.) It did raise the question of why we don't see or hear much about the Audi system. If it truly delivers everything I read, I'm actually super jealous!
 
AP1 exhibits similar behavior when the sun shines just right, except where it has good GPS data (i.e., it knows this exact spot from many other Teslas driving through it); there, it did not follow the tar line.
Note that AP2 has two forward-looking (currently used) cameras, so it should be harder to sun-blind than AP1. That said, this could probably be verified on AP2: find out whether there's a point where both cameras see nothing due to glare. You'd just need a nice rotating and tilting platform.
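To make the two-camera point concrete, here's a minimal sketch of the kind of cross-camera check being described. Nothing here reflects Tesla's actual pipeline; the function names and thresholds are invented for illustration.

```python
import numpy as np

SATURATION_LEVEL = 250   # 8-bit pixel value treated as "washed out" (invented)
BLINDED_FRACTION = 0.60  # fraction of saturated pixels that counts as blinded

def is_blinded(frame: np.ndarray) -> bool:
    """Heuristic: a camera is glare-blinded if most of its pixels saturate."""
    return float(np.mean(frame >= SATURATION_LEVEL)) > BLINDED_FRACTION

def vision_available(main_frame: np.ndarray, narrow_frame: np.ndarray) -> bool:
    """With two forward cameras, vision is lost only if BOTH are blinded."""
    return not (is_blinded(main_frame) and is_blinded(narrow_frame))
```

With two cameras at different focal lengths, glare strong enough to saturate both at once should be rarer than blinding AP1's single camera, which is the point above.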
 
Just drove 500 miles with an AP2 car and had this happen twice. It reminds me of the early AP1 days, when it would start taking an off-ramp. The first time the AP2 car did it, it really caught me by surprise. I've driven tens of thousands of miles on AP1, and I know it's not perfect and which scenarios to look out for. But these two incidents on AP2 were totally out of the blue. I still used it for 70% of my trip, but it made me worried about looking down for even a second or two, whereas good old AP1 is quite reliable.
 
The problem I was commenting on was not the camera being blind, but the tar line on the road (a recently fixed crack, for example: a nice shiny asphalt line) being mistaken by the AP for a lane marker, possibly because at just the right angle it reflects more sunshine than the lane marker itself. Humans can see that the tar line leads into the median and deduce it's not a lane. Another way to put it: what if someone spilled white paint and it looked like a lane marking leading straight into the ditch? Humans know ditches are bad for driving; the AI may not be smart enough to recognize a ditch as a hazard (if it were, no lane markers would be needed next to a ditch).
 
Ah! That makes sense then.
Theoretically, a sun-reflecting tar line should be much brighter than a painted white line, and the AP2 camera should have enough dynamic range to see the difference. Spilled white paint (then spread around by tires) would of course pose a much bigger problem.
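A toy illustration of that dynamic-range argument: assume (hypothetically) that paint under daylight sits in a bounded brightness band, while specular glare off a tar line blows past it. The band limits below are made up.

```python
import numpy as np

# Hypothetical reflectance window for painted markings in daylight,
# in normalized intensity [0, 1]; specular glare can exceed it.
PAINT_MIN, PAINT_MAX = 0.55, 0.90

def classify_line(pixel_intensities: np.ndarray) -> str:
    """Crude classifier: painted markings occupy a bounded brightness band,
    while a sun-reflecting tar line saturates well above it."""
    median = float(np.median(pixel_intensities))
    if median > PAINT_MAX:
        return "specular glare (likely tar line, distrust)"
    if PAINT_MIN <= median <= PAINT_MAX:
        return "painted marking (trust)"
    return "too dark (faded paint or shadow, low confidence)"
```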
 
I don't understand these errors.

Why isn't the car monitoring multiple lane markings and using some form of average? Or lane markings plus the median?

Lord knows there are enough sensors/cameras on these vehicles and if a tar line (or whatever) veers off in a new direction there are plenty of other road features which don't.
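For what it's worth, here is a rough sketch of the multi-cue averaging being proposed: collect several lane-geometry cues, drop the ones that disagree with the consensus, and average the rest. Whether any production stack works this way is unknown; all names and numbers are illustrative.

```python
# Each cue proposes a lateral offset (meters) for the lane center; cues that
# disagree with the robust consensus are rejected before averaging, so a
# single deviating tar line is outvoted by the other road features.

def fuse_lane_cues(cues: dict[str, float], max_deviation: float = 0.5) -> float:
    values = sorted(cues.values())
    consensus = values[len(values) // 2]  # median as a robust consensus
    inliers = [v for v in cues.values() if abs(v - consensus) <= max_deviation]
    return sum(inliers) / len(inliers)

# Example: the left "line" (a tar line) veers off; everything else agrees.
print(fuse_lane_cues({
    "left_line": 1.8,       # tar line swerving toward the median
    "right_line": 0.1,
    "road_edge": 0.0,
    "median_barrier": 0.05,
}))  # -> ~0.05: the outlier is rejected
```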
 
Consider that the veering-off line might potentially BE the correct one (e.g., a temporary construction-zone marking, as in that case not too long ago where a Tesla hit a construction barrier in Texas: Video shows Tesla Model S slamming into a wall while driving on Autopilot).
A solid line might be given more priority than a dotted/interrupted one for all we know.
 

These are two very different situations (at least to our eyes/brains).

In the case of a single line on the surface of the road, there will always be other lines / road edges / medians or whatever to confirm or contradict the direction of that single line.

But in the case of an object blocking the immediate path of the vehicle then this let's-take-an-average-of-all-road-elements approach is clearly wrong.

Detecting sudden events with moving objects seems to be pretty good - here are a few:

But give it a stationary object in lane and it doesn't react, I guess for fear of false positives.
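One commonly cited explanation (an assumption on my part, not something confirmed in this thread) is that radar alone cannot tell a stopped car from overhead signs and roadside clutter, so stationary returns are filtered out to avoid constant false braking. A crude sketch of that filtering logic, with invented numbers:

```python
# Anything whose computed ground speed is ~0 looks (to radar alone) like a
# sign gantry or parked clutter, so it is dropped rather than braked for.
# Ego speed and thresholds are illustrative only.

EGO_SPEED = 31.0  # m/s, roughly 70 mph

def radar_targets_to_track(targets: list[dict]) -> list[dict]:
    tracked = []
    for t in targets:
        # relative_speed is target speed minus ego speed (closing is negative)
        ground_speed = EGO_SPEED + t["relative_speed"]
        if abs(ground_speed) < 1.0:
            continue  # stationary: discarded as probable clutter
        tracked.append(t)
    return tracked

# A stalled car dead ahead (relative_speed == -EGO_SPEED) gets filtered out,
# which is exactly the non-reaction described above.
print(radar_targets_to_track([
    {"id": 1, "relative_speed": -31.0},  # stationary obstacle -> dropped
    {"id": 2, "relative_speed": -5.0},   # slower lead car -> tracked
]))
```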

And weird marks on the surface of roads seem to be an issue from time to time, and maybe they are difficult to tell from actual objects. Here are some extreme examples to fool level 5 humans:

Anamorphic Illusion Car Park

Slowing Traffic With Trompe l'Oeil

Friday Video: Canadians Get Fancy, Use Trompe L'oeil Children As Speedbumps

 
I'm dismayed to keep reading accounts like this. If this flaw isn't corrected fast, it seems like a mere matter of time before someone captures a serious enough crash with a dashcam to get the Feds' attention and lots of negative press.

You don't need dashcam video (although I would highly recommend one). There's no video at all of Joshua Brown's death while using AP1.0 and it was investigated by the Feds.

Given all the AP2.0 close calls we are hearing about, it might be only a matter of time until we hear of one. Hopefully, in the interim it is saving lives and everyone who uses it pays constant attention, although that seems unlikely.

I will say again - I am not trashing Tesla.

It sounded to me like you just reported the facts.
 
The problem I was commenting on was not the camera being blind, but the tar line on the road (a recently fixed crack, for example: a nice shiny asphalt line) being mistaken by the AP for a lane marker, possibly because at just the right angle it reflects more sunshine than the lane marker itself. Humans can see that the tar line leads into the median and deduce it's not a lane.

My AP2 car has the same problem, but only since one of the latest versions; I can't remember exactly when, but it was some time in July, IIRC.

So it's "just" an software issue IMHO.
 
I think AP1 handles tar lines or faint old random paint markings well. It seems to have some built-in inertia: it first tries to find lines in the area where the line previously was, rather than immediately scanning a new sector for lines. If there is a lane marking in the same area where the marking was previously, a tar line with possibly more contrast does not override the right line.
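A small sketch of that "inertia" idea: gate new detections to a window around the previous estimate and blend them in slowly, so a high-contrast tar line outside the window cannot yank the track. This is a guess at the behavior, not AP1's actual tracker, and all parameters are invented.

```python
class LaneLineTracker:
    """Tracks one lane line's lateral offset (meters) with built-in inertia."""

    def __init__(self, initial_offset: float,
                 search_window: float = 0.4,  # accept detections this close
                 smoothing: float = 0.15):    # how quickly new data is trusted
        self.offset = initial_offset
        self.search_window = search_window
        self.smoothing = smoothing

    def update(self, detections: list[float]) -> float:
        # Keep only detections near where the line was last seen.
        nearby = [d for d in detections
                  if abs(d - self.offset) <= self.search_window]
        if nearby:
            best = min(nearby, key=lambda d: abs(d - self.offset))
            self.offset += self.smoothing * (best - self.offset)
        # If nothing plausible is found, coast on the previous estimate.
        return self.offset

tracker = LaneLineTracker(initial_offset=1.6)
print(tracker.update([1.55, 3.0]))  # the tar line at 3.0 m is outside the gate
```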
 
Tar lines depend on sun position and glare strength from what I've experienced.

Just yesterday, at 70 mph, I told my wife, "Hold on, here come some tar lines." Sure enough, at speed, the car bounced around in the lane like it was brain-dead. This happened again about a mile down the road. I finally turned Autosteer off, as one tar line would have had me crossing off the road onto the shoulder.

I don't know who writes an algorithm that truly doesn't take into account that roads do not zigzag every few feet, especially when there are perfect lane markers and a straight road segment. Makes no sense at all.

Reported via voice feedback, but not yet followed up with an e-mail.
 
AnxietyRanger said:
Yes, this happens. AP2 - two versions ago - almost threw me into concrete blocks as well when it started following a vague, black tar-line on the road instead of beautiful, newly painted white lines on both sides of the lane.
AP1 exhibits similar behavior when the sun shines just right, except where it has good GPS data (i.e., it knows this exact spot from many other Teslas driving through it); there, it did not follow the tar line.

To be clear, that AP2 incident was on an overcast day, good visibility but no direct sunshine.

Interestingly, I was driving on AP2 today after a break and had to turn it off soon, because it just kept turning towards the lane on the left, kind of seeking to sneak over there all the time. It was pretty much useless. That said, no ghost braking this time. :)
 
It is frustrating, as many have observed. I have a 2015 AP1 car and the Autopilot is excellent, with rare issues. I recently had a 2017 AP2 loaner with about 1,700 miles on the vehicle, latest software, etc. Very uncomfortable to use the Autopilot. I tried a lane change and it was very jerky in general. In one instance I let go of the turn signal stalk a bit early and the car threw me HARD back into my original lane. I had to intervene, as it did not feel like it was going to stay on the road. The challenge is that the AP2 felt bad compared to my AP1, not just different. It was dramatic enough that I am going to try to extend my lease to avoid being "forced" onto AP2 before they get it dialed in.

Crazy secondary question, possibly brought up by others, but is this an issue of inadequate software, or was the original Mobileye just that much better?
 
That’s just it — no one is writing an “algorithm.”

Overdependence on machine learning can lead to idiotic behavior like this when the training set is insufficient, or when processing-speed constraints require that it be excessively culled.

I've said this thing needs a hand-coded nanny to filter out stupid nonsense moves (e.g., swerving left/right in ways roads never do), but then that would negatively impact the ML data gathering...
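As a sketch of what such a hand-coded nanny might look like, here is a rate limiter on requested path curvature; real roads change curvature gradually, so sudden zigzag commands get clamped. The limit value is invented and reflects no real vehicle spec.

```python
MAX_CURVATURE_RATE = 0.02  # 1/m per second; highways change curvature slowly

def sanity_filter(requested_curvature: float,
                  previous_curvature: float,
                  dt: float) -> float:
    """Reject zigzag commands by limiting curvature change per time step."""
    max_step = MAX_CURVATURE_RATE * dt
    step = requested_curvature - previous_curvature
    step = max(-max_step, min(max_step, step))
    return previous_curvature + step

# A sudden swerve request (a big curvature jump) gets rate-limited:
print(sanity_filter(requested_curvature=0.05, previous_curvature=0.0, dt=0.1))
# -> 0.002, far from the requested 0.05
```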

 
Crazy secondary question, possibly brought up by others, but is this an issue of inadequate software, or was the original Mobileye just that much better?
My understanding is that Mobileye's EyeQ3 chip was handling all of AP1's image processing in hardware (lane-marker identification, signs, etc.) and passing that data along to Tesla's software, which fused it with data from the radar and ultrasonics in order to control the vehicle. With AP2, Tesla has to develop its own image-processing software to take the place of the Mobileye hardware, and that appears to be proving more difficult than they thought.
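To illustrate that division of labor, here is a hypothetical sketch in which the downstream fusion/control code consumes the same lane-model structure whether it came from the EyeQ3 (AP1) or from an in-house vision stack (AP2). Every class and field name below is invented; it only shows why swapping the vision source while keeping the rest of the stack is plausible in principle but hard to do well.

```python
from dataclasses import dataclass

@dataclass
class LaneModel:
    left_offset_m: float   # lateral distance to the left marking
    right_offset_m: float  # lateral distance to the right marking
    curvature: float       # 1/m
    confidence: float      # 0..1

class EyeQ3Source:
    """AP1: lane model computed on Mobileye silicon, consumed as-is."""
    def lane_model(self) -> LaneModel:
        return LaneModel(1.6, -1.7, 0.001, 0.95)

class InHouseVisionSource:
    """AP2: Tesla's own image processing must fill the same contract."""
    def lane_model(self) -> LaneModel:
        return LaneModel(1.5, -1.8, 0.002, 0.70)  # same shape, new code

def lateral_error(source) -> float:
    """Toy control input: distance of the car from the lane center."""
    lm = source.lane_model()
    return (lm.left_offset_m + lm.right_offset_m) / 2

print(lateral_error(EyeQ3Source()), lateral_error(InHouseVisionSource()))
```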
 