Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
Dallas/Ft. Worth - Amarillo - Clayton is the *main* corridor from Dallas/Ft. Worth to Denver. Hundreds of Teslas drive that stretch weekly. Millions of people on each end. If that doesn't provide sufficient data... then the NN won't work.
I agree that that route is a classic example for evaluating and resolving phantom braking, but it is also absolutely one of the most desolate, if not the most. Dead towns, ghost villages, etc. It is no wonder the Phantom shows up there.
 
  • Funny
Reactions: sleepydoc
What? A mirage is a visual phenomenon. Train the network to ignore that visual and Bob's Your Uncle.
Obviously it is not as easy as it seems: to the camera's perception, a mirage looks too much like oncoming or stopped vehicles. If you downweight these examples as negatives, the system starts failing to detect actual vehicles, which creates a real safety problem with crashes. There's always an AUC or precision/recall tradeoff. Precision = how many of these alerts were truly positive physical obstructions; recall = what fraction of the true positives I detect at my sensitivity threshold.

Of course a better ML model improves both simultaneously, but that's in fact the whole point of machine learning problems in this space: everything is difficult, and you reach a point where you can't do better with the data you have, regardless of the algorithm.

They'd rather have phantom braking than crashes, so the sensitivity is tuned to enhance recall (don't crash!), which inevitably results in more false positives in borderline cases. They don't look borderline to us, but they do at the decision layer of the nets.
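The recall-over-precision tuning described above can be sketched with a toy example (my own illustration, not anything from Tesla's stack): lowering the detection threshold boosts recall (fewer missed obstacles) at the cost of precision (more phantom alerts).

```python
def precision_recall(scores, labels, threshold):
    """Compute precision and recall for detections at or above `threshold`.

    scores: model confidence that an obstacle is present
    labels: 1 = real obstacle, 0 = mirage / no obstacle
    """
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    precision = tp / (tp + fp) if tp + fp else 1.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Made-up confidence scores: real obstacles score high, but mirages
# overlap the low-to-mid range, so no threshold separates them cleanly.
scores = [0.95, 0.90, 0.80, 0.55, 0.50, 0.40, 0.30, 0.20]
labels = [1,    1,    1,    1,    0,    0,    0,    0]

# Strict threshold: no phantom alerts, but a real obstacle is missed.
print(precision_recall(scores, labels, 0.60))  # (1.0, 0.75)

# "Don't crash" tuning: every obstacle caught, but a phantom alert appears.
print(precision_recall(scores, labels, 0.45))  # (0.8, 1.0)
```

With overlapping score distributions, any threshold trades one error for the other, which is exactly the borderline-case problem.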
 
  • Like
Reactions: sleepydoc
Obviously it is not as easy as it seems: to the camera's perception, a mirage looks too much like oncoming or stopped vehicles. If you downweight these examples as negatives, the system starts failing to detect actual vehicles
I understand the network phenomenon, but nobody confuses mirages with oncoming or stopped vehicles. Why would a perception system?

Actually, do we have any idea why the car brakes for heat shimmer on road surfaces? Have people seen phantom cars displayed in the visualization? I wonder if this is something like the car thinking that the road stops at the shimmer edge. Heck, my car hesitates/slows when going over a rise in the road where it can't see the other side. Is this a high speed equivalent of that?
 
  • Like
Reactions: texas_star_TM3
What? A mirage is a visual phenomenon. Train the network to ignore that visual and Bob's Your Uncle.
Sounds like a liability issue .... camera: "there's an object on the road" ... NN: "ignore that input, it's just a mirage" .... *BOOM*. Radar/lidar is pretty black/white in that scenario.... I don't get the excuse with sensor fusion either. If the cameras detect an object but there's no radar/lidar signature... then ignore the cameras. If both detect the object, then brake. If only radar/lidar detects something but the cameras don't... I would still want the brakes applied. Using only vision and ignoring every other sensor aside from whatever the NN recommends is nuts.
 
Potentially, but radar is still better at eliminating these false positives.

Also, the existing HW3 cameras are not very good; human eyes (in the fovea) are significantly better, with better autofocus, better color perception, and less glare sensitivity.

Straight but vertically undulating roads with no cars, like mirages, also trigger the errors. The camera resolution isn't good enough far down the road, where humans can see there's nothing there and know from context that there is no hidden car stopped in the dip.


And this is why even ML-heavy auto driving companies still want additional sensors. HW4 has better cameras too. But more bits from the cameras mean a much higher compute burden, more expensive chips, and higher power consumption.
I frankly don't think more camera resolution is going to make or break FSD.
The NN is about behavior, not perfection of imaging.

However, radar may need to be added back to cover the poor-visibility issues of weather: snow, rain, etc.
That appears to have happened on HW4 for the MS and MX, but definitely NOT on the MY with HW4.

All in all, unless FSD works with most of the HWxxx already installed, it will be massively expensive for Tesla to accommodate everyone who bought FSD but has older HWxxx.
Also, robotaxi cost will be affected. Part of the advantage of going to an NN is avoiding additional hardware and processing requirements.

Elon has shifted gears several times already and he won't hesitate to again if it means FSD success. So we all find out together.

And the thing is, nobody knows what the final requirements need to be (including the developers), so 'people wanting more sensors' is just emotion talking.
 
  • Like
Reactions: ChiefRollo
Sounds like a liability issue .... camera: "there's an object on the road" ... NN: "ignore that input, it's just a mirage" .... *BOOM*. Radar/lidar is pretty black/white in that scenario.... I don't get the excuse with sensor fusion either. If the cameras detect an object but there's no radar/lidar signature... then ignore the cameras. If both detect the object, then brake. If only radar/lidar detects something but the cameras don't... I would still want the brakes applied. Using only vision and ignoring every other sensor aside from whatever the NN recommends is nuts.
Except that presumes radar has no false negatives, which is also not true.
This sounds like a troll, but I can't guess at your tone, so I'll try again. Trees are ignored. Buildings are ignored. Marks on the road are ignored. Ideally, leaves blowing across the road should be ignored. Heck, even potholes are ignored. I see no reason why something else that is of no consequence shouldn't be ignored.


As soon as you understand the point, you will.
No, no trolling. Perhaps a bit of sarcasm, but I can't help that. It's genetic. Or geriatric - I get them mixed up.

Trees, buildings, etc. are not ignored - but they are outside of the vehicle path and thus inconsequential. Marks on the road are also not ignored; they're recognized as part of the road and processed accordingly. (We have actually seen visual processing mistakes with lines, too.) Mirages appear as large objects on the road = something you don't want to hit, and are treated accordingly. 'Just ignoring them' is not that simple, because what you're suggesting the system do is 'not see' something it has identified as an object. What you're unwittingly saying is 'just teach the system to recognize mirages as mirages.' But that's the rub. Or Uncle Bob. If Tesla could do that, we wouldn't have an issue.
 
Marks on the road are also not ignored, they’re recognized as part of the road and processed accordingly.
Semantics. There's stuff in the driving space that the vehicle doesn't categorize, nor does it affect the car's behavior. That's what heat shimmer is. A road with heat shimmer is a funny looking road, nothing more. It's like a concrete road versus an asphalt road, or a road with water on it, or a road with patched seam lines, or countless other variations of road surfaces. I have to believe that Tesla just didn't train the vision system to deal with it.
 
Semantics. There's stuff in the driving space that the vehicle doesn't categorize, nor does it affect the car's behavior. That's what heat shimmer is. A road with heat shimmer is a funny looking road, nothing more. It's like a concrete road versus an asphalt road, or a road with water on it, or a road with patched seam lines, or countless other variations of road surfaces. I have to believe that Tesla just didn't train the vision system to deal with it.
It’s not semantics at all. Ignoring the fact that road lines affect the car’s behavior, by definition a mirage is something that appears to be there when it isn’t. Like I said, you’re saying “all Tesla needs to do is teach the system to recognize a mirage.” Well, yeah. You’re just re-stating the problem.
 
It’s not semantics at all. Ignoring the fact that road lines affect the car’s behavior, by definition a mirage is something that appears to be there when it isn’t. Like I said, you’re saying “all Tesla needs to do is teach the system to recognize a mirage.” Well, yeah. You’re just re-stating the problem.
Let me put it this way. A puddle of water generates a mirage. The car drives right through them without a hitch. Treat heat shimmers like puddles of water.
 
I understand the network phenomenon, but nobody confuses mirages with oncoming or stopped vehicles. Why would a perception system?

Because human eyes are better than the current HW3 cameras in daytime, humans have two eyes, human reasoning is better, and humans understand context better.

Now, there is no theoretical barrier: they could use significantly better cameras and more compute, and even geotag the special cases that make mirages more frequent and then spend effort gathering that data. And they could add back radar, which is how others increase confidence of classification in these circumstances.

Actually, do we have any idea why the car brakes for heat shimmer on road surfaces? Have people seen phantom cars displayed in the visualization? I wonder if this is something like the car thinking that the road stops at the shimmer edge. Heck, my car hesitates/slows when going over a rise in the road where it can't see the other side. Is this a high speed equivalent of that?
These are all the same phenomenon: the system cannot project the vanishing point of the road surface forward in a consistent way, and it gets nervous. Part of the algorithm is determining the drivable road surface; it's a major component of their system.
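A simplified sketch of that mechanism (my assumption about how such a planner behaves, not Tesla's actual code): if the perceived drivable surface ends closer than the stopping distance at the current speed, the planner slows down. A shimmer edge or a crest that truncates the perceived surface then looks like a reason to brake.

```python
def stopping_distance_m(speed_mps: float, decel_mps2: float = 5.0) -> float:
    """Distance needed to stop from speed v at deceleration a: v^2 / (2a)."""
    return speed_mps ** 2 / (2 * decel_mps2)

def target_speed(current_mps: float, drivable_ahead_m: float,
                 decel_mps2: float = 5.0) -> float:
    """Slow to the fastest speed that can still stop within the
    perceived drivable distance."""
    if stopping_distance_m(current_mps, decel_mps2) <= drivable_ahead_m:
        return current_mps  # plenty of visible road: keep speed
    # Highest v satisfying v^2 / (2a) = drivable_ahead_m
    return (2 * decel_mps2 * drivable_ahead_m) ** 0.5

# 36 m/s is roughly 80 mph. With 200 m of perceived road, no change...
print(target_speed(36.0, 200.0))  # 36.0
# ...but if shimmer cuts the perceived drivable surface to 60 m,
# the planner commands a hard slowdown.
print(round(target_speed(36.0, 60.0), 1))  # 24.5
```

That second case mirrors the reported phantom-braking pattern: a sudden drop from highway speed, not a full AEB stop.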
 
  • Subaru: In 2019, Subaru announced a recall of more than 1.3 million vehicles due to a system problem that could cause the AEB to activate unexpectedly. The recall affected several models, including the Forester, Outback and Crosstrek.
  • Toyota: Toyota has issued several recalls related to AEB system issues in various models, including the Camry and the RAV4. In some cases, the AEB system failed to activate when it should have, while in other cases, it activated unexpectedly.
  • Ford: In 2020, Ford issued a recall of 600,000-plus vehicles because the AEB system activated randomly and without cause. The recall affected several models, including the Fusion, Edge and Lincoln MKX.
  • American Honda: In 2019, Honda recalled more than 118,000 CR-V models due to a software issue that could cause the AEB system to engage unexpectedly.
  • General Motors: GM has also issued several recalls related to incidents in which the AEB system either unexpectedly activated or failed to activate when needed in various models, including the Chevrolet Malibu, GMC Acadia and Cadillac XT5.
🍿🍿🍿🍿🍿🍿🍿🍿🍿
 
sounds like a liability issue .... camera: "there's an object on the road" ... NN: "ignore that input, it's just a mirage" .... *BOOM* radar/ Lidar is pretty black/white in that scenario.... don't get the excuse with sensor fusion either. If the cameras detect an object but there's not radar/ Lidar signature... then ignore the cameras. If both detect the object then brake. If only radar/Lidar detect something but the cameras not... i would still want the brakes applied. Only using vision and ignoring any other sensor aside from whatever NN recommends is nuts.
This assumes radar has no false negatives, which is dead wrong. I still remember when Tesla had to whitelist overhead road signs to ignore their radar returns. And radar-based AP definitely still had phantom braking; just the triggers were different.

From the forum hysteria here, I expected Vision to be horrible with phantom braking, but so far I haven't noticed a significant difference, and I'm still on the old 2022 version of Vision. I'm in California, however, where Teslas tend to do the best, although I have driven Vision AP on a variety of roads: regular highways, high-speed two-lane roads (I5), as well as winding single-lane roads (state routes that are not divided but essentially controlled access).

The only difference I noticed is that there are a few new trigger points where the car will adjust the set speed to 65 mph (even when the road is straight), but it's easy to override with the accelerator.
 
  • Subaru: In 2019, Subaru announced a recall of more than 1.3 million vehicles due to a system problem that could cause the AEB to activate unexpectedly. The recall affected several models, including the Forester, Outback and Crosstrek.
  • Toyota: Toyota has issued several recalls related to AEB system issues in various models, including the Camry and the RAV4. In some cases, the AEB system failed to activate when it should have, while in other cases, it activated unexpectedly.
  • Ford: In 2020, Ford issued a recall of 600,000-plus vehicles because the AEB system activated randomly and without cause. The recall affected several models, including the Fusion, Edge and Lincoln MKX.
  • American Honda: In 2019, Honda recalled more than 118,000 CR-V models due to a software issue that could cause the AEB system to engage unexpectedly.
  • General Motors: GM has also issued several recalls related to incidents in which the AEB system either unexpectedly activated or failed to activate when needed in various models, including the Chevrolet Malibu, GMC Acadia and Cadillac XT5.
🍿🍿🍿🍿🍿🍿🍿🍿🍿
Most PB is not AEB; it's faulty vision analysis in TACC.

How many of those cars have issues with their adaptive cruise control? I don't think I've ever had a false AEB event in my MY, but PB has often made TACC all but unusable. There was one period when I would have a PB event about every other minute. Ask how many Tesla drivers drive with their foot over the accelerator while using TACC, then ask them if they've ever had to do that when using cruise control on any other car.
 
Most PB is not AEB; it's faulty vision analysis in TACC.

How many of those cars have issues with their adaptive cruise control? I don't think I've ever had a false AEB event in my MY, but PB has often made TACC all but unusable. There was one period when I would have a PB event about every other minute. Ask how many Tesla drivers drive with their foot over the accelerator while using TACC, then ask them if they've ever had to do that when using cruise control on any other car.
Exactly this. 100%

Automatic Emergency Braking takes the car from highway speed to ZERO (or close to zero). This is different from phantom braking, which takes you from 80 mph down to 50 mph within seconds. I’ve never experienced AEB applied incorrectly…. Yet you can have multiple PB events in an hour here in Texas on certain highways, on TACC … you don’t even have to use AP. Tesla is the only brand I’ve driven where TACC has this bug, and it makes TACC unusable on certain highways…
 
Maybe not disable/remove the radar sensors.... Again.... radar and lidar aren't fooled by optical illusions. The amount of excuses folks make here for Tesla going "vision only" is stunning...
As well, the amount of emotion talking is also stunning.
No one....NO ONE...on this or other threads is actually involved in writing Tesla's code, or GM's Blue Cruise, or Ford's whatever.
So demanding sensors of any type really is reaching.
 
  • Like
Reactions: Dewg and JB47394
As well, the amount of emotion talking is also stunning.
No one....NO ONE...on this or other threads is actually involved in writing Tesla's code, or GM's Blue Cruise, or Ford's whatever.
So demanding sensors of any type really is reaching.
I didn't have PBs to any noticeable degree *before* Tesla deleted the radar sensor input in my car and forced it to "vision only".
I've driven the same route, same time of year, same weather, multiple times before and after...
 
I didn't have PBs to any noticeable degree *before* Tesla deleted the radar sensor input in my car and forced it to "vision only".
I've driven the same route, same time of year, same weather, multiple times before and after...
I've said this before, but it's not a radar or vision issue. My sister's 2012 Prius had adaptive cruise that used radar and it never had phantom braking. My son's Subaru Forester uses vision. It also has phantom braking. It's possible to implement adaptive cruise using either modality without the phantom braking issues Tesla has.
 
I've said this before, but it's not a radar or vision issue. My sister's 2012 Prius had adaptive cruise that used radar and it never had phantom braking. My son's Subaru Forester uses vision. It also has phantom braking. It's possible to implement adaptive cruise using either modality without the phantom braking issues Tesla has.
Wait... so the Prius with radar doesn't have PB but the Subaru with vision only has PB? Isn't that exactly my point? Confused...