Phantom braking so bad I want to return my car

As a person with monocular vision I continuously surprise my spouse with what I manage, but it's a significant operational deficit. An example of this is judging the distance to a closed overhead door -- my adaptations don't work well in the car. The Tesla can also make that determination. Oh wait ... it uses sonar, because using the right tool for the job is often a good idea.
Oh totally, and I'm not saying monocular vision is equally adept (as you know better than most), just that binocular is not a cure-all.
 
Human vision has an excellent dynamic range, but it's managed by the iris, so the response isn't instantaneous. Cameras can be blinded too, e.g. the repeater cameras at night when a turn signal is flashing, or the pillar cameras whenever they face the sun. The specific behavior is a function of the hardware.
Very true, and it's the response time to changes that makes the camera more useful in these circumstances. Look at your phone ... it can take HDR images by taking two photos very quickly with different shutter settings and synthesizing an HDR image from the two, literally in the blink of an eye (actually faster than that).
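(Roughly what that looks like in code -- a minimal sketch using OpenCV's Mertens exposure fusion, which needs no camera calibration. The file names are made up, and phones obviously run their own tuned pipelines.)

```python
# Fuse an under- and over-exposed shot of the same scene into one
# well-exposed 8-bit image (exposure fusion, a la phone HDR).
import cv2
import numpy as np

under = cv2.imread("short_exposure.jpg")  # dark frame: highlights preserved
over = cv2.imread("long_exposure.jpg")    # bright frame: shadows preserved

fused = cv2.createMergeMertens().process([under, over])  # float32 in [0, 1]
hdr_like = np.clip(fused * 255, 0, 255).astype(np.uint8)
cv2.imwrite("fused.jpg", hdr_like)
```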
 
Yeah definitely! I wonder if they have some sort of HDR, auto-brightness, or auto-ISO built in. I'm not sure what kind of cameras Tesla uses, but if image enhancement can give the algorithms any benefit, or lessen the load of learning the dynamic range within the object detection modules, it might be a possible avenue. I've done some object detection and person detection in low light using traditional and deep learning approaches, and in almost all cases they benefit from some sort of enhancement, especially the deep learning approaches.
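As a concrete example of the cheap end of "some sort of enhancement", a gamma lookup table applied before the detector brightens shadows without blowing out highlights. The gamma value here is an arbitrary illustration, not anything Tesla is known to use:

```python
# Gamma-correction LUT: brighten dark regions ahead of a detector.
import cv2
import numpy as np

def gamma_correct(frame, gamma=0.5):
    # 256-entry lookup table mapping input intensity -> brightened output.
    lut = (np.linspace(0, 1, 256) ** gamma * 255).astype(np.uint8)
    return cv2.LUT(frame, lut)

frame = cv2.imread("night_frame.jpg")  # hypothetical low-light capture
enhanced = gamma_correct(frame)        # feed this to the detector instead
```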
One thing I and many of the FSD testers have noticed is how good the car is at tracking lanes even in bad weather, late at night, in rain, etc. There are times when driving in late-night rain when *I* can't see the lane lines and the car is spot on. It seems almost impervious to the glare of oncoming headlights and the erasure of lane lines when the road is wet and shiny. I've no idea if this is the cameras or the NN, but it's probably both.
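For contrast, the classical (pre-NN) lane tracker is edge detection plus a Hough transform, and glare and wet, shiny roads are exactly what break it. A rough sketch of that baseline (file name hypothetical, thresholds typical rather than tuned):

```python
# Classical lane-line baseline: Canny edges + probabilistic Hough lines.
import cv2
import numpy as np

frame = cv2.imread("road_frame.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 50, 150)
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                        threshold=50, minLineLength=40, maxLineGap=20)
for line in lines if lines is not None else []:
    x1, y1, x2, y2 = line[0]
    cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)  # draw detected segments
```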
 
OK, so since the 2021.44.25.2 update, which I received on 12/23, I have noticed a change. I have been up and down the same 4-lane road in Central TX (I attached a picture of exactly where one of the typical PB incidents happens), and so far it has not recurred in 3 round trips. Now, this might be because there has been less traffic the last few days, or some other reason, but thus far I have hope that they might have figured out and at least partially corrected this problem. If anyone here has had any similar experience, please let me know.

View attachment 748605
Let us know if this continues to be better ... the release notes for FSD beta indicated they had improved false positives, and this might be in the vision stack that 2021.44.25.2 also uses now.
 
I would think an array of cameras (12 maybe?) surrounding the perimeter of the roof, covering 360 degrees, would provide all the information the car's processor could ever want in a vision-only system. Maybe one day, when we have the processing power.
 
Here is a cool low-light image enhancement video that shows you can still extract information from heavily compressed dark imagery. I can imagine Tesla not using it because it would be very computationally expensive, but maybe they could incorporate something into their feature extraction network?


If they have training data in the dark, their system may be able to pick up objects in low light, but having enough data to train on, and balancing the training data between day, night, and all other scenarios, would be tough. Some simpler algorithms, like Haar cascades, work well in low light because they calculate simple differences between areas, but the CNNs they use require lots of data and careful training. If only we could get explainable AI working well, so the algorithms could tell us what they are seeing.
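To illustrate the Haar-cascade point, here's roughly what that looks like with OpenCV's pretrained full-body cascade (the input file name is hypothetical):

```python
# The "simple difference between areas" idea in practice: a Haar cascade
# compares summed pixel intensities of rectangular regions, so it's cheap
# and fairly tolerant of low light.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_fullbody.xml")

frame = cv2.imread("night_frame.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
gray = cv2.equalizeHist(gray)  # cheap global enhancement before detection
people = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4)
for (x, y, w, h) in people:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
```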

Pretty cool... now how would it deal with the same low light environment with oncoming high beam headlights?

Keith
 
Tesla uses the AR0132 for most of its cameras. It does support in-camera HDR, but that mode may result in a lower framerate and also introduces more latency and artifacts (as the chip needs to wait for 3 exposures and process them).
https://www.mouser.com/datasheet/2/308/AR0132AT-D-888230.pdf

There are newer sensors today that have a quad-Bayer array or a mode called DOL-HDR (which switches exposures line by line) and can support single-frame HDR (with pixels set to different exposures), which avoids a lot of those problems.
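Back-of-envelope on the latency cost: if the chip has to wait for all three exposures before it can merge, and (worst case) each exposure costs roughly a frame time, the penalty is easy to see. The 36 fps figure here is an assumption, not a datasheet number:

```python
# Worst-case latency sketch for 3-exposure in-camera HDR vs a single frame.
fps = 36                         # assumed camera pipeline rate
frame_time_ms = 1000 / fps       # ~27.8 ms per frame

single_frame_latency = frame_time_ms        # ~28 ms of capture latency
three_exposure_latency = 3 * frame_time_ms  # ~83 ms before merging even starts

print(f"single: {single_frame_latency:.1f} ms, "
      f"3-exposure HDR: {three_exposure_latency:.1f} ms")
```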

I've seen manufacturers compute HDR with a 12-bit camera and pump out an "HDR-like" 8-bit composite image, but I'm not sure of the latency to process it. In many cases, the composite functions much better than just having the raw 12-bit. I'd never heard of DOL-HDR; that's a pretty cool concept!
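The simplest version of that 12-bit to 8-bit squeeze is a global gamma tone map (real pipelines do local tone mapping and are much fancier); a sketch with stand-in sensor data:

```python
# Naive global tone map: compress 12-bit sensor data into an 8-bit image.
import numpy as np

raw12 = np.random.randint(0, 4096, (480, 640), dtype=np.uint16)  # stand-in data

def tonemap_12_to_8(raw, gamma=1 / 2.2):
    norm = raw.astype(np.float32) / 4095.0         # 12-bit range -> [0, 1]
    return (norm ** gamma * 255).astype(np.uint8)  # squeeze into 8 bits

img8 = tonemap_12_to_8(raw12)
```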

One thing I and many of the FSD testers have noticed is how good the car is at tracking lanes even in bad weather, late at night, in rain, etc. There are times when driving in late-night rain when *I* can't see the lane lines and the car is spot on. It seems almost impervious to the glare of oncoming headlights and the erasure of lane lines when the road is wet and shiny. I've no idea if this is the cameras or the NN, but it's probably both.

I have noticed as well how well it does in bad weather. Yeah, my guess currently is the NN, since Tesla has so much data and probably has a good amount in those operating conditions.

I would think an array of cameras (12 maybe?) surrounding the perimeter of the roof, covering 360 degrees, would provide all the information the car's processor could ever want in a vision-only system. Maybe one day, when we have the processing power.

Yeah, I think surrounding cameras should be enough, but depending on who is working out the solution, it may not be. Tesla having multiple front-facing cameras at different FOVs (wide, main, narrow, and two side front-facing) is pretty smart, considering object detection and recognition are scale-dependent.
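To illustrate the scale point: with one fixed-scale detector you have to rescale the image yourself (an image pyramid), whereas dedicated narrow/main/wide lenses hand you those scales optically. A minimal sketch (frame name hypothetical):

```python
# Image pyramid: run a fixed-scale detector at several resolutions so both
# near (large) and far (small) objects land in the detector's sweet spot.
import cv2

def pyramid(frame, levels=4):
    for _ in range(levels):
        yield frame
        frame = cv2.pyrDown(frame)  # blur + downsample by 2x

frame = cv2.imread("road_frame.jpg")  # hypothetical dashcam frame
for scaled in pyramid(frame):
    pass  # run the fixed-scale detector on each level here
```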

Pretty cool... now how would it deal with the same low light environment with oncoming high beam headlights?

Keith

There are some algorithms to extract information from over-exposed/under-exposed images. The picture I attached is from a paper that uses the self-tunable transfer function, an algorithm that takes the neighborhood information of a pixel and normalizes it based on a set of curves. It's very similar to the pop/retinex filters that people use to bring out details in an image. These algorithms are much better than an HDR-based image because you are using local information to enhance, not global. Headlights will definitely wash out global illumination. I'm pretty sure the deep learning image enhancement algorithm I shared could be trained not to wash out the information in the blinding-headlights scenario, but it just takes the right light and scenario to really over-expose everything.

JEI_22_2_023010_f015.png
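(If anyone wants to play with the local-vs-global idea, CLAHE is an easy stand-in -- it's not the paper's self-tunable transfer function, but it's in the same local-normalization spirit, while plain histogram equalization is the global version that headlights wash out.)

```python
# Global vs local enhancement on a grayscale frame. With headlights in view,
# the global histogram is dominated by the bright blob; CLAHE equalizes each
# tile separately, so dark regions keep their detail. File name hypothetical.
import cv2

gray = cv2.cvtColor(cv2.imread("headlights.jpg"), cv2.COLOR_BGR2GRAY)

global_eq = cv2.equalizeHist(gray)  # one transfer curve for the whole image
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
local_eq = clahe.apply(gray)        # a separate curve per local tile
```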


It's so crazy to think how much data is coming in and how much information the system needs to extract to accomplish FSD. I remember my senior design project in undergrad was a line follower using a camera, plus obstacle avoidance using ultrasonic sensors. Fairly easy using basic commands in a controlled environment, but I never thought I would be able to try FSD on a car so soon. So cool!
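(For fun, the camera half of that project really does fit in a few lines: threshold the dark line, take its centroid, steer toward it. All values are illustrative.)

```python
# Toy line follower: find the dark line's centroid and return a steering
# command in [-1, 1]. Threshold value is illustrative.
import cv2

def steering_from_frame(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)  # dark -> white
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return 0.0                      # line lost: hold straight (or stop)
    cx = m["m10"] / m["m00"]            # x centroid of the line pixels
    center = frame.shape[1] / 2
    return (cx - center) / center       # -1 = hard left, +1 = hard right
```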

Sorry if there is a lot of technical jargon. Computer vision is my thing.
 
Yeah, I think surrounding cameras should be enough, but depending on who is working out the solution, it may not be. Tesla having multiple front-facing cameras at different FOVs (wide, main, narrow, and two side front-facing) is pretty smart, considering object detection and recognition are scale-dependent.
There has been some (sensible) discussion about side-facing cameras mounted close to the headlights, so the car could see left/right at intersections without having to creep forward into them (better than even we can, since they would be several feet ahead of the seating position).
 
There has been some (sensible) discussion about side-facing cameras mounted close to the headlights, so the car could see left/right at intersections without having to creep forward into them (better than even we can, since they would be several feet ahead of the seating position).

It is so nice to have sensible discussions... this thread is pleasant and respectful, without people backing away from their thoughts to avoid bullying. I have learned a lot about camera capabilities in the last few days that makes me hopeful for an eventual night-time driving solution.

Please excuse the off topic thought :)

Keith
 
I ran probably 300 miles across TACC / FSD beta this weekend, with some unexpected complications. It seems as though some FSD beta items spill over into TACC even though one would expect they would not, i.e., TACC stopping for lights and stop signs even though that item is switched off in preferences. Also, a lot of the "phantom braking" I was experiencing was actually the car choosing to slow down because, according to the NAV, I was about to "miss a turn". This kind of thing is definitely new to me. I daily set NAV destinations only to get the ETA and arrival SOC, but I do not necessarily follow the route on the NAV. So when I chose to ignore an upcoming turn on the NAV, and without autopilot -- strictly TACC -- the car would slow down to 15 mph or so as I approached and passed the turn. It took me a bit to figure out it was related to the expected turn in the NAV. I hope this isn't future functionality; I might expect that to happen on autopilot, but with TACC???

Anyway after a weekend of FSD beta I switched it off to basically operate as a vision only / non FSD car. This does make a difference in behavior across the board (won’t get into it since this isn’t an FSD thread).

I've already identified some intersections, conditions, and roads where phantom braking is bad and 100% reproducible. I need some more time to evaluate, as day/night matters. This morning was my first commute back to work, and one section of road in the dark was ridiculous with the hesitation and phantom braking; I would get minor to moderate slowdowns pretty much every 30 seconds. Other sections were oddly problem-free even though they are all two-lane roads.

For the record, I consider any time the car slows down for no reason to be “phantom braking”. IMO it does not matter why the car slowed down, only that it did it for no real hazard.

I considered this morning, while this repetitive hesitation and braking was going on with TACC, that IMO these issues are contradictory: on the one hand, the "braking" is a safety feature, but on the other hand, you are warned that auto driving features are for "assistance" and you still must pay attention to the road. Why bother with "minor" slowdowns for questionable situations if the driver should see and react to these things anyway, because in theory they are paying as much attention to the road as the computer is? The car should really only react to immediate and significant hazards, not whatever minor thing it is seeing where it drops only a few mph.
 
I considered this morning, while this repetitive hesitation and braking was going on with TACC, that IMO these issues are contradictory: on the one hand, the "braking" is a safety feature, but on the other hand, you are warned that auto driving features are for "assistance" and you still must pay attention to the road. Why bother with "minor" slowdowns for questionable situations if the driver should see and react to these things anyway, because in theory they are paying as much attention to the road as the computer is? The car should really only react to immediate and significant hazards, not whatever minor thing it is seeing where it drops only a few mph.
I addressed a similar point here, but the few mph does make a difference in terms of perception. In terms of cost/benefit, they would rather have phantom slowing than another accident where someone might run to the media and complain about AP causing the crash (and then have NHTSA on their backs).
In terms of the effect on damage in an impact, maybe 5 mph doesn't matter, but in terms of reaction time it makes a significant difference, especially if the car is at the edge of its perception. A drop from 65 mph to 60 mph can give the car 2-3 more frames (given a 36 fps camera pipeline) before reaching a target, which can make a huge difference.

It's the same reason why Vision is still capped at 80 mph max vs 90 mph; that 10 mph does make a difference.
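Working that out with the 36 fps figure above (the distances are arbitrary examples):

```python
# Extra camera frames gained by slowing from 65 to 60 mph before reaching
# a target at a given distance.
MPH_TO_MS = 0.44704
FPS = 36

def frames_before_target(speed_mph, distance_m):
    return distance_m / (speed_mph * MPH_TO_MS) * FPS

for d in (20, 30):
    gained = frames_before_target(60, d) - frames_before_target(65, d)
    print(f"target {d} m out: {gained:.1f} extra frames")
# -> roughly 2.1 extra frames at 20 m, 3.1 at 30 m
```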
 
I took my first real drive with 2021.44.25 (daytime only, overcast, no shadows), and it might be better or it might be worse. Right off I had sudden braking (58 to 54 mph) because of a rubbish bin at the side of the road. That only happened once, so it might be a glitch. On two-lane roads in a right curve, any adjacent traffic on the left causes mild braking. This is largely unchanged. Any oncoming traffic cresting a rise, even a very slight rise, causes moderate braking. This seems worse. I had only a single mild braking incident on I-90, when I was passed on the left at 85 mph. Just the once, so that might also be a glitch.

So in my circumstance it remains annoying on two-lane roads of any width if there's any other traffic. My spouse would never use TACC because there are too many rules. By the way, some might think slowing from 58 to 54 mph is no big deal, and it isn't, unless it happens in less than a second and there's a klaxon going off in the cabin.
 
I learned my lesson the first time. I don't use TACC if anyone else is in the car.

And that is all the condemnation a technology needs for anyone with any common sense to know that it is not ready for prime time. When you have GOOD tech, you want to show it off to people. If my Galaxy Fold phone had broken at the hinge a week after I purchased it (and there was no ability to return it), I would never have shown it to anyone... as it is, I bust out the phone and open it up, anticipating the questions about my cool tech. The only people not impressed are Apple fanboys :D

Keith
 
Just to continue reporting on going from radar to vision via FSD beta (all of this is TACC only, with no autosteer or FSD beta engaged):

At this point I haven't had any real trouble with phantom braking. Almost all of my testing so far has been at night, or during the day in heavy rain. At night I am on two-lane roads with little to no lighting, so it is pretty dark. The car has mostly handled that well, with only a few slowdowns or braking incidents. I am actually surprised; that is much better than I would have expected.

I have had more trouble during the day; however, even that has been much better than I expected. I wonder if they just improved this in the firmware build I got when I was moved into the beta. Still, I believe that more than one braking incident a day is too much. There are not that many real hazards on any given day; you would expect to know something is wrong with your software if, on average, people are experiencing many "hazards" on every drive.

One annoying thing is this weird hesitation to resume the desired speed. This happens after a "phantom slowdown", or sometimes simply right after I activate TACC. The car may just sit there barely accelerating, +1 mph at a time, for a long time, then at some point it decides it is safe to accelerate or something. I would prefer a set amount of acceleration instead of this strange and inconsistent rate I have been getting. I have a 2014 Ford PHEV that always does "two bars" of acceleration regardless of when you activate cruise; it's very predictable.
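(Something like this is all I mean by a set amount of acceleration -- each control tick, move the commanded speed toward the set speed by at most a fixed step. The 2 mph/s rate is made up for illustration.)

```python
# Fixed-rate speed ramp: predictable acceleration toward the set speed.
def ramp_speed(current_mph, target_mph, max_accel_mph_s=2.0, dt_s=0.1):
    step = max_accel_mph_s * dt_s  # max speed change allowed this tick
    if target_mph > current_mph:
        return min(current_mph + step, target_mph)
    return max(current_mph - step, target_mph)
```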

I have had one 100% repeatable incident with a skid mark on the road that the car interprets as a person long enough to smash the brakes. You can see the "person" the car thinks is there in the on-screen visualization. This would be funny if I hadn't already seen an older video of someone else's vision-only Tesla doing this, meaning that Tesla still has not fixed it.

IMO the fact that the car is reacting to a "person" is part of the problem: TACC should only be concerned with detecting whether or not there is a car in front of you in order to adjust speed. No car object detected in front of you? Then maintain cruise speed. Period. Certainly extra safety features will be welcome when *they work* ... but while they are not working properly, they should not be included, or the feature should be optional. Hell, I am even game to help Tesla refine this. I would test it regularly to see if it worked and report problems, *as long as I could switch back to reliable cruise when I need it*. I am not against the extra safety features; they just can't come at the expense of regular driving.


I still intend to do more two-lane roads / extreme shadows + trucks, but that is harder to set up because there are only certain times of day and driving directions I can do that with. We aren't really getting sun right now ... however, I have already done a little of this (about 20 miles' worth) and so far haven't experienced any issues with long shadows on my side of the road, so that's good as well.
 
IMO the fact that the car is reacting to a "person" is part of the problem: TACC should only be concerned with detecting whether or not there is a car in front of you in order to adjust speed. No car object detected in front of you? Then maintain cruise speed. Period. Certainly extra safety features will be welcome when *they work* ... but while they are not working properly, they should not be included, or the feature should be optional. Hell, I am even game to help Tesla refine this. I would test it regularly to see if it worked and report problems, *as long as I could switch back to reliable cruise when I need it*. I am not against the extra safety features; they just can't come at the expense of regular driving.
I could not agree with you more. That is why the much simpler (and usually radar-based) ACC systems other automakers use work better than TACC these days: they're just trying to do one very specific thing. Modern TACC is built on software that's trying to do too much; it's not good enough at all of it, and the resulting behavior is not as predictable or trustworthy for the driver.

FSD should attempt to detect and stop for humans in the road, of course, but not plain TACC. I don't need or want TACC attempting to detect anything that's not a vehicle. It's just a driver assist system; I'm still paying attention; I just want it to relieve my right leg of the physical pain of holding the go pedal on long highway drives.
 