AEB Won’t Prevent an Accident

Thanks Buster1 for telling me about the "Doppler Notch", that's a good descriptive term for what's going on.

The Tesla radar has a problem with roads that are strongly curved or with steep hills. What is needed next (after stopping reliably when confronted with stopped cars) is the ability to steer the beams with phasing to keep the focus on the path ahead.
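To illustrate what steering "with phasing" means, here is a rough back-of-the-envelope sketch (my own illustration with made-up array parameters, not Tesla's or Bosch's actual radar design): each antenna element gets a progressive phase offset, and that offset is what points the main beam off boresight, e.g. toward the inside of a curve.

```python
import math

def steering_phases_deg(n_elements, spacing_m, wavelength_m, steer_angle_deg):
    """Per-element phase shifts (degrees) that point a uniform linear
    array's main beam toward steer_angle_deg off boresight."""
    k = 2 * math.pi / wavelength_m          # wavenumber
    theta = math.radians(steer_angle_deg)
    return [math.degrees((k * spacing_m * math.sin(theta) * i) % (2 * math.pi))
            for i in range(n_elements)]

# Hypothetical example: 77 GHz radar (wavelength ~3.9 mm), half-wavelength element
# spacing, steering the beam 10 degrees toward the inside of a curve.
wavelength = 3e8 / 77e9
print(steering_phases_deg(n_elements=8, spacing_m=wavelength / 2,
                          wavelength_m=wavelength, steer_angle_deg=10))
```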

In the end I think Musk is correct that 4-way stereo vision is the way to go. I have driven for hours in blinding snowstorms... tiring, but it can be done. Lidar is probably useless in snow and rain.

If the cameras are good enough and there's enough processing power in the car to crunch it all, then vision should be better (someday).

Lidar/radar is a crutch. If the weather's bad and there's a person/deer/crate in the road, you still need to stop.
 
Passive optical sensors (including cameras) don't directly measure speed or distance. Spaced sensors (e.g. stereo cameras) can compare results and estimate distance if not too far away, like our eyes allow our brains to do. They deduce speed by observing how quickly the size of an object appears to be growing or shrinking.
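To make that concrete, here is a toy sketch of the basic stereo relationship (my own illustration with hypothetical camera numbers, not anyone's production code): depth falls out of how far an object shifts between the two spaced cameras.

```python
def stereo_depth_m(focal_px, baseline_m, disparity_px):
    """Depth from a rectified stereo pair: nearer objects shift more
    between the two images, i.e. have a larger disparity."""
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: 1000 px focal length, cameras mounted 30 cm apart.
for disparity in (40, 20, 10, 5):
    print(f"disparity {disparity:2d} px -> {stereo_depth_m(1000, 0.30, disparity):5.1f} m")
# A one-pixel disparity error barely matters up close but is huge at range,
# which is why passive stereo distance estimates degrade for far-away objects.
```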

Active sensors such as radar and lidar, though, directly measure distance and often also relative speed. They can be designed to be very accurate and yield results quickly. That is a real advantage at highway speeds when slow-moving or even stopped traffic is ahead.
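That advantage is easy to see with a little arithmetic (a sketch of my own, with made-up numbers): once range and range rate are measured directly, time to collision is a single division.

```python
def time_to_collision_s(range_m, range_rate_m_s):
    """Seconds until impact if the closing speed stays constant.
    range_rate is negative when the gap is shrinking."""
    if range_rate_m_s >= 0:
        return float("inf")   # not closing, no collision expected
    return range_m / -range_rate_m_s

# Ego car at ~65 mph (29 m/s) approaching stopped traffic 80 m ahead:
print(f"{time_to_collision_s(80.0, -29.0):.1f} s")  # roughly 2.8 s to detect, decide and brake
```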

However, radar and lidar don't detect the colors of things, and they operate at longer wavelengths than visible light. Passive optical can literally "see" lane markings, whether a traffic light is red, etc. Plus, the shorter wavelengths of visible light make it easier to distinguish what kind of object is being observed, compared with radar, whose radio waves have much longer wavelengths.
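For a sense of scale, here are rough back-of-the-envelope numbers for typical sensors (my own illustration, not any specific car's hardware):

```python
C = 3e8  # speed of light, m/s

radar_77ghz   = C / 77e9   # ~3.9 mm, typical automotive radar
lidar_905nm   = 905e-9     # ~0.9 µm, a common automotive lidar wavelength
visible_green = 550e-9     # ~0.55 µm, middle of the visible band

print(f"radar   : {radar_77ghz * 1e3:.2f} mm")
print(f"lidar   : {lidar_905nm * 1e6:.2f} µm")
print(f"visible : {visible_green * 1e6:.2f} µm")
print(f"radar wavelength is ~{radar_77ghz / visible_green:,.0f}x longer than visible light")
```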

So neither today's passive nor active sensors offer everything needed. Their capabilities complement each other. That is why we see vision plus either radar or lidar (sometimes both) being used.

Even when Elon Musk talks about his ultimate "global maximum" solution of applying AI to vision, he still includes radar. Between radar and lidar, he argues for radar because its signals use longer wavelengths than lidar's, and so they penetrate bad weather better. So his preference is vision and radar, rather than vision and lidar. He never talks about doing everything with vision alone, even using AI.
 
You are wrong. The physics for the radars you describe and for automotive radars is the same, but the implementations for automotive use are not as you imagine. For proof, see the following illustration from http://cdn.euroncap.com/media/1384/...13-0-a837f165-3a9c-4a94-aefb-1171fbf21213.pdf . These vehicles all responded to a stopped object ahead, including those with radar.

Also, my own experience since the 1990s with automotive radar is that no automotive radar system ignores stationary objects in the expected path. To reduce false alarms, many of them wait after the first detection for more confidence that the object is not overhead or outside the lane, but even those react before a collision. That is what defines today's "collision imminent braking", which very often uses radar.

Also, here in the U.S., the Insurance Institute for Highway Safety tests vehicles with automatic emergency braking, including braking for a stopped vehicle ahead, and many of those systems are radar-based -- and their tests show they work.
[Illustration: test results showing vehicles, including radar-equipped ones, responding to a stopped object ahead]
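The "wait for more confidence" behavior described above is simple to picture. Here is a toy sketch of the idea (my own illustration, not any vendor's actual logic): require a stationary return to persist in the expected path for a few radar frames before braking, so a single overhead sign or roadside reflection doesn't slam the brakes.

```python
def should_brake(detections, confirm_frames=3):
    """Toy confirmation gate: brake only once a stationary return has been
    seen in the expected path for several consecutive radar frames."""
    consecutive = 0
    for det in detections:                        # one dict per radar frame
        in_path = det["in_lane"] and not det["overhead"]
        consecutive = consecutive + 1 if in_path else 0
        if consecutive >= confirm_frames:
            return True
    return False

# A bridge joint seen once (overhead) vs. a stopped car seen frame after frame.
frames = [{"in_lane": True, "overhead": True},
          {"in_lane": True, "overhead": False},
          {"in_lane": True, "overhead": False},
          {"in_lane": True, "overhead": False}]
print(should_brake(frames))   # True: confirmed in-path, stationary target
```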
Word.

The 'aircraft pilot' above notes notch filtering, but his knowledge doesn't extend to point clouds for some reason. (At the very least he should have seen Black Mirror's 'Metalhead'.)

With version 8, Tesla added the updated Bosch drivers, which greatly enhance the radar's capabilities. It sees things in a 3D representation, and not only that, it compares successive images, so that if a beer can's concave bottom reflects strongly in one image, it reflects far less strongly in the next.
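In other words, something like a persistence check across frames. Here is a toy version of that idea (my own sketch, not Bosch's or Tesla's actual implementation):

```python
def persistent_targets(frames, min_hits=3):
    """Toy persistence filter: keep only range bins that return a strong echo
    in most successive radar frames. A beer can's concave bottom may glint
    brightly once, but a real car keeps reflecting frame after frame."""
    hits = {}
    for frame in frames:                 # each frame: set of strong range bins (metres)
        for rng in frame:
            hits[rng] = hits.get(rng, 0) + 1
    return sorted(r for r, n in hits.items() if n >= min_hits)

frames = [{42, 118}, {42}, {42, 77}, {42}]   # the 42 m target persists; the others flicker
print(persistent_targets(frames))            # [42]
```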

Sure, the radar has to ignore the world rushing toward it. But it also sees what is directly in front of it, and that's the main thing that matters: what is directly in front of you. It is beyond me why AEB isn't better. Some say that it can't know if we are about to swerve, but if we do (or brake), it immediately disengages automatic control. Jason's accident wasn't preventable with version 7, but these recent accidents should have been preventable with v8.

And as to the poster above who referenced the "CID" -- that is the Tegra processor VCM daughtercard, which handles all in-car functions and mediates between the Mobileye (AP1) and CAN buses. It's what we 'root'.

Don't make excuses for not having better AEB. Other car makers manage to do it. This is something we need to have.

If AEB causes the car behind to hit you, I'm sorry that life isn't a bucket of roses. The law makes that their fault.


Passive optical sensors (including cameras) don't directly measure speed or distance. Spaced sensors (e.g. stereo cameras) can compare results and estimate distance if not too far away, like our eyes allow our brains to do. They deduce speed by observing how quickly the size of an object appears to be growing or shrinking.

Right again. Tesla AI looks for things like the tail lights in front of you growing farther apart (indicating, of course, that you're getting closer... not that the car in front of you is growing to gigantic size). We as humans don't remember learning this long ago, but we did.
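That cue even has a classical form: you can estimate time to contact purely from how fast something "looms", with no absolute distance at all. A toy sketch (my own, with made-up pixel numbers, not Tesla's network):

```python
def time_to_contact_s(width_px, growth_px_per_s):
    """Classic 'looming' estimate: tau = apparent size / rate of growth.
    No absolute distance needed, only how fast the image of the lead car
    (e.g. its tail-light spacing) is expanding."""
    if growth_px_per_s <= 0:
        return float("inf")              # not closing
    return width_px / growth_px_per_s

# Tail lights span 80 px and that span grows by 20 px per second:
print(time_to_contact_s(80.0, 20.0))     # ~4 s to contact at the current closing rate
```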

BTW, there is no "Doppler notch". There is the aircraft notch maneuver, which is a whole other pile of bananas. Notch filtering refers to rejecting a narrow band of frequencies while passing everything else.
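To make the filtering idea concrete (the behavior people upthread were loosely describing), here is a toy sketch, entirely my own and not any production radar's code, of rejecting a narrow band in the velocity domain: returns whose over-the-ground speed is near zero once the ego car's own speed is accounted for.

```python
def reject_stationary(targets, ego_speed_m_s, notch_width_m_s=2.0):
    """Toy velocity-domain notch: after compensating for the ego car's own
    speed, suppress returns whose over-the-ground speed is near zero (signs,
    bridges, parked cars) and pass the moving traffic. A notch rejects a
    narrow band; it does not pass one band to the exclusion of all others."""
    kept = []
    for t in targets:
        ground_speed = ego_speed_m_s + t["rel_vel_m_s"]   # rel_vel < 0 means closing
        if abs(ground_speed) > notch_width_m_s:
            kept.append(t)
    return kept

# Ego car at 28 m/s: a lead car doing 20 m/s survives, a stopped car does not --
# which is exactly why braking for stationary objects is the hard case.
targets = [{"range_m": 45, "rel_vel_m_s": -8.0},    # lead car moving at 20 m/s
           {"range_m": 80, "rel_vel_m_s": -28.0}]   # stopped car / overhead sign
print(reject_stationary(targets, ego_speed_m_s=28.0))
```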
 
Don't make excuses for not having better AEB. Other car makers manage to do it. This is something we need to have.

People keep saying this, but which system do you prefer? I don't think people even know the limitations of the other systems. Two of the widely accepted best are Volvo's and Subaru's. Volvo's AEB shuts off at speeds greater than ~50 mph. Subaru's system won't activate if the speed differential is greater than ~30 mph. NO system is designed to stop a car from running into a stationary object at 60 mph.

The technology just isn't there yet.
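A little physics shows why. Required stopping distance grows with the square of speed, so the 60 mph stationary-object case is a far harder ask than the low-speed city scenarios these systems are rated for. Rough numbers (my own back-of-the-envelope sketch, with an assumed system latency and an assumed hard deceleration of 8 m/s^2):

```python
def stopping_distance_m(speed_mph, reaction_s=0.3, decel_m_s2=8.0):
    """Distance needed to stop: system latency plus braking at an assumed
    hard deceleration. d = v * t_react + v^2 / (2 * a)."""
    v = speed_mph * 0.44704                      # mph -> m/s
    return v * reaction_s + v * v / (2 * decel_m_s2)

for mph in (30, 50, 60, 70):
    print(f"{mph} mph -> {stopping_distance_m(mph):5.1f} m to a full stop")
# Roughly 15 m from 30 mph but roughly 53 m from 60 mph: the quadratic term is
# why braking to a stop for a stationary object from highway speed is so hard.
```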
 
Don't make excuses for not having better AEB. Other car makers manage to do it.

Please tell us which makers are doing it.


Certainly not BMW.

Driving Assistant combines the camera-based systems Lane Departure Warning and Collision Warning. The Approach and Pedestrian Warning with City Brake Activation warns of collisions with vehicles or pedestrians at speeds between 10 and 60 km/h (6 and 37 mph) and brakes in an emergency.
BMW ConnectedDrive : Driver Assistance
 
People keep saying this, but which system do you prefer? I don't think people even know the limitations of the other systems. Two of the widely accepted best are Volvo's and Subaru's. Volvo's AEB shuts off at speeds greater than ~50 mph. Subaru's system won't activate if the speed differential is greater than ~30 mph. NO system is designed to stop a car from running into a stationary object at 60 mph.

The technology just isn't there yet.
Some pieces of the technology are there, and improvements are coming. The video below, starting at its 6:10 point, shows a Tesla responding to a stopped truck, even though the truck is only partly blocking its lane. It's not clear whether the truck was ever detected by the radar before the braking (it wasn't shown in the instrument panel beforehand, anyway). If it was not, then this is an example of Tesla's fleet learning for radar, introduced with software 8.0, giving enough confidence about radar stationary-object detection to allow braking to begin immediately upon radar detection. That Tesla was only traveling at 30 mph, but at any speed at which its automatic emergency braking is operational, it should have triggered at the same distance.

 


On re-reading, I see I was not clear enough above. I can't edit that post again, so here's a clarification. I won't be surprised if Tesla's technology continues to improve, but "improvements are coming" was meant to refer to the entire industry, not just Tesla.