
AAA Report: Automatic Emergency Braking with Pedestrian Detection


diplomat33
AAA released its report on automatic emergency braking systems with pedestrian detection. They tested the 2019 Chevy Malibu, 2019 Honda Accord, 2019 Tesla Model 3, and 2019 Toyota Camry in a variety of scenarios, and they found that the systems were not good enough.

Here are their key findings:

1. When encountering an adult pedestrian in a perpendicular crossing scenario:
a. Each test vehicle provided visual notification of an impending collision during each test run conducted at 20 mph.
i. In aggregate, a collision with an adult pedestrian target was avoided 40% of the time
ii. In an additional 35% of runs, the collision was mitigated by an average speed reduction of 4.4 mph

b. At 30 mph, three out of four test vehicles failed to reduce the impact speed by at least 5 mph during the initial test run.

2. Evaluated pedestrian detection systems were significantly challenged in the following scenarios:
a. When encountering a child pedestrian at 20 mph, a collision was avoided 11% of the time in aggregate. In an additional 25% of runs, the collision was mitigated by an average speed reduction of 5.9 mph.
b. When encountering a pedestrian immediately after a right curve, none of the test vehicles mitigated the impact speed during any of the five test runs.
c. When encountering two pedestrians alongside the roadway at 20 mph, a collision was avoided 20% of the time in aggregate. In an additional 35% of runs, the collision was mitigated by an average speed reduction of 3.4 mph.

3. Evaluated pedestrian detection systems were ineffective during nighttime conditions.


Attached is the PDF of the full report if you want to dive into the details.
 

Attachments

  • Research-Report-Pedestrian-Detection.pdf
    2.3 MB
...b. When encountering a pedestrian immediately after a right curve, none of the test vehicles mitigated the impact speed during any of the five test runs...

I don't see how Tesla FSD will be a reality until it can pass this kind of AAA test (as of now, 100% failures for all cars).

The "b" scenario is extremely very common: A turn to the right at a green light where pedestrians have the right of way to cross the road.

 
...I don't see how Tesla FSD will be a reality until it can pass this kind of AAA test...

Well, by definition, it would not be reliable FSD if it can't pass this test. But I believe Tesla can solve it with a good vision NN. The wide and side cameras can see the pedestrian, so if the NN is good enough to track the pedestrian and recognize that the pedestrian's path might intersect the car's path as it makes the turn, the car will be able to avoid hitting the pedestrian.
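To make that concrete, here is a minimal sketch in Python of the kind of path-conflict check I mean. Everything in it (the function names, the constant-velocity pedestrian model, the idealized circular turn arc, the clearance threshold) is an illustrative assumption of mine, not Tesla's actual method:

```python
import math

# Toy path-conflict check: does a pedestrian's predicted path come too close
# to the car's planned right-turn arc? Constant-velocity prediction and a
# circular arc are simplifying assumptions for illustration only.

def predict_pedestrian(pos, vel, t):
    """Constant-velocity prediction of pedestrian position (meters) at time t."""
    return (pos[0] + vel[0] * t, pos[1] + vel[1] * t)

def car_turn_position(radius, speed, t):
    """Car position along a right-turn arc of the given radius (meters),
    starting at the origin heading +y and curving toward +x."""
    theta = speed * t / radius  # arc angle swept after t seconds
    return (radius * (1.0 - math.cos(theta)), radius * math.sin(theta))

def paths_conflict(ped_pos, ped_vel, radius=8.0, speed=5.0,
                   horizon=4.0, dt=0.1, clearance=1.5):
    """True if the pedestrian and car are ever predicted within `clearance` meters."""
    for i in range(int(horizon / dt) + 1):
        t = i * dt
        px, py = predict_pedestrian(ped_pos, ped_vel, t)
        cx, cy = car_turn_position(radius, speed, t)
        if math.hypot(px - cx, py - cy) < clearance:
            return True
    return False

# Pedestrian crossing right-to-left over the turn exit at walking speed:
print(paths_conflict(ped_pos=(7.0, 7.5), ped_vel=(-1.4, 0.0)))  # True -> yield/brake
```

The hard part, of course, is not the geometry; it's getting detections and velocity estimates from the NN that are reliable enough to feed a check like this.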
 
This test is a little misleading. Unless a vehicle brakes the moment it sees a pedestrian between two parked cars, it will never have enough time to react. If you think we see a lot of phantom braking now, imagine what will happen if cars start braking at every person waiting to cross the street.
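Some back-of-the-envelope numbers make the point. The reaction time and braking deceleration here are generic textbook values I'm assuming, not anything from the AAA report:

```python
# A pedestrian steps out from between parked cars 30 ft ahead of a car
# doing 20 mph. Generic assumptions: 1.5 s perception-reaction time and
# about 0.7 g of braking.
MPH_TO_FTS = 5280 / 3600      # 1 mph = 1.4667 ft/s

speed_fts = 20 * MPH_TO_FTS   # ~29.3 ft/s
gap_ft = 30.0                 # distance at which the pedestrian becomes visible
reaction_s = 1.5              # perception-reaction time
decel_ftss = 0.7 * 32.2       # ~22.5 ft/s^2

time_to_pedestrian = gap_ft / speed_fts                                  # ~1.0 s
stopping_dist = speed_fts * reaction_s + speed_fts**2 / (2 * decel_ftss)

print(f"time until reaching the pedestrian: {time_to_pedestrian:.2f} s")
print(f"distance needed to stop:            {stopping_dist:.0f} ft")     # ~63 ft
```

About one second to impact versus roughly 63 ft needed to stop: the collision is physically unavoidable unless the car brakes before the pedestrian is even visible, which is exactly the phantom-braking tradeoff.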

Pedestrians need to pay more attention and stop looking at their phones when walking across the road.

Look at the last person who got hit by an autonomous vehicle: no reflective clothing, late at night, walking in the road. A little common sense goes a long way.
 
...Look at the last person who got hit by an autonomous vehicle... A little common sense goes a long way...

I don't think what that homeless lady wore was relevant, because Uber's automation detected her just fine from far away with its LIDAR, regardless of her clothing.

She died because the safety driver didn't know that, although all the sensors were working fine as displayed on the dashboard, the automation was disconnected from the braking system, so manual braking was required in this scenario.
 
...AAA released its report on automatic emergency braking systems with pedestrian detection... They found that the systems were not good enough...

I saw this last week, and it was interesting to compare with the Euro NCAP test results. (I don't know the exact NCAP test details or results, or how they differ from AAA's, but I do know it passed NCAP, stopping for pedestrians and cyclists reliably; they did not show any collisions, if any occurred.)

Without knowing how different the tests really were (obviously there are a lot of differences), to me it looks like a case of designing the vehicle to pass the required testing authority's test, which means it's not very good at handling other, similar scenarios. The vehicle is designed to the test.

Hopefully it will improve. It's still a far cry from an attentive human.
 
...Pedestrians need to pay more attention and stop looking at their phones when walking across the road...
So... blame the victim?
I agree that people should not be idiots.
But when one acts like an idiot, the purpose of the safety system is to prevent them from paying for that mistake with their lives.
It can happen to any of us. Training the victim is not the solution.

The obvious human-style middle ground here is to slow down when there's uncertainty. It reduces the magnitude of a potential collision, increases the time available to evaluate the probability, and decreases the stopping distance if the object is determined stop-worthy. It's the reason we slow down as we drive past a kid playing kickball in a driveway. I'd much rather be 2 seconds late for work than dismember someone whose phone vibrated as they entered the crosswalk.
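As a toy sketch of what that policy could look like (the confidence thresholds and the linear ramp below are arbitrary choices of mine, not any real vehicle's controller):

```python
# Toy "slow down under uncertainty" policy: scale the commanded speed by how
# confident perception is that the path ahead is clear. Thresholds and the
# linear ramp are arbitrary illustrative choices.
def target_speed(cruise_mph: float, clear_confidence: float) -> float:
    """Map confidence (0..1 that the path is clear) to a commanded speed."""
    crawl_mph = 5.0
    if clear_confidence >= 0.9:   # confident: hold cruise speed
        return cruise_mph
    if clear_confidence <= 0.5:   # doubtful: crawl, ready to stop
        return crawl_mph
    frac = (clear_confidence - 0.5) / 0.4   # linear ramp in between
    return crawl_mph + frac * (cruise_mph - crawl_mph)

# Kid near the driveway: confidence dips, so the car sheds speed while it
# gathers more frames to decide whether the object is stop-worthy.
print(target_speed(30.0, 0.95))  # 30.0
print(target_speed(30.0, 0.70))  # 17.5
```

Shedding speed buys observation time almost for free, which is why human drivers do it instinctively.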
 
...So... blame the victim? ...Training the victim is not the solution...

It absolutely was part of the problem. The victim was walking in the road in the middle of the night in dark clothing; she was not on the sidewalk or in a crosswalk. Play stupid games, win stupid prizes.
 
...I'd much rather be 2 seconds late for work than dismember someone whose phone vibrated as they entered the crosswalk...

If they are in a crosswalk (phone or no phone), you're in big trouble.

I'm all for these systems protecting people from stupid acts, not as a crutch but as a backup. Eventually they will be better than humans, probably not too long from now.
 
I have no idea how advanced FSD is at this point, and neither does anyone else. Only a very small number of Tesla cars have the latest AP hardware (the hardware required for FSD to work), and even those of us who do have that hardware are, as I understand it (I'm not sure anymore), running software designed for the previous AP version through an emulator. I don't have a source for that, but in any case, none of us are running the latest FSD build in an optimal mode on any car.

That said, I have no doubt that there are huge obstacles left to overcome, and even once the software and NN are basically feature-ready, there will be a year or more of fine-tuning and further data collection once everyone is running the full FSD software on the correct hardware. I'm just saying that the behaviour of the current fleet is not necessarily an accurate indicator of FSD progress. I'm interested to see if Tesla responds to this and makes any changes to their software as a result.
 
...The victim was walking in the road in the middle of the night in dark clothing; she was not on the sidewalk or in a crosswalk...
According to the NTSB's preliminary report, none of those reasons above was a factor because:

"According to data obtained from the self-driving system, the system first registered radar and LIDAR observations of the pedestrian about 6 seconds before impact, when the vehicle was traveling at 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path. At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision (see figure 2). 2 According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator."

So at 6 seconds before impact, the system had detected the pedestrian while the car was traveling at 43 mph.

43 mph divided by 3,600 seconds per hour is 0.0119 miles per second.

0.0119 miles per second × 6 seconds = 0.072 miles, or about 378 feet.
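For anyone who wants to check the arithmetic, here it is in Python, along with a rough stopping-distance estimate. The reaction time and deceleration are generic assumptions of mine, not figures from the NTSB report:

```python
# Reproduce the distance arithmetic, plus a rough stopping-distance check.
MPH_TO_FTS = 5280 / 3600                 # 1 mph = 1.4667 ft/s

speed_fts = 43 * MPH_TO_FTS              # ~63.1 ft/s
detect_dist = speed_fts * 6.0            # distance covered in the 6 s of warning

reaction_s = 1.5                         # assumed perception-reaction time
decel_ftss = 0.7 * 32.2                  # assumed ~0.7 g braking, ~22.5 ft/s^2
stop_dist = speed_fts * reaction_s + speed_fts**2 / (2 * decel_ftss)

print(f"detection distance: {detect_dist:.0f} ft")  # ~378 ft
print(f"stopping distance:  {stop_dist:.0f} ft")    # ~183 ft
```

Roughly 378 feet of warning against roughly 183 feet needed to stop from 43 mph: ample margin, if anything had actually applied the brakes.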

At 378 feet away, the system had already detected the pedestrian. It was not a sensor problem, nor the pedestrian's problem. The software was disconnected from the braking system, and the safety driver didn't know that fact, so she wasn't prepared to brake manually.

The investigators recreated the accident scene at night and had no problem avoiding the collision, because there was more than enough time and distance for manual braking, and there were plenty of street lights for human drivers and Uber's high-definition cameras (not a poor-quality generic dashcam) to spot a pedestrian.

As you can see, the street lights were bright enough for human eyes to count how many lanes there are and to tell whether the lanes stay the same or widen on both sides all the way to the next block (the red circle marks where the collision happened):

[nighttime photo of the accident scene]
 
...At 378 feet away, the system had already detected the pedestrian. It was not a sensor problem, nor the pedestrian's problem...

Here's the video of the accident. You wasted how much time writing that B.S.?
[screenshot of the accident video]


 
...video of the accident

Again, that video is not from Uber's high-definition cameras. It's from a poor-quality generic dashcam with improper contrast.

You can tell the contrast is wrong because the street lights on the right, on the left, and elsewhere appear far dimmer than they actually were.

Even if the system's high-definition camera did not pick up the pedestrian, the NTSB said the system detected her 6 seconds before impact while traveling at 43 mph.