
AAA Report: Automatic Emergency Braking with Pedestrian Detection

Again, the video is not from Uber's high-definition cameras. It's a poor-quality generic dashcam with improperly adjusted contrast.

You can tell the contrast is wrong because the street lights on the right, on the left, and the other lights appear far dimmer than they would to the eye.

Even if the system's high-definition camera had not picked up the pedestrian, the NTSB said the system detected her 6 seconds before impact while traveling at 43 mph.

The NTSB didn't find the driver at fault, for good reason. You claim the accident happened on the side of the road and even falsely marked it.
That location is where the victim ended up afterward. It happened on the street. You can see the victim crossing the road in dark clothing. The picture you posted was taken on a different night and does not represent the conditions at the time of the accident, i.e., moonlight or overcast. Stop falsifying what happened. The video is right there showing the victim and the conditions at the time. If you run across the road at night in dark clothing, you will eventually get run over. It probably wouldn't even have mattered whether it was an autonomous vehicle or an elderly person behind the wheel.
 

Granted, not many human drivers would have picked that up, and the victim was being reckless. But it's curious why the radar/lidar (which I assume Uber has) didn't pick her up. She was straight dead ahead, and the car wasn't driving that fast.
 

Uber's radar and lidar did pick her up, but the automatic emergency braking was turned off. So even though the car detected the person, it did not apply the brakes. And since the safety driver was complacent, probably assuming the car would brake, the brakes weren't applied manually in time either. Hence the accident.
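To put that 6-second detection window in perspective, here's a back-of-the-envelope sketch in Python. The 7 m/s² deceleration is my assumption for hard braking on dry pavement, not a figure from the NTSB report:

```python
MPH_TO_MPS = 0.44704          # metres per second in one mile per hour

speed = 43 * MPH_TO_MPS       # ~19.2 m/s, the reported speed
detect_window = 6.0           # seconds between detection and impact
decel = 7.0                   # assumed hard-braking deceleration, m/s^2

distance_available = speed * detect_window   # ground covered in 6 s
braking_distance = speed ** 2 / (2 * decel)  # v^2 / 2a, full speed to stop

print(f"distance covered in 6 s: {distance_available:.0f} m")   # ~115 m
print(f"braking distance from 43 mph: {braking_distance:.0f} m")  # ~26 m
```

Even allowing a generous reaction delay on top of the ~26 m braking distance, the car had several times the room it needed to stop after the sensors first saw her, had AEB been enabled.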
 

Under the circumstances described in the original post, though, this scenario is unavoidable when a pedestrian steps out from between vehicles. Tesla can and will identify the object, but it can't identify recklessness. If the vehicle stopped every time it detected someone standing between vehicles, we would have a lot of phantom braking. At some combination of vehicle speed and required stopping distance, a collision simply cannot be avoided. Let's also point out that in every one of those tests, the pedestrian is breaking the law and jaywalking across the street. I am curious to see how well the Tesla does with a marked crosswalk.

Just goes to show how good FSD has to be to be much safer than a good human driver! A good human driver will see the bottom of those legs moving under the car, estimate the trajectory and the likelihood of the person entering the street, slow appropriately, and reduce the risk of collision. Humans are anticipatory, not reactive. Their main shortcomings are a reaction time of roughly 250 ms, rather than essentially instantaneous, and highly variable alertness. And they have only two eyes. But that crazy neural net certainly makes up for a lot of that!
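For a sense of what that 250 ms costs in distance, a quick sketch (the 250 ms figure is from the post above; the speeds are arbitrary examples, not from the AAA report):

```python
MPH_TO_MPS = 0.44704  # metres per second in one mile per hour

def reaction_distance_m(speed_mph: float, reaction_s: float = 0.25) -> float:
    """Distance travelled before the brakes are even touched."""
    return speed_mph * MPH_TO_MPS * reaction_s

for mph in (25, 43, 65):
    print(f"{mph} mph: {reaction_distance_m(mph):.1f} m of free travel in 250 ms")
```

At highway speeds that quarter-second of free travel is several car lengths shorter for a system that reacts near-instantaneously, which is exactly the margin the post is pointing at.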
 
The Uber's radar and lidar did pick it up. The Automatic Emergency Braking was turned off. So even though the car detected the person, it did not apply the brakes. And since the safety driver was complacent, probably assuming that the car would brake, he did not apply the brakes in time either. Hence, why the accident happened.

Aah, sorry I missed those details.
Not sure why all the shutdown and investigation.

To be honest, I worry about this in the Tesla: I think I have Autosteer on, but it canceled a moment ago and I forgot, and it's an "oh *sugar*, forgot to re-engage" moment. Partly because the car tracks the road so well that muscle memory thinks it's on.

Also having overconfidence when AutoSteer is on.
 

Fortunately now that all AP2+ cars have ELDA and LDA, between that and the "you recently disengaged AP and started drifting" warnings, they sure seem to be trying to prevent that.

The issue with Uber is that they clearly rushed a fast-and-loose variant of what Waymo has been doing. The same person was meant to be the safety driver AND the neural net annotator. Combine that with a car that was mainly just following HD maps and did not have much else enabled in terms of safety/collision detection, and that's a disaster waiting to happen. I bet they analyzed the situation, realized there was no viable way to prevent a recurrence of this incident without making dramatic changes, and thought it best to shut down.

That was the era of self-driving scams. Put enough spinning pucks, cameras, and weird sensors on a car, and the public believes you have a self-driving program.
 
...That location is where the victim ended up after. It happened on the street...

True.

...dark clothing...

True.

...Stop falsifying what happened. The video is right there showing the victim and conditions at the time...

The NTSB and police have recreated the accident scene repeatedly.

The dashcam video is a false representation of human eye perception because it is not from Uber's HD cameras.

That's falsifying the evidence, because police said an average driver could have seen the pedestrian from 637 to 818 feet away.
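Taking the 637–818 ft sight-distance figure together with the 43 mph speed mentioned earlier in the thread, the implied time available works out roughly as follows (my own arithmetic, not a figure from the police report):

```python
FT_PER_S_PER_MPH = 1.46667   # feet per second in one mile per hour

speed_fps = 43 * FT_PER_S_PER_MPH   # ~63 ft/s at 43 mph

for sight_ft in (637, 818):
    print(f"{sight_ft} ft of visibility -> {sight_ft / speed_fps:.1f} s to react")
```

Roughly ten seconds or more of sight line at that speed, which is why the dashcam's apparent last-second appearance is so misleading.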

...The NTSB didnt find the driver at fault...

I don't see how vindicating the human driver is relevant in this thread, which is about the success or failure of the automation as it went through a test.

To clarify:

Uber is not under the threat of criminal investigation, because its automation worked exactly as intended: the human driver is hired to override the automation at any time, and if that doesn't happen, a collision must be expected.

It's the human driver who is under the threat of criminal charges, because of the failure to apply the manual brake when the automatic brake didn't kick in.

Recently, Autopilot saved the lives of a family of bears, who weren't wearing bright fur, on a dark road with no street lights, by stopping a Model 3 Performance from 50 mph:


In summary, the clothing (or fur) factor is not a factor in this automation scenario, as the accident scene recreations by the NTSB and police have found.
 
The AAA report didn't mention which Tesla software version they tested, as far as I know. How do we know the results would be the same today, or even in a few months? Meanwhile, competitors typically release major ADAS software updates at model-year changes.
 

Very good point about the future. However, I still think these kinds of tests are valuable, so we can be aware of the current capabilities and not overestimate them.

For example, people got into garage crashes with Simple Summon, so I was looking forward to the new Smart Summon solving that problem:


But it has turned out that Smart Summon has the exact same garage-crash problem as Simple Summon!

I will just have to wait for the future a little longer.