Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Any details on headline - Arizona pedestrian is killed by Uber self-driving car

The other video you haven't found, also linked in this thread, was taken by someone else driving the same stretch of road at night with their phone. It shows that the police-released video is quite deceptive about how dark that area actually is. It's a typical urban road, lit up like a Christmas tree. Something is VERY whack with the gamma on the Uber video the police released.

I don't see off-hand where that is, but here's a still shot of the area.

P.S. The driver isn't a "he".

Thank you for pointing that out to me! Whoa, that street really isn't that dark and the visibility is great.
 
If only they came forward when she was homeless.

Probably not a fair comment without knowing the circumstances. The victim had had several drug arrests over the past few years, six or more I think, and may have had other issues such as mental illness. People often try to help out family members, but especially when drugs are involved that isn't always possible. From what I've read and seen in news stories, many homeless people prefer living on the street to a shelter, or even to a family home where interpersonal issues come into play. We also don't know whether some of the family had been checking in on her and bringing her things or food she needed. I did read a comment from a friend of hers that she had recently been trying to find a job.
 
Interesting Opinion criticizing weaker Arizona Autonomous Vehicle rules.

Also, it cites a statistic of roughly 1 fatality per 100 million miles of human driving versus 1 per 10 million miles of autonomous-vehicle driving.

By those numbers, human drivers are currently about 10 times safer per mile than autonomous vehicles.
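As a quick sanity check on that arithmetic (the 100-million and 10-million mile figures are the article's, not mine):

```python
# Fatality rates quoted in the linked opinion piece (approximate figures).
human_miles_per_fatality = 100_000_000   # ~1 fatality per 100M miles, human drivers
av_miles_per_fatality = 10_000_000       # ~1 fatality per 10M miles, AV test fleets

# Convert each to fatalities per mile and compare.
human_rate = 1 / human_miles_per_fatality
av_rate = 1 / av_miles_per_fatality

ratio = av_rate / human_rate
print(f"AV fatality rate is {ratio:.0f}x the human rate per mile")  # → 10x
```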

Updated: link corrected

Ducey's Drive-By: How Arizona Governor Helped Cause Uber's Fatal Self-Driving Car Crash
 

Watch Video of Scottsdale Victims Minutes Before Fatal Plane Crash on Golf Course

It was an automated plane? (or is that the wrong link?)

How many autonomous fatalities have there been?
 
Another interpretation with the same conclusion:

Uber’s self-driving software detected the pedestrian in the fatal Arizona crash but did not react in time

That means both the sensors and the software detected the pedestrian, but the settings chosen by the programmers allowed the accident to happen anyway.

Had they not tuned the software that way:
"Those rides can be clumsy and filled with hard brakes as the car stops for everything that may be in its path."

It sounds like Uber has chosen smoothness and comfort over safety.
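The kind of comfort-vs-safety tuning knob being described can be sketched in a few lines. This is entirely hypothetical illustration, not Uber's actual code; the function name and thresholds are invented:

```python
# Hypothetical sketch of a planner's comfort-vs-safety trade-off.
# None of this is Uber's actual software; names and thresholds are invented.

def plan_braking(obstacle_confidence: float, time_to_collision_s: float,
                 emergency_braking_enabled: bool) -> str:
    """Return a braking decision for one detected obstacle."""
    if obstacle_confidence < 0.5:
        return "ignore"          # low-confidence detections are filtered out
    if time_to_collision_s > 4.0:
        return "coast"           # far enough away: take no action yet
    if not emergency_braking_enabled:
        return "no_action"       # hard braking suppressed for ride smoothness
    return "emergency_brake"

# A confidently detected obstacle 1.3 s out gets no braking at all when
# emergency braking has been tuned off for comfort:
print(plan_braking(0.9, 1.3, emergency_braking_enabled=False))  # no_action
print(plan_braking(0.9, 1.3, emergency_braking_enabled=True))   # emergency_brake
```

The point is that a single configuration flag, chosen to avoid "clumsy" hard brakes, can zero out the safety response entirely.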
 

Maybe smoothness, but I'm still going with the leading and trailing wheel spokes (along with the small tube cross-section) causing the rain/snow filter to activate, and thus 'seeing' through the pedestrian to the road beyond.
 

That's a good explanation and theory.

However, according to the articles, there's nothing wrong with the sensors.

It's the immature Uber software that isn't advanced enough to classify a pedestrian or bicycle as an important obstacle to avoid.

LIDAR sees in three dimensions. It can measure that a pedestrian or bicycle has thickness, which lets software engineers write code that treats it as a solid obstacle to avoid, rather than a paper-thin flat object that could be driven over.

On the other hand, for years other companies like Waymo have been able to recognize and classify many objects, including the pedestrians and bicyclists below:


[Image: leddartech-lidar.jpg — LIDAR point cloud with classified pedestrians and cyclists]
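The thickness argument can be sketched concretely. This is a toy illustration; the 20 cm threshold and the point coordinates are invented, not taken from any real system:

```python
# Toy illustration of "LIDAR sees thickness": estimate an object's depth
# extent from a cluster of 3-D return points. Purely illustrative numbers.

def depth_extent(points):
    """Depth (range) spread of a cluster of (x, y, z) LIDAR returns, in meters."""
    xs = [p[0] for p in points]  # x = range along the sensor's line of sight
    return max(xs) - min(xs)

# A pedestrian-with-bicycle cluster has real depth; a flat sign does not.
pedestrian = [(10.0, 0.1, 1.2), (10.4, 0.0, 0.9), (10.7, -0.1, 0.5)]
flat_sign  = [(10.0, 0.1, 1.2), (10.0, 0.0, 0.9), (10.01, -0.1, 0.5)]

for name, cluster in [("pedestrian", pedestrian), ("flat sign", flat_sign)]:
    solid = depth_extent(cluster) > 0.2   # arbitrary 20 cm thickness threshold
    print(name, "-> solid obstacle" if solid else "-> thin/flat object")
```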
 

My thinking is it's the application-level processing of the LIDAR data, not the sensor itself or its raw output. Assuming rain/snow handling is done outside the raw point cloud, the Uber software could have ignored the region around the pedestrian because of the near/far return noise from hits on the spokes and rims. In your image, the bicyclist's tires are barely there and the spokes not at all. Excessive squelch could blank the region.
Just a guess.
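That squelch guess can be made concrete with a small sketch. Everything here is hypothetical — the 1.5 m spread threshold and the range values are invented to show the failure mode, not to describe any real filter:

```python
# Sketch of the "excessive squelch" theory: a rain/snow filter that discards
# clusters whose returns are noisy (big spread between near and far returns),
# the way a spoked wheel might look when some beams pass through the gaps.

def squelch(clusters, max_range_spread_m=1.5):
    """Keep only clusters whose return ranges are tightly grouped."""
    kept = []
    for ranges in clusters:
        if max(ranges) - min(ranges) <= max_range_spread_m:
            kept.append(ranges)
    return kept

wheel_region = [9.8, 10.1, 14.0, 9.9, 13.8]   # mix of spoke hits and road
                                               # returns seen through the gaps
wall = [12.0, 12.1, 12.05]                     # tight cluster: a solid surface

print(len(squelch([wheel_region, wall])))  # only the wall survives → 1
```

With an aggressive enough threshold, the very signature of a spoked bicycle wheel — interleaved near and far returns — is what gets it classified as noise.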
 

A sample size of 1 is hardly meaningful, and there is some question whether Uber was testing the system without LIDAR input. They would save a lot of money if they could drop the LIDAR.

If you count automobile occupant deaths and injuries, right now AV is winning. It doesn't get drunk and run into trees very often.
 
Preliminary report from NTSB is out for this crash: https://www.ntsb.gov/investigations/AccidentReports/Reports/HWY18MH010-prelim.pdf

This is a good reminder that folks operating experimental stuff on the roads shouldn't be allowed to more-or-less entirely self-regulate. The way Uber set up its test cars in Arizona was absolutely insane, and bordering on criminal:

"The vehicle was factory equipped with several advanced driver assistance functions by Volvo Cars, the original manufacturer. The systems included a collision avoidance function with automatic emergency braking, known as City Safety, as well as functions for detecting driver alertness and road sign information. All these Volvo functions are disabled when the test vehicle is operated in computer control but are operational when the vehicle is operated in manual control.

According to Uber, the developmental self-driving system relies on an attentive operator to intervene if the system fails to perform appropriately during testing. In addition, the operator is responsible for monitoring diagnostic messages that appear on an interface in the center stack of the vehicle dash and tagging events of interest for subsequent review.

* * *

According to data obtained from the self-driving system, the system first registered radar and LIDAR observations of the pedestrian about 6 seconds before impact, when the vehicle was traveling at 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path. At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision (see figure 2). According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.
 
According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.”

Those three facts together -- wow what a stupid way to test their system.
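For scale, a rough back-of-envelope on the report's own timeline, assuming roughly 7 m/s² of firm braking (a typical dry-pavement figure, my assumption, not the NTSB's):

```python
# Back-of-envelope on the NTSB preliminary report's numbers: first radar/LIDAR
# detection ~6 s before impact at 43 mph; braking decision 1.3 s before impact.
mph_to_ms = 0.44704
v = 43 * mph_to_ms                 # ≈ 19.2 m/s

d_first_detect = v * 6.0           # distance at first detection ≈ 115 m
d_brake_decision = v * 1.3         # distance at the braking decision ≈ 25 m

# Stopping distance under firm braking (~7 m/s^2, an assumed dry-road value):
d_stop = v**2 / (2 * 7.0)          # ≈ 26 m
print(round(d_first_detect), round(d_brake_decision), round(d_stop))
```

So at first detection the car had well over 100 m to work with, and even at the 1.3-second decision point it was close to a full stopping distance away — braking then would have at least drastically reduced the impact speed.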
 