Welcome to Tesla Motors Club

Any details on headline - Arizona pedestrian is killed by Uber self-driving car

This is an odd road layout.
Green lines are a bicycle lane, compare to picture in above post.
Blue circles are street lights I can easily ID.
View attachment 288126

That's a good view with annotation; however, it's missing the other street light in the same vicinity, across from the one on the NB N Mill Ave lanes where the accident occurred. It's located in the center median area at the left roadway, a bit north and some distance from the large tree. It extends out over the left-turn roadway a bit, so it wouldn't be obscured by the tree canopy; you don't see a shadow from it due to the position of the sun. There's also another street light across from the one circled in the lower right of the photo. It too is in the median area at the paved walkway and illuminates the road like the one across from it. You can see it in the Google Maps link below if you rotate the photo.

It would seem like she picked a location to cross where there was lighting. Unless things have changed a lot since the July 2017 image of the roadway was taken, it doesn't appear to me that there was a lot of tall shrubbery that would have hidden her. Actually in this view the landscaping looks pretty "desert sparse".

Here's a street view of both street lights on opposite sides of the road where the accident happened:

Google Maps

Here's another perspective, overhead, of the roadways and center median area.

Google Maps
 
Thanks @SMAlset for the Google Map link.

I put my notes on it.

[Attached image: QSBTAPk.jpg, annotated view of the roadway]


Further down the road, lane #5 becomes a bicycle lane and a right-turn lane.

Suppose the pedestrian was not crossing but was riding within the bicycle lane: would Uber's car have known how to avoid running over a bicyclist further down the road?

In San Francisco, bicyclists witnessed that Uber's autonomous vehicles had problems with bicycles: when making a right turn, the cars would "right hook" across the bicycle lane, cutting across it instead of merging into it first.

Anyhow, I am not trying to give the autonomous-vehicle industry a bad name. I just want it to be transparent, to identify problems, and to fix them.

It's true that pedestrians should use crosswalks, but when we introduce technology into the scenario, we need to find out why it couldn't save a human life.

It's the same way that people shouldn't jump off a cliff, fall, and get killed. However, when technology allows jumpers to wear a parachute and land safely, we expect that's what a parachute does: it saves lives. If a parachute doesn't open and a jumper dies, we need to find the cause and fix it.

Waymo could have recognized this pedestrian scenario two or three football fields away, so we need to find out what went wrong with Uber!
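For a rough sense of scale, here's a back-of-envelope Python calculation. The speed, reaction time, deceleration, and lidar range are all my own illustrative assumptions, not figures from the investigation:

```python
# Back-of-envelope: stopping distance vs. detection range.
# All numbers below are illustrative assumptions, not investigation data.

speed_mph = 40.0
speed_mps = speed_mph * 0.44704          # ~17.9 m/s

reaction_time_s = 0.5                    # generous for an automated system
decel_mps2 = 7.0                         # firm braking on dry pavement

reaction_dist = speed_mps * reaction_time_s
braking_dist = speed_mps ** 2 / (2 * decel_mps2)
total_stop = reaction_dist + braking_dist

lidar_range_m = 100.0                    # a typical automotive lidar spec
football_field_m = 91.44                 # 100 yards

print(f"stopping distance: {total_stop:.0f} m")          # ~32 m
print(f"lidar range covers {lidar_range_m / total_stop:.1f}x that")
print(f"'two football fields' = {2 * football_field_m:.0f} m")
```

Even with these rough numbers, a 100 m detection range leaves roughly three times the distance needed to stop from ~40 mph, which is why the lack of any braking at all stands out.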
 
I'm glad the NTSB was called into this; Uber's self-driving tech deserves intense scrutiny, just as Joshua Brown's Tesla accident in Florida did. Given that this happened on what is essentially a five-lane, wide-open, straight road in ideal weather conditions, it shouldn't be an edge case for the car at this point in testing. The NTSB will also be better equipped than the local police to understand the capabilities of the vehicle and to review the car's software, as well as the view of the safety driver's head/eye movement right before impact.

I just saw this news story pop up on the Daily Mail* about the various traffic violations the safety driver has on his record. It includes a photo of his tickets and more photos of the area. I haven't had to take a car service, but I have to say I'm not really comfortable with Uber's hiring policy and background checks. If I'm not mistaken, cab company drivers have to go through far more stringent checks, and I really don't see why companies like Uber shouldn't as well.

The article (*forewarning: the DM features scantily clad women; I usually like to link to different news sources but didn't see any listed in references) also has several new or contradicting items compared with what we've previously heard. One, the safety driver said she tried to brake. Two, it says police have not said which side of the car the pedestrian was approaching from. That directional information directly contradicts the info presented in the press conference by Sgt. Ronald Elcock. Watch the video here (start listening at 0:22): Tempe police talk about self-driving Uber accident. So was the direction of travel not relayed correctly?

Updated: After watching the video in Post #104 below, from the front dashcam and the camera on the driver, it's clear she entered the road from the left median side, walking the bike, and that she was facing the car. It's also clear the safety driver was distracted, looking down for some time, so it's no wonder he didn't have time to react. Thanks @Spidy for finding that video. Glad they released it. I'd say the pedestrian will be partially at fault for jaywalking, but the safety driver was clearly not monitoring the situation, so he's at fault too, and I have to wonder why Uber's equipment didn't see her, give an alert, or brake on its own, so there's fault there as well.
 
From the video I can't see how Lidar wouldn't have been able to see her crossing. I can also say that it appears the headlights are really horrible on that Volvo. And it appears to me that the "safety driver" was using her phone and only occasionally looking up at the road. (So it is no wonder that she was reported as having said that the first she knew of anything was when she heard the accident.)
 
Headlights on cars with the new LED technology are bad. Consumer Reports tested a number of cars on their ability to effectively light a wider path, and I'm not sure any of them did a good or even acceptable job. Tesla didn't score well either.

BTW if you watch the video directly on YouTube (if unfamiliar click YouTube link at bottom of window) you can stop the video mid-frame and still see what's on the screen whereas watching it here directly doesn't allow that.
 
The video shows a complete and utter failure of Uber's self-driving system. While I can see that a person relying only on their eyes could not have avoided this accident, it is completely unacceptable for a LIDAR system. LIDAR should have detected her a block away.

If their defense will be that the cameras could not see her, or that cameras have priority over other sensors, then why even use LIDAR in the first place?

This is not a case where a pedestrian suddenly stepped off the curb, she was in the middle of the road with no obstruction around her.
 
At 0:03 in the first video you start to see her illuminated as she walks across the roadway, having already crossed four lanes with no other traffic around. I'm not sure how far ahead she is at this point (it doesn't look far), but at first you only see her feet.

I hope this accident puts pressure on making changes to vehicle headlights. Personally I'm not a fan of the LEDs; I find them hard on the eyes when coming at you, and judging from the Consumer Reports review and posts I've read from people here on the forum, they really don't provide any good illumination ahead of you, let alone around corners. Honestly, I'd rather see us go back to the old-fashioned headlights.

Car Headlight Performance Found to Be Not So Bright

Are HID and LED Headlights Worth Buying?

There was an article they did that illustrated how each car was tested on straight and curved roadway; I'm not seeing it now.
 
Why is this not a 100% human failure? There was a person in the car whose sole responsibility was to make sure the car performed correctly, just as all drivers are supposed to. If a car's cruise control causes it to drive into a pole, is that the fault of the cruise-control system, or of the person who is supposedly in control of the machine?
 
The video shows a complete and utter failure of Uber's self-driving system. While I can see that a person relying only on their eyes could not have avoided this accident, it is completely unacceptable for a LIDAR system. LIDAR should have detected her a block away.

If their defense will be that the cameras could not see her, or that cameras have priority over other sensors, then why even use LIDAR in the first place?

This is not a case where a pedestrian suddenly stepped off the curb, she was in the middle of the road with no obstruction around her.

Uber Video Shows the Kind of Crash Self-Driving Cars Are Made to Avoid

“I think the sensors on the vehicles should have seen the pedestrian well in advance,” says Steven Shladover, a UC Berkeley research engineer who has been studying automated systems for decades and watched the video. “If she had been moving erratically, it would have been difficult for the systems to predict where this person was going,” he says, but the video shows no evidence of that.
 
Bear in mind that the video compresses a lot of information that would have been available to a human driver who was paying attention. The human eye can probably see contrast, not captured in the video, between an object (the pedestrian) and the deep background in a low-lighting condition. The human eye also has a wider field of vision and is very good at catching motion in peripheral vision. Plus, the camera seems not to have adjusted well to low-light conditions (probably because of the bright lighting immediately in front of the car from the headlights). The camera makes it seem like this street, with street lights, was pitch black except for the illuminated area directly in front of the car.
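To illustrate the exposure point with a toy example (the luminance values and exposure factor here are made up purely for illustration, not measured from the footage):

```python
# Toy sketch: a camera exposed for the bright headlight pool crushes
# dim objects into nearly identical 8-bit code values, even when the
# eye could still tell them apart. Numbers are invented for illustration.

def to_8bit(luminance, exposure):
    """Map a linear luminance value to an 8-bit pixel, clipping at white."""
    return min(255, round(luminance * exposure * 255))

pedestrian = 0.002      # dimly lit figure (hypothetical relative luminance)
background = 0.004      # slightly brighter deep background
headlights = 0.5        # road surface in the headlight pool

# Exposure chosen so the headlight pool saturates:
for lum in (pedestrian, background, headlights):
    print(to_8bit(lum, exposure=2.0))   # prints 1, 2, 255
```

The pedestrian and background land on adjacent near-black code values (1 vs. 2 out of 255), which is the kind of contrast a compressed dashcam clip simply can't show.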

So: the car should have sensed the pedestrian through a number of sensors, and the safety driver really should have seen the pedestrian with human eyes. There is no excuse for not seeing the pedestrian at all and never activating the brakes (not even activating them too late to come to a complete stop in time).

Also, unless I'm wrong, Google/Waymo tests its cars with a team of two safety drivers, who trade off between sitting behind the wheel and doing data entry. Uber seems to be using just one safety driver in its tests. This is a huge problem. No one can passively sit behind the wheel of a car "monitoring" for a full shift and actually maintain sufficient uninterrupted attention; the job is just too passive/boring. Leave it to Uber (or Tesla, for that matter) to go cheap on its testing protocols rather than use the safer gold standard.
 
While I can see that a person relying only on their eyes could not have avoided this accident

Hold on.. stop it right there.

This only means that this Uber system is only as good or as bad as a human driver, and it can only get better from here, if and only if you allow them to learn from this incident and continue the testing. I guess a human driver would have hit her 9 out of 10 times.

Except now that it was a machine that was driving, we have an opportunity to learn, fix and make it better.
 
Comments on Hacker News overwhelmingly blame Uber as well: Tempe Police Release Video of Uber Accident | Hacker News

A sample:

"Why is the resolution so low?
Why is the FPS so low?
Why don't we see the IR/LIDAR footage?
Was the brightness adjusted after the fact to look favourably?
Why didn't the vehicle issue a full stop once the woman was clearly visible?
I feel PR handled."
 
Except now that it was a machine that was driving, we have an opportunity to learn, fix and make it better.

In my opinion, Uber has been grossly negligent. I think their infamous corporate culture has led to this accident.

And I hope there won't be major consequences for the whole self-driving industry.
 
Also, for folks who point out that lidar should have detected her: I am guessing the lidar did do the detection, but the algorithm ignored it as spurious noise and decided not to take any action. It is not the detection but how you categorize it and what actions you decide to take; that is all in the algorithm. And that is difficult.

If you are too aggressive, you are going to get too many false positives and random brake-checks, which are dangerous.
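As a toy sketch of that tradeoff (the labels, confidence numbers, and threshold values are entirely hypothetical; a real perception stack is vastly more complex):

```python
# Toy sketch: a planner that only brakes when a detection's confidence
# clears a threshold. Shows how the threshold trades false-positive
# brake events against missed obstacles. All numbers are hypothetical.

detections = [
    {"label": "unknown", "confidence": 0.35},     # e.g. blowing debris
    {"label": "unknown", "confidence": 0.55},     # ambiguous blob
    {"label": "pedestrian", "confidence": 0.72},  # the real hazard
]

def should_brake(dets, threshold):
    """Brake if any detection's confidence clears the threshold."""
    return any(d["confidence"] >= threshold for d in dets)

# A low threshold brakes for everything, including noise;
# a high threshold ignores even the pedestrian.
for threshold in (0.3, 0.6, 0.8):
    print(threshold, should_brake(detections, threshold))
    # prints: 0.3 True / 0.6 True / 0.8 False
```

The point is that "the lidar saw something" and "the planner decided to act on it" are separate steps, and tuning that second step too far toward suppressing noise could explain a car that never braked.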
 
Comments on Hacker News overwhelmingly blame Uber as well: Tempe Police Release Video of Uber Accident | Hacker News

A sample:

"Why is the resolution so low?
Why is the FPS so low?
Why don't we see the IR/LIDAR footage?
Was the brightness adjusted after the fact to look favourably?
Why didn't the vehicle issue a full stop once the woman was clearly visible?
I feel PR handled."

I think it's a fair bet that either (i) the video that's on the internet was taken by a dashcam that's entirely separate from the cameras used by the self-driving system, or (ii) the video did come from the self-driving camera but (probably to reduce the storage space used over a long period of testing) contains far fewer frames and lower resolution than what's available to the self-driving software from the camera.
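For a sense of why a logging camera might be throttled, some rough storage arithmetic (the bitrates below are my own illustrative assumptions, not Uber's actual recording settings):

```python
# Rough storage arithmetic for continuous video logging.
# Bitrates are illustrative assumptions.

def gb_per_hour(mbit_per_s):
    """Convert a video bitrate in Mbit/s to gigabytes per hour."""
    return mbit_per_s * 3600 / 8 / 1000   # Mbit/s -> GB/h

low_res = gb_per_hour(2)      # a low-res, low-fps logging stream
high_res = gb_per_hour(40)    # a 4K/high-fps-class stream

print(f"low-res:  {low_res:.1f} GB/hour")    # 0.9 GB/hour
print(f"high-res: {high_res:.1f} GB/hour")   # 18.0 GB/hour
```

A fleet logging many hours a day has a strong incentive to archive the cheap stream, which would explain dark, low-frame-rate footage that understates what the self-driving software could actually see.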
 