Any details on headline - Arizona pedestrian is killed by Uber self-driving car

It's true that the pedestrian violated the law against jaywalking, but jurors would feel better if she had been ticketed with a civil penalty rather than run over and killed for the violation.

On the other hand, I think the inattentive driver may be charged under Arizona statute 28-693:

"A. A person who drives a vehicle in reckless disregard for the safety of persons or property is guilty of reckless driving."

Her job was to take over control if the machine malfunctioned or tried to kill somebody.

Failure to take control when her car was bearing down on the jaywalker could, and did, result in death.

Without the interior video, she could have argued that she did her best as her job required. But now that she has been recorded on video being intentionally distracted, it's hard to mount a defense.
 
I don't know that "did her best as required by her job" is quite the right standard. Uber doesn't get to absolve its employees or itself of criminal liability by creating its own low standard of what constitutes "reckless disregard." The level of care that Uber defines as "doing her job" may not be a high enough level of care for operating a motor vehicle. Of course, here she wasn't doing what Uber wanted her to be doing, so she's clearly potentially on the hook criminally, as Uber should be too.
 
And is Uber negligent in that there is a driver-watching camera, but the system does not go into a failsafe mode if the safety driver is not being attentive? (It takes a little software, but simple eye/face-position detection would not be hard.)
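
For anyone wondering how little software it would take: here's a minimal sketch of that kind of eye/face-position check, assuming OpenCV's stock Haar cascades and a made-up two-second timeout. It's an illustration of the idea, not Uber's (or anyone's) actual driver-monitoring code.

import time
import cv2

# Stock OpenCV Haar cascades for face and eye detection
FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
EYE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")
ATTENTION_TIMEOUT_S = 2.0  # assumed: longest tolerable glance away from the road

def driver_attentive(frame):
    """Return True if a forward-facing face with both eyes visible is found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in FACE_CASCADE.detectMultiScale(gray, 1.3, 5):
        eyes = EYE_CASCADE.detectMultiScale(gray[y:y + h, x:x + w])
        if len(eyes) >= 2:  # both eyes visible suggests eyes on the road
            return True
    return False

cap = cv2.VideoCapture(0)  # the interior driver-facing camera
last_attentive = time.time()
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    if driver_attentive(frame):
        last_attentive = time.time()
    elif time.time() - last_attentive > ATTENTION_TIMEOUT_S:
        # Placeholder for the real failsafe: alert the driver, then hand
        # off to a controlled stop if they don't look back up.
        print("Driver inattentive: alerting and preparing failsafe stop")

A production system would use proper gaze estimation rather than crude eye detection, but the point stands: the building blocks are off the shelf.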
 
Here's the news article generated from the investigative braking/visibility test videos in her tweet: Investigators recreate fatal crash involving self-driving Uber car

At least that's good news for Volvo: the tests show that without Uber's self-driving system, a driver behind the wheel could have seen the bike in the lane and stopped before hitting it.

I'm shocked! Shocked that the same Uber whose self-driving cars in Pittsburgh allegedly committed such offenses as driving the wrong way on one-way streets would have a car that failed to detect a pedestrian.

I'm shocked! Shocked that the same Uber that deliberately violates taxi laws left and right, violates laws governing the difference between contractors and employees left and right, stole tech from Google/Waymo, and illegally operated self-driving cars in California without a testing permit would use a low-quality dashcam to make it appear that the road was darker than it actually was.

But really, I'm just shocked that it took this long.

Uber is seriously rushing their self-driving tech to market long before it is actually ready.
 
The bias against jaywalkers may be explained by the following clip: cars that kill have successfully blamed their victims:
I've seen this phenomenon first-hand in the case of a pedestrian hit by a city bus downtown. She was in the crosswalk, but crossing at the very tail end of the yellow light. The driver, making a left from a one-way onto a one-way, was probably not paying close attention while also trying to beat the light, and killed the pedestrian in the lane farthest from the turn. Everybody jumped to the defense of the poor bus driver, who was pretty shaken up. But they also cited law saying the pedestrian was at fault. I guess the person who is no longer here to defend herself gets slandered and doesn't get sympathy. Or it could be politics at play. Regardless, I was appalled.
 
I'm glad to see Elaine's sister has hired an attorney. He should be present to see all of the testing. I'm not saying that she wasn't in the wrong having crossed where she did, but both pedestrians and drivers have a responsibility on the roads. It serves as a reminder that life could unexpectedly be gone in seconds. While the pedestrian died and the safety driver lived, I still wouldn't want to be in her situation right now.

It’s nice that the family for homeless show up to help in such a time of need.

She did look up occasionally. Would it have been enough to keep Super Cruise happy?
 
That's roughly what's keeping me from immediately jumping all over the attendant here. I don't think "driver" is even the proper word for her.

I still have a problem with what she appeared to be doing. I'm not sure why, if the Uber program was having as many troubles as it was, she wasn't more attentive.

This could run deeper than the attendant, to something systemic. Perhaps she'd "learned" that there weren't any problems until other vehicles came within a certain range, and so assessed that this was a "low alert" area.

It's even possible that the Uber program management was pushing to get their "intervention" numbers down. That could create a situation that encouraged attendants to stay further away from the wheel so they didn't rack up unnecessary interventions by taking over when it wasn't truly needed. That sort of thing can be subtle: excessive dressing-down of vigilant attendants, bonuses for attendants with low intervention counts, or outright firing of people with higher counts. Word gets around, and soon you've got nothing but attendants who pay poor attention and are air drumming as the car blows through seconds-old red lights.

Speculation, but I'm hoping the NTSB does a very thorough inspection of the whole program, looking for root causes, so this stuff can get addressed.
 
She was walking too slowly, and she was already in the Uber's lane. It's doubtful that she could have gotten out of the Uber's lane at her tortoise speed while the Uber was charging at her at 40 MPH.

Remember, there were 5 lanes; she had almost finished the last lane and might have escaped death had she been young, nimble, and as fast as a Wonder Woman comic character.

I think the only appropriate accusation for this lady is: She did not use a crosswalk.

In a battle between an SUV charging ahead at 40 MPH and a pedestrian at a tortoise speed of maybe less than 1 MPH, the tortoise loses unless it transforms into a very fast jumping rabbit.

Most likely, when she started walking, the street was empty: at 40 MPH, the Uber was most likely still back at the bridge, and she could not see it while it was hidden by the curve at the bridge.
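
Some back-of-the-envelope numbers support this; every figure below is an assumption for illustration, not a measurement from the investigation:

# Rough check: how far away was a 40 MPH car when she stepped off the curb?
LANE_WIDTH_FT = 12                  # assumed typical lane width
LANES = 5
WALK_SPEED_FTS = 3.5                # assumed slow walking pace (~2.4 MPH)
CAR_SPEED_FTS = 40 * 5280 / 3600    # 40 MPH in feet per second (~58.7 ft/s)

crossing_time_s = LANES * LANE_WIDTH_FT / WALK_SPEED_FTS
car_distance_ft = CAR_SPEED_FTS * crossing_time_s

print(f"Crossing {LANES} lanes takes about {crossing_time_s:.0f} s")
print(f"In that time a 40 MPH car covers about {car_distance_ft:.0f} ft")
# Roughly 17 s and 1,000 ft: the Uber could easily have been beyond the
# curve at the bridge, out of sight, when she started crossing.

A thousand feet of approach distance is entirely consistent with her having looked and seen an empty road.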

I think she did her duty to yield to all cars, because there were no cars when she started the crossing:

But slapping her with other accusatory language is inappropriate.

The Sheriff said the preliminary investigation put the pedestrian at fault.

Just saying the Sheriff might not be crooked, no more, no less.

There is no death penalty for failure to yield, even in Arizona. Not sure about Florida, though.

But there was good visibility in that area, and headlights are easy to see at night from the front. The pedestrian either did not look while crossing, or misjudged the car's speed and distance. It MAY have been legal to cross there. Just because there is no crosswalk does not mean crossing is 'jaywalking'; it's based on distance from an intersection. If you are far from an intersection, you may cross, but you must yield. It's the law.
 
I think the police statement at the beginning was based entirely on the dark dashcam view, so I'm willing to cut some slack, but they probably should not have speculated on fault yet and instead stayed neutral. Traffic cops in that area probably have a good idea what the lighting conditions were, and maybe this person didn't. That said, I think Tempe being a host city for Uber's self-driving program might play a bit into showing them some initial deference here. Everyone knows that once all the data and video come out, things could change, so nothing was going to be set in stone in any event. But the perception of a homeless lady vs. an ex-felon and a large tech company does make for news, which I think has, for the most part, been fairly well balanced while laying it all bare.

I think Tam's point ( Any details on headline - Arizona pedestrian is killed by Uber self-driving car ) about the speed the Uber was traveling, and that it was likely back by the bridge when she first started across the 5 lanes of roadway, makes a lot of sense, and she very likely did look for oncoming traffic. If she did see the headlights, as someone mentioned earlier in this thread, two lights off in the distance can be hard to judge for speed and distance. If she could see well enough in front of her and the area she was traversing, she probably assumed any driver could see her too. Another unknown is which lane a vehicle approaching from down the road near the bridge would end up in. Had the vehicle been planning to make a left turn at E Curry towards the Marquee Theatre, then the vehicle and her crossing position at that point wouldn't even have been an issue.
 

Uber has had issues with inattentive drivers, and in the past they did fire at least two safety drivers for it. One of those was the air drummer.
But in both cases it was because someone else saw them. They didn't seem to be reviewing the logs to verify that the safety drivers were actually paying attention. So it seems it was a systemic problem in how Uber operated its program.
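
To illustrate what "reviewing the logs" could amount to, here's a hypothetical sketch that flags drives with long gaps in driver attention. The log format, column names, and five-second threshold are all invented for illustration:

import csv

MAX_GAP_S = 5.0  # assumed: longest tolerable stretch of inattention

def attention_gaps(rows):
    """Yield (drive_id, gap_seconds) for each contiguous inattentive stretch."""
    gap_start = {}  # drive_id -> timestamp when inattention began
    for row in rows:  # expected columns: drive_id, timestamp, attentive
        drive, t = row["drive_id"], float(row["timestamp"])
        if row["attentive"] == "1":
            if drive in gap_start:
                yield drive, t - gap_start.pop(drive)
        else:
            gap_start.setdefault(drive, t)

with open("attention_log.csv", newline="") as f:
    for drive, gap in attention_gaps(csv.DictReader(f)):
        if gap > MAX_GAP_S:
            print(f"Drive {drive}: driver looked away for {gap:.1f} s")

If Uber was already running an interior camera, a nightly scan like this would have surfaced chronic phone use long before anyone happened to witness it in person.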

The industry as a whole is well aware that a human safety driver is problematic, so there are no excuses for not implementing checks. It's a weak point that's accepted because I don't believe there is any alternative. You can't rely 100% on simulation, because it doesn't recreate real-life situations. This fatal accident is exactly the kind of thing that likely wouldn't have happened in a simulation, mostly because of how the bike and person appeared together. The way it looked is likely something the vision neural net was never trained on.

Doing a shadow-mode thing in real life does work for some situations (like this one), but not all. It also can't recreate situations that only happen when a human driver has to deal with a machine that drives very differently from most humans. At some point along the way you need to put the autonomous car on the road with a safety driver behind the wheel.

The culture of loose regulation is likely creating a situation where autonomous cars are being put on the road well before they're ready for it, with too much reliance on the safety driver. While simulation can't recreate every real-life situation, it can create enough of them to evaluate whether a self-driving car company should be licensed to operate on the road.

I do think Arizona should take some of the blame for this accident occurring because of how easily they grant a license to operate.

As far as I know, they don't require releasing data like intervention counts. But in some ways that requirement in California has caused an industry-wide obsession with the metric.

I share your concern that using intervention counts as a measure of quality is harming safety.
In this case I don't think the safety driver had the qualifications to be a safety driver. To me it seems Uber was hiring generic drivers because they only cared about miles, and they wanted to log them as cheaply as possible. This driver didn't care about driving, or about what the car was doing. Her attitude seemed to be "Hey, I'm being paid just to sit here on my phone."

The road is supposed to be for driving from point A to point B in a safe and prudent manner.

Now, sure, that's a bit laughable, as it's hardly ever the case in real life. Everyone seems to have a different agenda.

But autonomous cars aren't supposed to add to the problem. They're supposed to remove the messy human component from the road for those of us who just want to get from point A to point B without needing to express ourselves on the road.

With Waymo/Cruise, the safety drivers aren't taking over even when they know the car is creating an unexpected situation on the road. A lot of the "no-fault" accidents happened because the cars were acting in an unexpected manner.

I think we have to face the fact that the lack of regulatory oversight has created a messy situation.

We have Tesla selling an FSD option on a car with no visible pathway to delivering it, where we're totally blind as to how it can be achieved.

With L2 cars, we have people overly confident in systems for which there is no REAL measurement of what situations they can handle. I have no idea, for example, whether my AP1 car has a better AEB system than a Subaru EyeSight system. The only thing we really have is user stories, which are all over the place.

This Uber vehicle FAILED what should be a basic test for an L2 car. It failed so badly that I imagine Volvo's stock City Safety feature would have performed better, had it been enabled on the car.
 
I have no idea, for example, whether my AP1 car has a better AEB system than a Subaru EyeSight system.
This is a very good point. How many of the AEB systems from various manufacturers could have seen the pedestrian in this situation? How many could have seen the same pedestrian in broad daylight? Are AEB capabilities any better on vehicles with limited auto-steering (Tesla AP, ProPILOT, Super Cruise, Traffic Jam Assist, etc.)?
 
...Does Uber’s car use Volvo’s technology or their own?...

Here's your official answer:

Uber Technologies Inc. disabled the standard collision-avoidance technology in the Volvo SUV that struck and killed a woman in Arizona last week, according to the auto-parts maker that supplied the vehicle's radar and camera.

"We don't want people to be confused or think it was a failure of the technology that we supply for Volvo, because that's not the case," Zach Peterson, a spokesman for Aptiv Plc, said by phone. The Volvo XC90's standard advanced driver-assistance system "has nothing to do" with the Uber test vehicle's autonomous driving system, he said.

Aptiv is speaking up for its technology to avoid being tainted by the fatality involving Uber, which may have been following standard practice by disabling other tech as it develops and tests its own autonomous driving system. Experts who saw video of the Uber crash pointed to apparent failures in Uber's sensor system, which failed to stop or slow the car as 49-year-old Elaine Herzberg crossed a street pushing a bicycle.
 
The jig is up

Arizona Governor Suspends Uber’s Self-Driving Cars From Roads

“Improving public safety has always been the emphasis of Arizona’s approach to autonomous vehicle testing, and my expectation is that public safety is also the top priority for all who operate this technology in the state of Arizona,” Mr. Ducey said in his letter. “The incident that took place on March 18 is an unquestionable failure to comply with this expectation.”
 
Oh Buttershrimp hates being right when it comes to this subject.... honestly.

Now, who wants a new and improved video? No one? Nobody? Screw you guys, I'm posting anyway.

 
Intel used fatal Uber crash footage to show what its self-driving software would do

Experience Counts, Particularly in Safety-Critical Areas | Intel Newsroom

"To demonstrate the power and sophistication of today’s ADAS technology, we ran our software on a video feed coming from a TV monitor running the police video of the incident. Despite the suboptimal conditions, where much of the high dynamic range data that would be present in the actual scene was likely lost, clear detection was achieved approximately one second before impact. "

(Image: Mobileye detection frames from the police video)


So essentially Mobileye is saying they could have detected her using just the camera feed, without any LIDAR. And just to drive the point home, they pointed the camera at a TV monitor playing the video :)
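
For scale, here's what detection one second before impact could buy at 40 MPH. The friction coefficient and instant-braking assumption are mine, purely for illustration:

# Rough check of what "detection one second before impact" means at 40 MPH.
G = 32.2                         # gravity, ft/s^2
MU = 0.7                         # assumed dry-pavement braking friction
speed_fts = 40 * 5280 / 3600     # 40 MPH, about 58.7 ft/s

detection_distance_ft = speed_fts * 1.0               # 1 s from impact
stopping_distance_ft = speed_fts**2 / (2 * MU * G)    # full stop from 40 MPH

# Speed remaining at impact if maximum braking starts at detection:
v_sq = max(speed_fts**2 - 2 * MU * G * detection_distance_ft, 0)
impact_speed_mph = v_sq**0.5 * 3600 / 5280

print(f"Detected at about {detection_distance_ft:.0f} ft out")
print(f"Full stop needs about {stopping_distance_ft:.0f} ft")
print(f"Braking instantly still hits at about {impact_speed_mph:.0f} MPH")

So even one second of warning isn't enough for a full stop at 40 MPH, but roughly halving the impact speed could easily be the difference between a fatality and an injury. And that's from a degraded TV re-capture of the dashcam video; the car's LIDAR should have had far more warning than that.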
 
Maybe Ducey was listening last year after all, and is finally coming around, even if this is the reactive version of the approach Musk tried to warn about.

 