Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Any details on headline - Arizona pedestrian is killed by Uber self-driving car

Seems like the logic in the software needs to be improved. According to the report "the system first registered radar and LIDAR observations of the pedestrian about 6 seconds before impact" and "At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision". Doesn't seem like it's a sensor issue but rather programming logic.
 
It's not even logic, it's a setting.

According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.
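Reading that paragraph, the whole failure chain boils down to a couple of flags. A hypothetical sketch (all names invented, not Uber's actual code) of how a policy gate like the one described would suppress both the brake command and any operator alert:

```python
# Hypothetical sketch of the policy described in the report: the system
# computes that emergency braking is needed, but the action is gated off
# under computer control and no operator alert is raised.
# All names and structure are invented for illustration.
from dataclasses import dataclass

@dataclass
class Policy:
    emergency_braking_enabled: bool = False  # per the report: disabled
    alert_operator: bool = False             # system not designed to alert

def decide(braking_needed: bool, policy: Policy) -> dict:
    """Return the actions the system would actually take."""
    return {
        "brake": braking_needed and policy.emergency_braking_enabled,
        "alert": braking_needed and policy.alert_operator,
    }

# With the reported settings, a needed emergency stop produces no action:
print(decide(True, Policy()))  # {'brake': False, 'alert': False}
```

With both flags defaulted off, the perception side can be working perfectly and the car still does nothing, which is exactly the "it's a setting" point.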
 
I think they have their own software that controls the vehicle, so I can see why they wouldn't want it conflicting with the original manufacturer's controls.

My understanding: the Volvo stuff is disabled regardless. The Uber software saw the pedestrian and knew it needed to stop, but the Uber SW is not allowed to emergency brake, nor does it alert the driver to do so.
 
That's the way I read it as well. Uber not only overrode Volvo's AEB and other safety features, but also overrode the ability of its own self-driving system to execute an emergency stop. And the system wasn't designed to give the driver any notice when it believed an emergency stop was necessary.
 
Not surprising the Volvo systems were disabled. But the Uber software is sorely lacking:
1. Not identifying the object, first registered 6 seconds out, as an actual threat until 1.3 seconds before impact.
2. Not initiating emergency braking for fear of false positives (just like so many radar systems that don’t brake for stationary things like... fire trucks).
3. Not alerting the (dubiously named) “safety driver”.

Fail.

#2 though, false positives, is a big issue for many of these sensor systems.
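The false-positive trade-off can be shown with a toy model: a system that only brakes above a classification-confidence threshold. Raising the threshold avoids phantom braking, but it also delays action on real threats. All numbers below are invented for illustration:

```python
# Toy illustration of the false-positive trade-off: brake only when
# classification confidence clears a threshold. Tuning the threshold up
# (to avoid phantom stops) directly delays reaction to real obstacles.
def should_brake(confidence: float, threshold: float) -> bool:
    return confidence >= threshold

# Confidence that the detection is a real obstacle, keyed by seconds to
# impact (invented values; confidence grows as the car closes in):
confidence = {6.0: 0.3, 4.0: 0.5, 2.0: 0.7, 1.3: 0.9}

def first_braking_time(threshold: float) -> float:
    """Earliest moment (most seconds before impact) the system would brake."""
    return max(t for t, c in confidence.items() if should_brake(c, threshold))

print(first_braking_time(0.4))   # cautious tuning: brakes 4.0 s out
print(first_braking_time(0.85))  # false-positive-averse tuning: 1.3 s out
```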
 
Not surprising the Volvo systems were disabled. But the Uber software is sorely lacking:
1. Not identifying the object, first registered 6 seconds out, as an actual threat until 1.3 seconds before impact.
The system identified it early, but it was not a critical issue (in the direct path of the car) until 1.6 seconds before impact; until that point, the software determined the car could have been steered in a safe direction by the driver.
 
This wasn't a driver assist system, but an autonomous system. It never assumes that the operator will take charge. Prior to 1.6 seconds, it didn't think that the pedestrian and her bicycle were an actual object that would enter the car's path (as opposed to an artifact or a forward moving/stopped object in another lane), so it didn't react (slow or change lanes). Then the car determined it was an emergency stop situation, but didn't execute an emergency stop because it was software limited not to do those.
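To make the "would it enter the car's path" distinction concrete, here's a toy sketch (invented numbers and names, not the actual perception stack) of constant-velocity path prediction: detection alone isn't the question, it's whether the extrapolated lateral motion crosses into the lane within the prediction horizon.

```python
# Toy constant-velocity path prediction: an object is only a threat if its
# extrapolated lateral position enters the car's lane within the horizon.
# All positions, speeds, and widths are invented for illustration.
def predicts_path_entry(lateral_pos: float, lateral_vel: float,
                        lane_half_width: float, horizon_s: float) -> bool:
    """Will the object be inside the lane (|lateral| < lane_half_width)
    at any sampled time within the horizon, assuming constant velocity?"""
    for dt in (0.5, 1.0, 1.5, 2.0):
        if dt > horizon_s:
            break
        if abs(lateral_pos + lateral_vel * dt) < lane_half_width:
            return True
    return False

# A pedestrian 4 m left of lane center, walking toward it at 1.4 m/s, is
# invisible to a short horizon but flagged by a longer one:
print(predicts_path_entry(-4.0, 1.4, 1.8, 1.0))  # False: not yet in path
print(predicts_path_entry(-4.0, 1.4, 1.8, 2.0))  # True: enters lane within 2 s
```

The point of the sketch: with a horizon (or motion estimate) that's too short, a crossing pedestrian stays "not in our path" right up until she is.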
 
Good points. It did not do proper collision prediction. It was also lacking in that it did not apply non-emergency deceleration prior to impact. I wonder if changing the type of object it thought it was reset the path tracing...
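That reset idea can be illustrated with a toy tracker: if each reclassification starts a fresh track, the velocity history needed for path prediction is thrown away every time the label flips. This is purely hypothetical behavior, sketched for illustration:

```python
# Toy model of the "path tracing reset" concern: a tracker that restarts
# whenever the object's class changes never accumulates enough position
# history to estimate velocity. Entirely hypothetical, not Uber's code.
class Track:
    def __init__(self):
        self.positions = []

    def update(self, pos: float) -> None:
        self.positions.append(pos)

    def velocity(self):
        if len(self.positions) < 2:
            return None  # not enough history to estimate motion
        return self.positions[-1] - self.positions[-2]

track, last_class = Track(), None
# Invented sequence of class flips with steady lateral motion:
for cls, pos in [("vehicle", -4.0), ("vehicle", -3.3),
                 ("bicycle", -2.6), ("other", -1.9)]:
    if cls != last_class:
        track, last_class = Track(), cls  # reclassification discards history
    track.update(pos)
    print(cls, track.velocity())
```

Despite the object moving steadily toward the lane, every label change zeroes out the motion estimate, so the final track has no velocity at all.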
 
It would seem they are not making good use of their “safety driver”. Rather than turning off emergency braking (or steering) and not notifying the hapless occupant, why not at least raise an alarm? It may not have been enough time to divert disaster but at least a human could make real time assessments about whether an object is a real threat or a false positive? What’s the harm in annoying a paid occupant with a few false positives?
 
I would have absolutely had a vibrating seat and warning chime at a minimum for doing testing. Most companies use a driver and an engineer when doing AV testing.

They took one too many cost-cutting shortcuts, and ended up putting KILLER AV into the headlines.
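The "at least raise an alarm" idea is cheap to sketch: even with automatic braking disabled, a time-to-collision estimate could drive escalating driver alerts. Thresholds and names below are made up:

```python
# Hypothetical escalating-alert scheme for a safety driver: chime on any
# tracked threat, add seat vibration when a hand-off is urgent. Thresholds
# are invented for illustration.
def alerts_for(time_to_collision_s: float) -> list:
    actions = []
    if time_to_collision_s <= 6.0:
        actions.append("chime")            # early heads-up on a tracked threat
    if time_to_collision_s <= 2.0:
        actions.append("seat_vibration")   # urgent: human must act now
    return actions

print(alerts_for(5.0))  # ['chime']
print(alerts_for(1.3))  # ['chime', 'seat_vibration']
```

Even a false-positive-heavy version of this costs nothing but an annoyed paid occupant, which is the poster's point.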
 
Self-Driving Uber Crash 'Avoidable,' Driver's Phone Playing Video Before Woman Struck

The police investigation found that the driver was looking down at her phone to watch the talent show "The Voice" on Hulu.

It stated that 85 percent of motorists could have spotted the victim 143 feet down the road in those conditions.

It cited Uber's confirmation that "she had been trained on the capabilities and limitations of the vehicle. Her job was to look at the road and prepare for any emergencies."
 
So, an example of SDIC. Self-driving induced complacency.

People seem to be jumping the gun on the whole Eyes Off, Mind Off* thing.

Problem for many/all is that this was never a part of their Driver Ed.

* a MobilEye forward-looking slogan.
 
I wonder if Uber is a good place to work as it blamed its employee because "she had been trained on the capabilities and limitations of the vehicle."

Did it disclose to its drivers that its sensors can detect obstacles just fine, but that its software is another story, having been purposely programmed to ignore those obstacles, so nothing would interrupt her cell phone activities before it killed a victim?
 
So Uber in fact had a Level 2 system, but pretended it was Level 4. And told their "safety drivers" what?

No, it was Level 1. It had no ability to disengage and hand off control to a driver, sure, but it also wasn't aware enough to operate as L4. It was just a fancy L1 (ACC).

No AEB is really pathetic. No wonder Kalanick was all over Elon when AP2 was announced.
 
It's not a good look for anyone, but Uber is desperate. They already know their FSD program is kaput, and they settled with this poor woman's estate (I believe), but in the court of public opinion they are only lucky that most of the public has no idea about, or doesn't care about, these kinds of issues.