Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Any details on headline - Arizona pedestrian is killed by Uber self-driving car

I don't get it. You have cameras, you have radar, and you have a lidar system. You have so much redundancy. The code must just be terrible; that's the only explanation I can think of, which would not surprise me one bit for a company that is essentially an online taxi company. They aren't Google, Tesla, Microsoft, or Apple.
 
This was published on the accident in the San Francisco Chronicle:

Exclusive: Tempe police chief says early probe shows no fault by Uber

Something doesn't add up here. If she stepped out from the center median directly into the path of the Uber car, without time for it or its driver to react (as the police chief says), how did the damage end up on the right front corner of the Uber car? How could she possibly walk (while pushing a bike, no less!) fast enough to get most of the way across the lane and still not give a camera- and radar/lidar-equipped vehicle time to react? Was she the Flash's sister?

If the police chief misspoke and meant to say that she stepped off the sidewalk or out of the bike lane, then that raises the question of the car's programming and the "safety officer's" (lack of) action: a safe human driver pays extra attention to a child or anyone right alongside the road in case they unexpectedly come into it, for example by changing lanes and/or slowing down to give the pedestrian more room.
 
...The code must just be terrible; that's the only explanation I can think of, which would not surprise me one bit for a company that is essentially an online taxi company. They aren't Google, Tesla, Microsoft, or Apple.

It has lots of hardware but someone has to write the software to take advantage of hardware.

How do we know all the hardware was activated and used?

How do we know Uber's software is as good as Waymo's?

The government has no idea and the public has no idea. We have to depend on disgruntled employees leaking information in order to know.

Thanks @SMAlset for the link.

The police chief said it happened so fast (the pedestrian walked from the median, in shadows, into a traffic lane) that there was no time to react, the collision was most likely unavoidable, and the driver/Uber was most likely not at fault.

The police might be correct about a human driver's reaction time, but Waymo's software has predictive algorithms to detect this kind of scenario in complete darkness and even around a blind curve!
 
...If she stepped out from the center median directly into the path of the Uber car, without time for it or its driver to react...

I understand from reading and from pictures that the pedestrian was walking from the left toward the front of the car and was hit at its front right. Her direction was from the center median toward the sidewalk, and she had almost cleared the front right of the car; she might have escaped death had she been walking a little bit faster.

I think there was plenty of time for an attentive human driver to avoid hitting her, but I'll have to see the video to know.

I do encounter jaywalkers from time to time, and with one unlucky second of inattentiveness I could have collided with them quite a few times!
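For what it's worth, the timing question can be sketched with a back-of-envelope calculation. All the numbers below are my own illustrative assumptions (posted 35 mph limit, typical perception-reaction time, typical lane width and walking pace), not figures from the accident report:

```python
# Back-of-envelope check (illustrative assumptions only, not accident data):
# how much road does a car need to stop, and how far does it travel while a
# pedestrian crosses one lane?

MPH_TO_MS = 0.44704

speed_ms = 35 * MPH_TO_MS          # posted limit in the area, ~15.6 m/s
reaction_s = 1.5                   # typical driver perception-reaction time
decel_ms2 = 7.0                    # hard braking on dry pavement, m/s^2

reaction_dist = speed_ms * reaction_s          # distance covered before braking
braking_dist = speed_ms**2 / (2 * decel_ms2)   # kinematics: v^2 / (2a)
stopping_dist = reaction_dist + braking_dist   # ~41 m total

lane_width_m = 3.5                 # typical US lane width
walk_speed_ms = 1.4                # average walking pace
crossing_time_s = lane_width_m / walk_speed_ms
car_travel_m = speed_ms * crossing_time_s      # ground covered during the crossing

print(f"Stopping distance at 35 mph: {stopping_dist:.0f} m")
print(f"Pedestrian needs {crossing_time_s:.1f} s to cross one lane; "
      f"the car covers {car_travel_m:.0f} m in that time")
```

With those assumptions the pedestrian is in the lane for roughly 2.5 seconds while the car covers about 39 m, so whether she was avoidable hinges almost entirely on how early she became visible, which is exactly what the video would show.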
 
I got curious about what the area looked like where the accident happened, so I checked out Google Maps and matched the area to the photos and video from ABC15. The Uber is stopped on N Mill Avenue right before the blue signage by the park area with the multiple palm trees. In the news video, her belongings were shown behind where the vehicle stopped (and later apparently moved off to the sidewalk in front of the Uber). There is a bike lane on the right of N Mill Avenue that runs down by the river (you can rotate the view in Google Maps to see this) and heads north towards the intersection of E Curry and N Mill Ave (there's a crosswalk there). Where the bike lane begins to approach E Curry, it opens up into a right-turn lane for cars, and the bike lane then continues to the left of the turn lane at the light (so they switch positions).

Without video from the Uber to view at this point, and looking at the street view of the roadway, I could see a possible scenario of her riding her bike in the bike lane and then starting to move left where the bike lane path veers to the left a bit. The Uber could have been traveling towards E Curry in the right vehicle lane; if it was going to make a right at the light, it would have had to cross over the dashed lane markings to move to the far right lane, thereby crossing through the bike's path. That could explain a damaged front bike tire as it turned left toward the new bike lane position, and explain the vehicle's damaged right side as they crisscrossed paths.

I too had problems with the news reports that said she was walking her bike across N Mill Ave "from the center median" to where she was struck. If that was so, and she was walking alongside her bike pushing it along (I would think by holding the handlebars), wouldn't her whole bike have suffered a lot of damage, or at least the rear wheel rather than the front wheel, if she was struck by the right front of the Uber? Maybe she didn't come from the center median as reported but from the park/palm tree/signage side of the road and was walking towards the center median? A news report attributed to the Tempe police chief said there is no way the Uber "driver" could have seen her in time to stop the car, so I don't know. Right there by the sign, on both sides of the road, are street lights, so maybe she was trying to cross there. Even if it wasn't well lit, shouldn't the Uber system have been able to see, in the dim light, a person moving a bike in front of the vehicle? One news story said that the police chief had seen the video. I have no idea how Uber's cameras/video work, but we know Tesla's information takes a while to retrieve on their end.

Anyway, usually when you see the street layout you can see how things could have happened, but I'm just not seeing it in this case as it was described in the media. I'll be interested to read the NTSB report on how this happened, but I suspect that might not be out for a while.
 
As terrible as it is for all involved, let's not jump to conclusions before knowing all the facts. Yes, this may represent a failure of the self driving system. However, it may also have been an unavoidable circumstance that a human driver would not have been able to avoid either. We just don't know. (Unless I have missed something.) Thoughts and prayers go out to the deceased's family and friends. It would be an even bigger tragedy if this resulted somehow in the end of this research and development.

Dan
 
However, it may also have been an unavoidable circumstance that a human driver would not have been able to avoid either. We just don't know.

Exactly. At this time, as long as it can function like an average human driver, we should allow the testing to continue. What part of "testing" do we not understand? If it were already better than humans, we would all be commuting in driverless cars now.

It's a bad day for sloppy work!

Yes, and they should have dismantled sloppy NASA when Apollo astronauts died on the launch pad.
 
Certainly tragic.

It does appear that a number of sources indicate this was a difficult situation to detect and react to:

- Night time

- Person in shadows

- Not at a cross walk

- Bags/items draped on the bike making identification more difficult

- Sudden action on the part of the pedestrian

As a result, that SF Chronicle article states: "From viewing the videos, 'it's very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway,' Moir said. The police have not released the videos."


The significant human driver factor is: Was the engineer/monitor onboard negligent, or did he also have difficulty reacting to the situation as a result of the above-mentioned factors?

If the latter, then that has a significant impact on the second question: did the car do significantly worse than at least the average human would have done in the same situation?

Recall, as Elon has said, the bar is not the elimination of accidents; it's getting to the point where autonomy is statistically better than the average driver and thus saves lives. The more the better, but there will always be corner cases. And, as is always the case, we should learn from them, be it road design, urban lighting layout, vehicle safety systems, or autonomous driving.
 
Yes they should have dismantled sloppy NASA when Apollo astronauts died on the launch pad.

All Apollo astronauts were superbly informed professionals who knew that they were taking part in a high-risk enterprise, and they were all volunteers. I don't believe you can claim quite the same for the general public when it comes to the testing of self-driving vehicles on public roads.

I don't know about the legal situation in the US, so I don't want to comment on that. However, in Germany, where I live, there would be basically no chance for a driver to claim in court that he had "no fault" in the collision, as is surprisingly claimed by the Tempe police chief. For "no fault" the collision would have to be considered "unavoidable" for a careful and law-abiding driver, so the requirements are extremely high. Hitting a cyclist who suddenly emerges from a hidden footpath and enters the street at 30 mph would most likely be "unavoidable". Hitting a middle-aged woman who is pushing her bike along or across the road within city limits? That is basically never "unavoidable". I do not even see anything that might be called extenuating circumstances.
What about these strange claims that the woman was obscured by shadows? A brand-new vehicle with HID or LED lamps operating in a 35 mph zone: how can any pedestrian pushing a bicycle to the right of the vehicle be obscured by shadows?

Sorry, from what I can see this collision simply isn't acceptable. Apparently "self-driving cars" are the current or next big "thing" for global investors, and so companies are rushing systems to market, some probably more unscrupulously than others.
 
We have kids (and a few adults) crossing at the median all the time here.
I always give them room or burn off speed. Why? Experience. Some pedestrians will cross when it's dangerous.

Will Uber tech learn this?
 
I live in an area with a ton of bicycle traffic. Most are recreational riders. The vast majority including myself (I ride too) obey the laws and ride single file and don't cause any trouble.

But there are so many other situations that are just scary. The pelotons, for one. More common are the social riders who group together and chat. It's actually illegal to ride like that in places without a bike lane.

There is one road where there is not enough room for a car and a bicycle side by side, so please don't socialize like this there. It is super hard and dangerous to pass, and you need to pay attention. Then add in many blind turns and hills.

I think it will be years before I would feel comfortable with autonomous vehicles on this road. I'm not all that comfortable with most regular drivers and cyclists on this road either. It is never patrolled by law enforcement, as the road crosses many jurisdictions. And it is within 10 miles of Tesla headquarters.

I think that's a bit extreme. The car had a human driver supervising, so it wasn't just a software failure.

What about the person in the car? Shouldn't he have done something? I feel awful for him too.

Not to make light of or minimize her death, but I have to say that anyone not using a crosswalk risks getting hit by passing traffic, driverless or not. I wonder if use of a cell phone on the part of the pedestrian was involved.

I don't think this was true at all in this case. I don't think you could be pushing a bike with a bunch of stuff on it while looking at your cell phone. It sounds like the person killed may have been without a home, and likely without a cell phone.
 
There will always be fatalities involving autonomous vehicles, just as there are with human-operated vehicles, and there will always be fatalities that are at least partially the fault of the autonomous vehicle. There is no perfect driving and no perfect programming. The standard cannot be perfection, because that is impossible. As Musk has mentioned, and I agree, we need autonomous vehicles to be safer than humans by some measurable degree (100%? 200%? whatever the number); they must be better than us, or it is unjustifiable to use them. There is a cost-benefit analysis that has to drive the solution to this problem. Yes, it will ultimately be cheaper to have cars drive themselves than to pay humans, but that cost must also take into account the damage caused by the autonomous vehicle (A.V.), not just the cost to develop, build, and operate the A.V. And since humans fear machines performing traditional human jobs, and because the tech should make it possible, the machines must do a substantially better job than us, or they aren't ready.

The problem is that there are no uniform standards, barely any standards at all. In the absence of Federal standards, we are basically at the mercy of carmakers and their ethical standards as to what is safe enough to be on the roads. Sure, there is the looming specter of lawsuits for harms caused, but without baseline standards, lawsuits at best will be a patchwork of one-off resolutions of problems.
 
There are about 40,000 traffic fatalities in the US every year. 90% are caused by human error.

This one will be studied intensively, and the results used to make things safer in the future.

If the lady had been struck by a normal taxi, the scrutiny would be far less intense.
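The arithmetic behind the "better than human" bar is simple. Using the figures above (40,000 annual fatalities, 90% attributed to human error) plus a purely hypothetical improvement factor of 50% (chosen only to illustrate, not a claim about any current system):

```python
# Rough illustration of the "measurably better than humans" argument.
# annual_fatalities and human_error_share come from the post above;
# av_improvement is a hypothetical number chosen for illustration.

annual_fatalities = 40_000          # approximate US traffic deaths per year
human_error_share = 0.90            # share attributed to human error

human_error_deaths = annual_fatalities * human_error_share  # 36,000

# Hypothetical: suppose autonomous vehicles eliminated half of
# human-error crashes.
av_improvement = 0.50
lives_saved = human_error_deaths * av_improvement           # 18,000

print(f"Human-error deaths per year: {human_error_deaths:.0f}")
print(f"Lives saved at a {av_improvement:.0%} reduction: {lives_saved:.0f}")
```

Even a modest statistical edge compounds into thousands of lives per year, which is why the bar people argue for is "measurably better than humans," not "perfect."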
 
All Apollo astronauts were superbly informed professionals who knew that they were taking part in a high-risk enterprise, and they were all volunteers. I don't believe you can claim quite the same for the general public when it comes to the testing of self-driving vehicles on public roads.

The first time an Amtrak train killed someone in an accident, or the first time American Airlines, Delta, or United had an accident and killed a planeload of people, they should have shut them down.

That is how you improve safety. After all, if there had been no Amtrak trains and no airlines, there would be a 0% probability of getting killed by one of them. 100% safety achieved.
 
I'd have put thermal imaging on the Uber cars, or the same problem will occur again and you'll see your first deer-strike death soon.
It's off-the-shelf. Our 2016 car has it, as well as automatic pedestrian braking, front and rear.

But Uber is in a race to win the AV contest, even when it means doing unethical things. Didn't they buy a bunch of trade secrets from Google for AV? They had to know that was a slimy shortcut.
 
I've yet to read of someone dying from a fully autonomous self-driving vehicle that didn't have a human behind the wheel. Am I wrong? I see no reason to stop live testing, though Uber specifically needs to be schooled a bit. They likely tried to get away with testing on the cheap. In the future, Uber might consider not putting convicted felons behind the wheel; perhaps a better hire would be someone with a certificate in driver's ed. Companies should also properly train and test their safety drivers to recognize when it's time to take the wheel. Studying those situations could lead to more rapid improvement of the technology.