Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Any details on headline - Arizona pedestrian is killed by Uber self-driving car

If you look closely at the lane lines during the last second before impact, you can see the car veer toward the pedestrian. I don’t know if this has already been posted, but it seems to be a very important fact.
 
Current Tesla collision avoidance history has been mixed.

A generic RADAR can reliably pick up everything, including the bicycle in this case and the (largely radar-translucent) human body. Detection is easy.

However, the trick is distinguishing which objects to brake for and which to ignore.
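To make the detection-vs-classification point concrete, here is a purely illustrative sketch (not any vendor's actual logic, and every number in it is an assumption): to a forward radar on a car doing 65 mph, a stopped fire truck, a parked car, and an overhead sign gantry all close at exactly ego speed, so a tracker that braked on every raw return would false-alarm constantly.

```python
# Purely illustrative sketch of why detection is easy but classification
# is hard for automotive radar: anything fixed to the world closes on the
# car at ego speed, whether it is an obstacle or harmless roadside clutter.

EGO_SPEED = 29.0  # m/s, roughly 65 mph (assumed ego speed)

def classify_return(range_m, closing_speed_mps):
    """Crude triage of one radar return by its world-frame speed."""
    world_speed = EGO_SPEED - closing_speed_mps  # object's speed over ground
    if abs(world_speed) < 1.0:
        # Stationary in the world frame: could be a fire truck in our lane
        # or an overhead sign; radar alone can't easily tell, which is the
        # "which objects to ignore" problem described above.
        return "stationary-ambiguous"
    return "moving-track"

returns = {
    "stopped fire truck": (80.0, 29.0),
    "overhead sign":      (60.0, 29.1),
    "slower car ahead":   (40.0, 5.0),
}
for name, (rng, closing) in returns.items():
    print(f"{name}: {classify_return(rng, closing)}")
```

The sketch brakes on nothing by itself; it only shows that the stopped truck and the sign land in the same ambiguous bucket, which is exactly why braking decisions get deferred to the camera.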

Elon Musk said:

“I really consider autonomous driving a solved problem

I can only hope that is true, because in the most recent example a Tesla still slammed into a huge red fire truck at 65 MPH!


Because of this RADAR challenge, non-Tesla companies have added LIDAR while waiting for the RADAR limitations to be worked out.


Was it officially confirmed that the car was on AP at the time of the crash? I thought the son of a friend of the driver posted that day that, according to his dad, the driver wasn't sure if he had it on or not. I also don't recall him blaming Tesla for the accident. I'm not aware of any updates on the accident.
 
If you look closely at the lane lines during the last second before impact, you can see the car veer toward the pedestrian. I don’t know if this has already been posted, but it seems to be a very important fact.

I thought the Uber stayed pretty much centered in the lane. The lane did alter its configuration there: that is where the bike lane and a right-turn lane started, and further ahead they switched places into separate lanes, with the bike lane on the left and the turn lane on the right. I assumed that was part of the car's lane-centering programming.
 
Was it officially confirmed that the car was on AP at the time of the crash?...

No, not officially. For now I can only go by the word of the driver, who reported that the car was on Autopilot, which triggered the NTSB investigation:


upload_2018-3-22_11-4-49.png



In the past, if a driver's Autopilot claim was false, you would immediately hear a correction from Tesla.

Tesla has not come out with any statement disputing the driver's Autopilot claim.

When asked, Tesla issued the statement:

“Autopilot is intended for use only with a fully attentive driver”

It has been two months and Tesla has not denied it, so it seems safe to believe the driver's claim of Autopilot.
 
smarty, all good theory; however, our own radar system can't yet detect pedestrians, or even stopped vehicles, so perhaps theirs can't either. Of course theirs should have, and ours should too, but ours doesn't yet. Maybe adding FLIR to look at the infrared spectrum would help distinguish a fixed object from an animate one (a person or animal) or a working one (like the heat from an engine). It would probably be cheaper than LIDAR and might be just as effective.
This is not true.

Our car definitely sees and reacts to stopped vehicles. I've never tried with a pedestrian because I don't use AP where there are pedestrians.

It is pure forum legend that Tesla's radar (or any other automotive radar) cannot see stopped objects. The kind of radar used in cars definitely can and does see stationary objects.
 
You can search @islandbayy's YouTube videos, where he tested AP with someone crossing the road with and without a bike. I think AP stopped for them in every case. (It was during the day rather than at night, though the Uber scene looked fairly well lit.)




We have 3 NEW tests coming around June 2nd after my Annual Tesla/EV BBQ in the Wisconsin Dells.
 
Let's take a look at the front camera array.

What do we see there? Oh yes, a gazillion high dynamic range cameras with very large sensors. Those sensors should be able to see pretty much everything at that light level.

Current Uber sensor array

DYw2h07WsAEo50W.jpg


ubercar-3.jpg




Older Uber sensor array (maybe better and more expensive?)

uber_self-driving_cameras-720x720.jpg


volvo-xc90-with-cameras-lasers-radar-and-gps-receivers.jpg


Note that the array in the first two images is different from the one in the bottom two. It seems to be the cheaper, reduced-cost version, but the sensors are still good enough.
 




We have 3 NEW tests coming around June 2nd after my Annual Tesla/EV BBQ in the Wisconsin Dells.
We've seen similar results. There are a lot of cyclists around my town, and whenever TACC or AP detects one on the side, it slows way down. Our cars used to slow down for signs placed in the middle of crosswalks, but they are no longer bothered by them. I know the car used to slow down for pedestrians in the crosswalk, but I haven't seen or tested that scenario since last summer.
 
I don’t know if you are 100% correct. The radar might see the object, but it didn’t always react. Most of the sketchy moments I had were before 2018.10.4, so maybe it's better now. Still, my point is: are we so sure our Autopilot is that safe? Also, I don’t think a Tesla would have stopped if the woman had not already entered the car's lane. I doubt the car is smart enough to anticipate that a person crossing the next lane over might step into ours.
 
I don't know how closely the HDR image of the street that @Bladerskb posted (Post #154) matches the human eye, but I was wondering last night whether anyone reading this thread lives in Tempe. Many people posting in forums or news-story comment sections either don't realize, or have a hard time accepting, that dashcam video is nowhere near as sensitive in the dark as the human eye, and they are basing their "couldn't have seen in time" assessment on that.

It would be informative and educational if someone in that area could drive that road at the same time of night and in similar weather and, at about the same spot as the car (or even earlier), see whether you can see down past the signage with the palm trees. She appeared to have crossed to the right of where the bike lane ends and the dashed combination bike/turn lane begins. You can count the broken lane markings to estimate how far away the vehicle was. I believe one's eyes can see the surroundings much better than what the dashcam recorded, and that she would have been visible from a distance. A firsthand experience on that road might settle the "too dark to see or not" question for some.
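The stripe-counting idea can be put into rough numbers. This is a hedged back-of-envelope sketch assuming US MUTCD-style broken lines (roughly a 10 ft stripe plus a 30 ft gap per cycle) and textbook reaction/braking figures; none of these values come from the actual video or investigation.

```python
# Back-of-envelope numbers for the "count the broken lane markings" idea.
# Assumptions: ~10 ft stripe + ~30 ft gap per cycle (MUTCD-style marking),
# 1.5 s driver reaction time, 0.7 g braking on dry pavement.

FT_PER_CYCLE = 10 + 30  # one stripe plus one gap (assumed)

def distance_from_stripes(cycles_counted):
    """Rough distance in feet from the number of stripe cycles visible
    between the car and the pedestrian in a frame."""
    return cycles_counted * FT_PER_CYCLE

def stopping_distance_ft(speed_mph, reaction_s=1.5, decel_g=0.7):
    """Reaction distance plus braking distance (v^2 / 2a)."""
    v = speed_mph * 1.467                    # mph -> ft/s
    return v * reaction_s + v * v / (2 * decel_g * 32.2)

# If roughly 4 full cycles fit between the hood and the pedestrian:
print(distance_from_stripes(4))          # 160 ft
print(round(stopping_distance_ft(40)))   # ~164 ft needed at 40 mph
```

Under these assumed figures, about four visible stripe cycles would put the car right at the edge of its stopping distance at 40 mph, which is why the "how dark was it really" question matters so much.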

What I don't get is why all these news channels, which have video/camera people on staff, don't discuss this imaging issue and educate people. With self-driving and driver-assist technology out there, it would almost be a public service to start educating people on the technology before they get into the cars.


Your wish is my command.

There you go.

 
No, not officially. For now I can only go by the word of the driver, who reported that the car was on Autopilot, which triggered the NTSB investigation:


View attachment 288352


In the past, if a driver's Autopilot claim was false, you would immediately hear a correction from Tesla.

Tesla has not come out with any statement disputing the driver's Autopilot claim.

When asked, Tesla issued the statement:

“Autopilot is intended for use only with a fully attentive driver”

It has been two months and Tesla has not denied it, so it seems safe to believe the driver's claim of Autopilot.


Ah yes, I had to refresh my memory on this. To me this was an unavoidable accident from the car's Autopilot standpoint, nothing like the Tempe pedestrian accident. This one involved a Model 3 on Autopilot whose driver had the vehicle ahead suddenly switch lanes to avoid a stopped fire truck. He himself had no time to change lanes and was following too closely to take evasive action; the question was whether AEB had time to activate and reduce the speed at which he struck the fire truck, given the car's speed and the roadway left before the truck. While there doesn't seem to be a report back from the investigation yet, I don't think anyone expects Tesla, or any advanced driving system, to see far enough ahead in traffic to predict drivers suddenly swerving to avoid an accident. It will be interesting to see whether AEB deployed and reduced his speed at all before impact, though. I have a feeling Tesla will come out of this pretty well from the car's standpoint, as the driver emerged with only minor injuries. The center screen will be noted as needing improvement: its glass was struck by the passenger's arm during airbag deployment and cut the arm, which Elon said would be immediately addressed.
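The "did AEB have time to shed speed" question is simple kinematics (v_impact² = v0² − 2·a·d): even when a full stop is impossible, late braking still reduces impact speed. A minimal sketch, with deceleration and distance figures that are illustrative assumptions rather than anything from the investigation:

```python
# Illustrative kinematics for the AEB question above: how much speed does
# late braking shed over a short remaining distance? All figures assumed.

def impact_speed_mph(initial_mph, braking_distance_ft, decel_g=0.9):
    """Speed remaining after braking over a given distance (0 if it stops)."""
    v0 = initial_mph * 1.467                      # mph -> ft/s
    a = decel_g * 32.2                            # ft/s^2
    v_sq = v0 * v0 - 2 * a * braking_distance_ft  # v_impact^2 = v0^2 - 2ad
    return max(v_sq, 0.0) ** 0.5 / 1.467          # ft/s -> mph

# If AEB fired with only 60 ft of road left at 65 mph:
print(round(impact_speed_mph(65, 60)))  # ~51 mph at impact
```

Even a late 60 ft of hard braking knocks roughly 14 mph off the impact under these assumptions, which is consistent with the driver walking away with minor injuries.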
 
Your wish is my command.

There you go.



Thanks for finding that! I saw another reference to N Hill Ave pop up while watching it; apparently another person had the same thought as me and went out and filmed there last night in his car. While neither video is necessarily 100% representative of human vision, it does illustrate how different dashcams or camera setups can see far more at night than the dashcam installed in the Uber that the police looked at. In both of these cases there is no doubt a person could have been seen well in advance, with time to take appropriate action (horn, slowing down, changing lanes, etc.).

 
...Sure there are a few people with really great vision and dynamic range that might be able to see into the dark area but they are the exception rather than the rule...

Now that we have other people's videos, their cameras can see fine in the shadows between streetlights, while Uber's camera could not:

The image below is from a YouTuber's clip, with my circle indicating where the pedestrian was. You can clearly see the dark asphalt road surface in the shadow between the streetlights and the car's headlights, and you can count five lanes from left to right:

CLfg2hG.jpg


In that same spot, a dashcam might automatically adjust its exposure in a way that misleads us into thinking human eyes could not see it. Uber's video was so dark that you cannot count five lanes from left to right, because the white lane markings not lit by the headlights were darkened to invisibility:

oPqicJb.jpg
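The exposure point can be shown with a toy model: luminance values the eye can still separate get crushed into the bottom few 8-bit codes when a camera underexposes. The scene luminances and exposure factors below are invented for illustration, not measured from either video.

```python
# Toy model of underexposure: an 8-bit encoding after exposure scaling and
# a simple 1/2.2 gamma. All scene values and exposure factors are invented.

def encode_8bit(scene_luminance, exposure):
    """Map a relative scene luminance (0..1) to an 8-bit code."""
    v = min(max(scene_luminance * exposure, 0.0), 1.0)
    return round((v ** (1 / 2.2)) * 255)

asphalt_in_shadow = 0.010  # assumed relative luminances
white_lane_paint = 0.030

# Reasonably exposed: the paint is clearly brighter than the asphalt.
print(encode_8bit(asphalt_in_shadow, 1.0), encode_8bit(white_lane_paint, 1.0))

# Underexposed by ~5 stops: both collapse toward black and the lane
# markings effectively vanish, as in the Uber clip.
print(encode_8bit(asphalt_in_shadow, 1 / 32), encode_8bit(white_lane_paint, 1 / 32))
```

In the well-exposed case the two surfaces land some twenty 8-bit codes apart; underexposed, they end up a few codes apart near black, which on a screen reads as uniform darkness.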
 
This thread makes me wonder if dashcam default settings are such that night images usually look much blacker than they should. That would be helpful to the owners of the dashcams, who often want to be able to argue that "the other guy came out of nowhere." On the other hand, it makes dashcam videos seem pretty unreliable as proof of anything. Quite simply, a wide modern street with streetlights will never be as dark (as viewed by the human eyes) as that street was in the Uber video. The Uber video made it look like the car was driving with WW II blackout-blinders over its headlights on a rural, unlit, two-lane road under a new moon. This was a modern car on a modern urban street.
 
This thread makes me wonder if dashcam default settings are such that night images usually look much blacker than they should. That would be helpful to the owners of the dashcams, who often want to be able to argue that "the other guy came out of nowhere." On the other hand, it makes dashcam videos seem pretty unreliable as proof of anything. Quite simply, a wide modern street with streetlights will never be as dark (as viewed by the human eyes) as that street was in the Uber video. The Uber video made it look like the car was driving with WW II blackout-blinders over its headlights on a rural, unlit, two-lane road under a new moon. This was a modern car on a modern urban street.

I struggle with the idea that a camera/radar/LIDAR system has only Walmart-grade, daylight-quality cameras.
Uber's history with honesty is weak at best.

We have no streetlights on many of our roads here. That video is darker than our roads are on a moonless night with low beams.
 
I believe the blame is on Uber: they have a primary responsibility to test the technology in a safe way. If I had time, I'd post links to the many times their autonomous cars wrecked, which eventually led to them getting kicked out of California. They are affecting things for everyone. Again, this is akin to a Tesla employee striking and killing a pedestrian while product TESTING Autopilot on public roads.
Uber gets a ticket
In December 2016, Uber's computer-controlled car was caught on video running a red light four seconds after the light turned red. Uber said the violation was human error, since a person was required to sit behind the wheel. The company suspended the driver after California regulators ordered a rollback of Uber's self-driving cars, citing pedestrian safety.
9DE916BE-68CC-4704-B92A-01541939F27A.jpeg


58D3AAFE-81A8-4C56-A55D-86D705C56CB3.jpeg
 
Sorry, it comes from a tabloidish UK outlet, but it was significant enough earlier in the thread for me to paste the link contents...
EXCLUSIVE: 'Safety driver' of self-driving Uber which killed pedestrian had string of traffic offenses as well as a felony - but was given OK by Uber to be part of high-profile pilot scheme
  • Rafaela Vasquez was the 'safety driver' of the autonomous Uber that hit and killed Elaine Herzberg, 49, in Tempe, Arizona on Sunday
  • DailyMail.com can disclose Vasquez had been hit with a string of moving violations such as failing to stop at a red light and speeding, in recent years
  • She was cited for driving with a suspended license in 2008 and again in 2009
  • Uber applies same standard for self-driving car hires as for regular Uber drivers of no more than three minor moving driving offenses in last three years
  • Vasquez was also revealed to have had felony convictions for attempted armed robbery for which she served more than three years in prison in 2001
  • Uber had not disclosed her lengthy history of driving offenses in its public statements about the death
  • The company issued a statement referring to its hiring policy stating, 'Everyone deserves a fair chance'


Read more: Operator of self-driving Uber had a history of traffic violations | Daily Mail Online
Follow us: @MailOnline on Twitter | DailyMail on Facebook
 
Sorry, it comes from a tabloidish UK outlet, but it was significant enough earlier in the thread for me to paste the link contents...
EXCLUSIVE: 'Safety driver' of self-driving Uber which killed pedestrian had string of traffic offenses as well as a felony - but was given OK by Uber to be part of high-profile pilot scheme
  • Rafaela Vasquez was the 'safety driver' of the autonomous Uber that hit and killed Elaine Herzberg, 49, in Tempe, Arizona on Sunday
  • DailyMail.com can disclose Vasquez had been hit with a string of moving violations such as failing to stop at a red light and speeding, in recent years
  • She was cited for driving with a suspended license in 2008 and again in 2009
  • Uber applies same standard for self-driving car hires as for regular Uber drivers of no more than three minor moving driving offenses in last three years
  • Vasquez was also revealed to have had felony convictions for attempted armed robbery for which she served more than three years in prison in 2001
  • Uber had not disclosed her lengthy history of driving offenses in its public statements about the death
  • The company issued a statement referring to its hiring policy stating, 'Everyone deserves a fair chance'


Read more: Operator of self-driving Uber had a history of traffic violations | Daily Mail Online
Follow us: @MailOnline on Twitter | DailyMail on Facebook

Uber said they will give anyone a second chance! As long as they don't have to pay them very much.