Tesla's collision-avoidance track record so far has been mixed.
A generic radar can reliably pick up everything, including the bicycle in this case and even the human body, which is largely translucent to radar. Detection is easy.
The hard part is deciding which objects warrant braking and which ones to ignore and drive past.
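To make the "detection is easy, deciding is hard" point concrete, here is a toy sketch (my own illustration, not Tesla's or anyone's actual logic) of a heuristic commonly described for radar-only systems: drop returns whose absolute speed is near zero, because a stopped vehicle is indistinguishable from a bridge, sign, or guardrail. That filtering is precisely how such a system can fail to brake for a stopped fire truck.

```python
# Illustrative sketch only -- invented names and thresholds, not any
# production system's code. Radar reports range and relative (Doppler)
# closing speed per return; stationary-object filtering works like this:

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float        # distance to the object, meters
    rel_speed_mps: float  # closing speed, m/s (positive = approaching)

def moving_targets(returns, ego_speed_mps, threshold_mps=1.0):
    """Keep only returns whose speed over the ground exceeds the threshold."""
    kept = []
    for r in returns:
        # World-frame speed of the object: ego speed minus closing speed.
        abs_speed = abs(ego_speed_mps - r.rel_speed_mps)
        if abs_speed > threshold_mps:
            kept.append(r)
    return kept

returns = [
    RadarReturn(range_m=80.0, rel_speed_mps=29.0),  # stopped truck: closing at ego speed
    RadarReturn(range_m=60.0, rel_speed_mps=24.0),  # slower car ahead, still moving
]
# The stopped truck (world-frame speed 0) gets filtered out; only the
# moving car survives -- hence no braking for the stationary obstacle.
print(moving_targets(returns, ego_speed_mps=29.0))
```

The filter exists to avoid constant phantom braking for overpasses and roadside clutter, which is exactly the trade-off the post above describes.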
Elon Musk said:
“I really consider autonomous driving a solved problem”
I can only hope that is true, because in the most recent example a Tesla still slammed into a huge red fire truck at 65 MPH!
Because of this radar challenge, non-Tesla companies have added LIDAR while waiting for the radar shortcomings to be worked out.
If you look closely at the lane lines during the last second before impact, you can see the car veer toward the pedestrian. I don't know if this has already been posted, but it seems like a very important fact.
Was it officially confirmed that the car was on AP at the time of the crash?...
This is not true. @smarty, all good theory; however, our own radar system can't detect pedestrians or even stopped vehicles yet, so perhaps theirs can't either. Of course theirs should have, and ours should too, but it doesn't yet. Maybe adding FLIR to look at the infrared spectrum would help us tell a fixed object apart from an animate one (a person or animal) or a running one (giving off engine heat). Probably cheaper to do than LIDAR, and it may be just as effective.
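The FLIR idea above boils down to a simple classification rule: anything meaningfully warmer than ambient is probably a person, an animal, or a vehicle with a running engine. Here is a minimal hypothetical sketch of that rule; the function name, temperatures, and threshold are all invented for illustration, not taken from any real thermal-camera stack.

```python
# Hypothetical sketch of the FLIR proposal: use one thermal reading to
# separate warm (animate / running-engine) objects from cold fixed
# obstacles. All numbers here are made-up illustrative values.

AMBIENT_C = 15.0  # assumed nighttime ambient temperature, Celsius

def classify_thermal(object_temp_c, ambient_c=AMBIENT_C, delta_c=8.0):
    """Flag objects warmer than ambient by delta_c as likely animate."""
    if object_temp_c - ambient_c >= delta_c:
        return "warm: likely pedestrian/animal/running engine"
    return "cold: likely fixed object"

print(classify_thermal(34.0))  # roughly skin temperature -> warm
print(classify_thermal(15.0))  # roadside sign at ambient -> cold
```

A real system would fuse this with radar and camera data rather than act on temperature alone, but the sketch shows why thermal imaging is attractive for the pedestrian case specifically.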
You can search @islandbayy's YouTube videos, where he tested AP with someone crossing the road with and without a bike. I think in all cases AP stopped for them. (It was during the day rather than at night, though the scene looked fairly well lit.)
We've seen similar results. There are a lot of cyclists around my town and whenever TACC or AP detects one on the side, it will slow way down. Our cars used to slow down for signs placed in the middle of crosswalks but they are no longer bothered by them. I know the car used to slow down for pedestrians in the crosswalk but I haven't seen/tested that scenario since last summer.
We have 3 NEW tests coming around June 2nd after my Annual Tesla/EV BBQ in the Wisconsin Dells.
I don't know how closely the HDR image of the street that @Bladerskb posted (Post #154) matches what the human eye sees, but I was wondering last night whether anyone reading this thread lives in Tempe. Many people posting in forums or in news story comment sections either don't realize, or seem to have a hard time accepting, that dashcam video is nowhere near as sensitive in the dark as the human eye, and they are basing their "couldn't have seen her in time" assessment on that footage.
It would be informative and educational if someone in that area could drive that road at the same time of night, in similar weather, and, at about the same spot as the car or even before it, see whether they can see down past the signage with the palm trees. She appears to have crossed to the right of where the bike lane ends and the dashed combination bike/turn lane begins. You can count the broken lane markings to estimate how far away the vehicle was. I believe one's eyes can see the surroundings much better than what the dashcam recorded, and that she would have been visible from a distance. A firsthand drive on that road might help settle the "too dark to see or not" question for some.
What I don't get is why all these news channels, which have video and camera people on staff, don't discuss this imaging issue and educate people. With self-driving and driver-assist technology out there, it would almost be a public service to start educating people about the technology before they get into the cars.
No, not officially. For now, I only have the word of the driver, who reported that the car was on Autopilot, which triggered the NTSB investigation.
In the past, if a driver's autopilot claim was false, you would immediately hear a correction from Tesla.
Tesla has not come out with any statement disputing the driver's Autopilot claim.
When asked, Tesla issued the statement:
“Autopilot is intended for use only with a fully attentive driver”
It has been two months and Tesla has not denied it, so it seems safe to credit the driver's claim that Autopilot was on.
Your wish is my command.
There you go.
...Sure there are a few people with really great vision and dynamic range that might be able to see into the dark area but they are the exception rather than the rule...
This thread makes me wonder if dashcam default settings are such that night images usually look much blacker than they should. That would be helpful to the owners of the dashcams, who often want to be able to argue that "the other guy came out of nowhere." On the other hand, it makes dashcam videos seem pretty unreliable as proof of anything. Quite simply, a wide modern street with streetlights will never be as dark (as viewed by the human eyes) as that street was in the Uber video. The Uber video made it look like the car was driving with WW II blackout-blinders over its headlights on a rural, unlit, two-lane road under a new moon. This was a modern car on a modern urban street.
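The "dashcam vs. eye" gap in the posts above can be put in rough numbers. The figures below are commonly cited ballpark values (human vision with adaptation spans very roughly 20 photographic stops; a small-sensor dashcam captures roughly 10), not measurements of the Uber vehicle's camera, so treat this as a back-of-the-envelope illustration only:

```python
# Back-of-the-envelope illustration of why dashcam footage looks darker
# than the scene did to a driver. Stop counts are rough, commonly cited
# ballpark figures, not specs for any particular camera or viewer.

def contrast_ratio(stops):
    """Each photographic stop doubles the captured brightness range."""
    return 2 ** stops

eye_adapted_stops = 20  # human vision with dark adaptation (approximate)
dashcam_stops = 10      # typical small-sensor dashcam (approximate)

print(f"eye     ~{contrast_ratio(eye_adapted_stops):,}:1")
print(f"dashcam ~{contrast_ratio(dashcam_stops):,}:1")
```

Under these assumptions the eye's usable brightness range is on the order of a thousand times wider than the camera's, so areas the video renders as pure black can still be plainly visible to a dark-adapted driver, which is the point the posts above are making.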
Uber gets a ticket
I believe the blame is on Uber... they have a primary responsibility to test this technology in a safe way. If I had time, I'd post links to the many times their autonomous cars wrecked, which eventually led to their getting kicked out of California. They are affecting things for everyone. Again, this is akin to a Tesla employee striking and killing a pedestrian while product-testing Autopilot on public roads. If I had time, I'd link the articles I remember from the last couple of years about accidents and safety issues with their system.
I struggle with the idea that a camera/radar/LIDAR system has only Walmart-grade, daylight-quality cameras.
Sorry, it comes from a tabloid-ish UK outlet, but it was significant enough earlier in the thread for me to paste the link contents...
EXCLUSIVE: 'Safety driver' of self-driving Uber which killed pedestrian had string of traffic offenses as well as a felony - but was given OK by Uber to be part of high-profile pilot scheme
- Rafaela Vasquez was the 'safety driver' of the autonomous Uber that hit and killed Elaine Herzberg, 49, in Tempe, Arizona on Sunday
- DailyMail.com can disclose Vasquez had been hit with a string of moving violations such as failing to stop at a red light and speeding, in recent years
- She was cited for driving with a suspended license in 2008 and again in 2009
- Uber applies the same standard for self-driving car hires as for regular Uber drivers: no more than three minor moving violations in the last three years
- Vasquez was also revealed to have had felony convictions for attempted armed robbery for which she served more than three years in prison in 2001
- Uber had not disclosed her lengthy history of driving offenses in its public statements about the death
- The company issued a statement referring to its hiring policy stating, 'Everyone deserves a fair chance'
Read more: Operator of self-driving Uber had a history of traffic violations | Daily Mail Online