Welcome to Tesla Motors Club

Fatal autopilot crash, NHTSA investigating...

That's just sad. Condolences all the way around.

That said, I have a hard time connecting the accident to AP in any way beyond the fact that AP was in use.

I still firmly believe that AP should have been called DriverAssist, but that ship has long since sailed. Mariners and aviators know what autopilots are and are not, but the general public? Not so much.

My primary point is if the driver had (just) cruise control engaged, would the headlines read "Cruise control engaged..."? Maybe back in the day when cruise control was a new thing.

I'm not going to blame the shorts for fomenting hysteria, but I'm also going to reserve judgment until after the accident investigation is complete. Those perpendicular divided highway crossings are fraught with peril to begin with.

Again, condolences to his family and friends, whom I'm sure at the moment couldn't care less about the technology and stock price.
 
So sad. My condolences to his family and loved ones.

I have difficulty with this part of Tesla's press release:

"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."


The problems I have with this statement are:

(1) Tesla's autopilot is not affected by colours or bright lights, as far as I am aware. Tesla's system simply does not have the hardware to address this type of situation regardless of colours or lighting.

(2) We don't know why the driver didn't react but I don't believe that it was due to the colour of the trailer and/or the brightly lit sky. I've driven a lot of highway miles beside semi-trailers and regardless of colour or the sun in your eyes, you know when one is beside you or moving into your lane.

I only had an autopilot loaner for less than a week of highway driving, but I can see how this could happen through people becoming complacent and too reliant on the system. But I don't know if that happened here. No one knows, and the last thing I want to do is blame the driver. Of course, the truck driver is at fault, but at the same time I wonder if he would still be alive had he been driving a non-AP car. Then again, AP has probably saved lives that we will never know about -- tired drivers, lane keeping, and even the close call that AP avoided in the video posted by this driver.

Good post. What's your gut feel on any possible litigation vs. Tesla from this?
 
So sad. My condolences to his family and loved ones.

I have difficulty with this part of Tesla's press release:

"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."


The problems I have with this statement are:

(1) Tesla's autopilot is not affected by colours or bright lights, as far as I am aware. Tesla's system simply does not have the hardware to address this type of situation regardless of colours or lighting.

(2) We don't know why the driver didn't react but I don't believe that it was due to the colour of the trailer and/or the brightly lit sky. I've driven a lot of highway miles beside semi-trailers and regardless of colour or the sun in your eyes, you know when one is beside you or moving into your lane.

I only had an autopilot loaner for less than a week of highway driving, but I can see how this could happen through people becoming complacent and too reliant on the system. But I don't know if that happened here. No one knows, and the last thing I want to do is blame the driver. Of course, the truck driver is at fault, but at the same time I wonder if he would still be alive had he been driving a non-AP car. Then again, AP has probably saved lives that we will never know about -- tired drivers, lane keeping, and even the close call that AP avoided in the video posted by this driver.
#1 isn't really consistent with Elon's tweet re road signs. Like you, I have a hard time imagining a scenario like #2, and I've been using AP since day one, but again, this is all speculation because we weren't there.
 
I believe the LIDAR that Google uses would prevent this kind of accident.

It can see two football fields away in all directions.

"All directions" means low and high, including tall, empty high-bed trailers.
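For a rough sense of scale on that range claim (a back-of-the-envelope sketch, not Velodyne's actual specs), a time-of-flight LIDAR infers distance from how long a light pulse takes to make the round trip:

```python
# Back-of-the-envelope time-of-flight LIDAR range calculation.
# A pulse travels out to the target and back, so distance = c * t / 2.

C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to a target given the pulse's round-trip time, in meters."""
    return C * round_trip_s / 2

def round_trip_s(distance_m: float) -> float:
    """Round-trip time for a target at the given distance, in seconds."""
    return 2 * distance_m / C

# "Two football fields" is roughly 200 m; the echo returns in ~1.33 microseconds.
t = round_trip_s(200.0)
print(f"round trip for 200 m target: {t * 1e6:.2f} us")
print(f"distance recovered from that echo: {tof_distance_m(t):.0f} m")
```

Because the measurement is the pulse's travel time rather than image contrast, the target's color doesn't enter the calculation at all, which is the point being made here.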

One of my other hobbies is electronic warfare with police LIDAR systems, and trust me, LIDAR has some advantages but is not impervious. Extremely bright sunlight, especially at certain angles, can produce a lot of LIDAR interference that's really challenging to overcome. The Google self-driving car's Velodyne unit is really good at least in part because of $10,000+ in optics and lenses. It's unfair to compare that to a production-priced, webcam-grade camera -- putting that kind of optics on a visual camera might make it exceed human vision capabilities too.
 
...or in other words, would the Tesla driver have crashed into/under the semi had he not been using AP assistance?

It's possible that owners without Autopilot would be more vigilant because they know the limitations of manual driving.

The co-founder of Google’s deep learning project and of the online learning startup Coursera, who is also a professor in Stanford’s computer science and electrical engineering departments, has been critical of Tesla Autopilot.
However, as the blog points out, statistically, Tesla Autopilot has a lower fatality rate per mile than the average across all vehicle miles.
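That comparison reduces to simple arithmetic on the figures Tesla quotes (one known fatality in roughly 130 million AP miles, versus one per 94 million miles in the US and one per roughly 60 million miles worldwide), normalized here to a common denominator:

```python
# Fatality rates implied by the figures in Tesla's press release,
# normalized to fatalities per 100 million miles driven.

miles_per_fatality = {
    "Autopilot (1 known fatality)": 130e6,
    "US average": 94e6,
    "Worldwide average": 60e6,
}

for label, miles in miles_per_fatality.items():
    rate = 1 / miles * 100e6  # fatalities per 100M miles
    print(f"{label}: {rate:.2f} per 100M miles")
```

Worth noting that with only a single Autopilot fatality in the sample, the first rate carries enormous statistical uncertainty, so the comparison is suggestive rather than conclusive.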
 
Wow, can't believe it's the same guy from the video. Tragic. :(

Having used AutoPilot for several days, it was enough to make me want to purchase it with my Model 3.

That being said, I did follow a few trucks and noticed the trucks being displayed in the car. I guess it only detects what's directly in front of you at a certain height and width? Is a truck defined more by its width? Of course the accident is a different scenario.
 
So sad. My condolences to his family and loved ones.

I have difficulty with this part of Tesla's press release:

"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."


The problems I have with this statement are:

(1) Tesla's autopilot is not affected by colours or bright lights, as far as I am aware. Tesla's system simply does not have the hardware to address this type of situation regardless of colours or lighting.
[...]

The situation in #1 would render the Mobileye camera unable to see the trailer, due to the low-contrast image.
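To illustrate that failure mode concretely (a synthetic NumPy sketch under assumed brightness values, not Mobileye's actual pipeline), a detector keyed on intensity differences finds almost nothing to latch onto when a white trailer fills the frame against a bright sky:

```python
import numpy as np

def region_contrast(patch: np.ndarray) -> float:
    """RMS contrast: standard deviation of 8-bit intensities, scaled to [0, 1]."""
    return float(patch.std() / 255.0)

# Synthetic 8-bit grayscale patches (brightness values are illustrative).
bright_sky    = np.full((64, 64), 235, dtype=np.uint8)  # overexposed sky
white_trailer = np.full((64, 64), 245, dtype=np.uint8)  # white trailer side
dark_trailer  = np.full((64, 64),  60, dtype=np.uint8)  # dark trailer side

# Scene A: white trailer against bright sky -> almost no contrast.
scene_a = np.hstack([bright_sky, white_trailer])
# Scene B: dark trailer against the same sky -> strong contrast.
scene_b = np.hstack([bright_sky, dark_trailer])

print(f"white-on-bright contrast: {region_contrast(scene_a):.3f}")
print(f"dark-on-bright contrast:  {region_contrast(scene_b):.3f}")

# A detector thresholding on contrast would miss scene A entirely.
THRESHOLD = 0.05  # hypothetical detection threshold
print("scene A detectable:", region_contrast(scene_a) > THRESHOLD)
print("scene B detectable:", region_contrast(scene_b) > THRESHOLD)
```

The white-on-bright scene yields an order of magnitude less contrast than the dark-on-bright one, which is the gist of the "white trailer against a brightly lit sky" explanation.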
 
The situation in #1 would render the Mobileye camera unable to see the trailer, due to the low-contrast image.

I didn't know that. But don't the height of the camera, and the fact that there's only one camera, make this a moot point, for the same reason as in this accident?

[Image: model-s-wummon-accident.png]


A fatal Tesla Autopilot accident prompts an evaluation by NHTSA
 
Official Tesla press release (A Tragic Loss)

A Tragic Loss
The Tesla Team June 30, 2016
We learned yesterday evening that NHTSA is opening a preliminary evaluation into the performance of Autopilot during a recent fatal crash that occurred in a Model S. This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.

Following our standard practice, Tesla informed NHTSA about the incident immediately after it occurred. What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.

It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times," and that "you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver's hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.

We do this to ensure that every time the feature is used, it is used as safely as possible. As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving.

The customer who died in this crash had a loving family and we are beyond saddened by their loss. He was a friend to Tesla and the broader EV community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission. We would like to extend our deepest sympathies to his family and friends.

As an engineer, I think blaming the crash on the Autopilot's inability to see the truck/trailer because it was white against a bright sky sounds more like an excuse to me. There is no way the entire truck/trailer was all white, and there are certainly components of the truck that aren't white, such as the tires. The Mobileye should have detected it visually. Also, the long-range radar didn't do its job here. It should have detected the truck coming into the car's path long before impact, unless the truck was also crossing the intersection at very high speed.
 
As an engineer, I think blaming the crash on the Autopilot's inability to see the truck/trailer because it was white against a bright sky sounds more like an excuse to me. There is no way the entire truck/trailer was all white, and there are certainly components of the truck that aren't white, such as the tires. The Mobileye should have detected it visually. Also, the long-range radar didn't do its job here. It should have detected the truck coming into the car's path long before impact, unless the truck was also crossing the intersection at very high speed.
See Elon's tweet upthread. I think he's saying it doesn't alert or take action on it due to false positives with overhead road signs.
 
...The Google self-driving car's Velodyne unit is really good at least in part because of $10,000+ in optics and lenses. It's unfair to compare that to a production-priced, webcam-grade camera -- putting that kind of optics on a visual camera might make it exceed human vision capabilities too.

It's a very sad state of technology to blame the problem on the color of the truck.

At least, with LIDAR, you don't have to worry about any color at all.

[Image: googlecar04.gif]


However, I agree that it is not a fair comparison to set Google's very expensive LIDAR against Tesla's $2,500 Autopilot.

It seems that better technology exists, but owners don't demand it, whether from a lack of knowledge or an unwillingness to pay more.