
Fatal autopilot crash, NHTSA investigating...

Only for those makes and models driven on roads where another car may enter from a side street.

That seems to be a rare occurrence, so maybe that will cut down on the applicable cars.

Uh, eliminate Mercedes and go from there. Just like Tesla's, the system covers most cases; the crash that did happen had several factors align. Look at the video where the oncoming car swerves in front of the Tesla and the Tesla swerves and brakes to avoid the accident. That performance was better than most systems out there would manage. Yet again, since I remember you from another thread, you are focusing solely on Tesla when other cars have greater issues. There are tons of things I don't like about Tesla, and there are plenty of areas where its driver assistance features could improve, but all in all they are quite good. Lane keep assist seems to be the best out there.
 
The year 2018 seems like a long time away. A lot of lives could be saved (across all car manufacturers) if LTAP (left turn across path) detection could be implemented earlier.

I'm sure everyone wants it. It's an issue of hardware cost combined with a LOT of software development. This isn't easy stuff, and the quicker solutions involve things that are currently very expensive. The low-cost scanning LIDAR module posted earlier may be part of the solution; however, that was just introduced this year. Now design it in and write the software. You are still two years away.
 
As a truck driver, I disagree. I may be looking at this wrong, but on Google Maps it appears that the Tesla was coming over a hill. If he was indeed going over 85 mph, the truck would have started that turn before the Tesla was in sight. It would take a truck 4-5 seconds to complete that turn after slowing down or stopping.
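For rough scale, here's a quick back-of-envelope check of that sight-line argument (the 4-5 second turn time is the estimate above; everything else is illustrative, not from the investigation):

```python
# Back-of-envelope check of the sight-line argument above.
# Assumption (mine, not from the crash report): the truck needs the
# quoted 4-5 s to clear the turn after slowing or stopping.

MPH_TO_FPS = 5280 / 3600  # 1 mph = ~1.467 ft/s

for speed_mph in (65, 85):
    speed_fps = speed_mph * MPH_TO_FPS
    for turn_time_s in (4, 5):
        closing_ft = speed_fps * turn_time_s
        print(f"{speed_mph} mph: car covers {closing_ft:.0f} ft in {turn_time_s} s")

# At 85 mph the car covers ~500-620 ft during the turn; at 65 mph, ~380-480 ft.
# If the hill crest hid anything beyond roughly 500 ft, the truck could have
# had a clear view when it committed to the turn.
```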

How do you feel about requiring side rails? At least that would prevent what happened here and allow the crumple zones and airbags to have more of a protective effect.
 
When bad things happen I have always agreed with the view that blame is shareable but responsibility is not.

In this case the left-turning driver had the responsibility to make the turn safely. This is a fundamental rule of driving in North America, and one that, as a motorcyclist, I find the biggest risk. Whether I'm riding a motorcycle or driving a car, I am always vigilant for irresponsible left-turning vehicles.

The blame will be shared but the trucker was responsible, in my opinion.
 
How do you feel about requiring side rails? At least that would prevent what happened here and allow the crumple zones and airbags to have more of a protective effect.
I wouldn't have a problem with it, but I don't know if it will happen. They are just getting to the point where new trucks are required to have electronic log books. The only issue with side rails would be low ground clearance, but that is usually only a problem at older railroad crossings.
 
Everyone is panicking, predicting doom and gloom. Utter nonsense. Why?

While tragic, this is one fatality over 130+ million miles. It's not symptomatic of a major issue, and given the mounting evidence that the Tesla driver may have been watching a movie (a DVD player was found in the car, and there was mention of Harry Potter), it seems this was really a result of driver error.

The sky is not falling.

The statistics on AP vs. airline travel fatal accident rates are roughly the same if you assume people average 60 mph on AP. Airlines have 0.4 fatal accidents per 1 million hours flown (usually with many deaths per incident); that works out to 1 fatal accident per 2.5 million hours. Tesla has had 1 death in around 2.2 million hours driven on AP (130+ million miles at 60 mph).
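For anyone who wants to check the arithmetic, here's the comparison using the figures in the post (the 60 mph average Autopilot speed is an assumption, as stated):

```python
# Reproducing the comparison above, using the post's own figures.

ap_miles = 130e6        # miles driven on Autopilot to date (post's figure)
avg_speed_mph = 60      # assumed average speed while on AP
ap_deaths = 1

ap_hours = ap_miles / avg_speed_mph        # ~2.2 million hours
airline_hours_per_fatal = 1e6 / 0.4        # 0.4 fatal accidents per 1M hours

print(f"AP:      1 fatality per {ap_hours / ap_deaths / 1e6:.1f} million hours")
print(f"Airline: 1 fatal accident per {airline_hours_per_fatal / 1e6:.1f} million hours")
# AP: ~2.2 million hours; airlines: 2.5 million hours -- the same ballpark.
```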

They have a moral obligation to write about an accident that was conclusively caused by an AP malfunction.

Why would they want to write about an accident that may not have anything to do with AP at all?

This has been answered, but this wasn't a failure of AP; it happened because of a limitation of AP that the driver was ignoring. There was a story, back when cruise control was fairly new, of someone from another country coming to the US and renting a motor home. He got out on the interstate, put it on cruise control, and then went back to the kitchen to get a drink. When the motor home went off the road, nobody said it was a failure of the cruise control; it was the driver being an idiot.

When something is not used the way it is intended and something goes wrong, that is not a failure of the technology; it's a failure of the user.
 
As a truck driver, I disagree. I may be looking at this wrong, but on Google Maps it appears that the Tesla was coming over a hill. If he was indeed going over 85 mph, the truck would have started that turn before the Tesla was in sight. It would take a truck 4-5 seconds to complete that turn after slowing down or stopping.
65... he was going 65, which is the speed limit there.

Now design it in and write the software. You are still two years away.
Supposedly Mobileye EyeQ chips can handle input from LIDAR already.
 
I should post this from the Washington Post since I haven't seen it here yet.

5:40 p.m.

Federal safety records show that the truck company involved in the crash that killed a motorist using self-driving technology received seven citations during four traffic stops over the past two years.

Federal Motor Carrier Safety Administration records don’t identify drivers by name, but they show that the driver for the trucking company Okemah Express was ordered off the road in January after being cited by a Virginia state inspector for being on duty more than the legal limit of 14 hours in one day.

Okemah’s driver was also cited for failing to obey a traffic control device in March and an improper lane change in December. And an inspection last year found the truck’s tires were going bald.


Sixty-two-year-old Frank Baressi of Palm Harbor, Florida, is the owner of Okemah Express. The company has one truck and one driver, Baressi himself. Authorities say he was at the wheel in May when the truck collided with a Tesla Model S vehicle in “autopilot” mode.

The Latest: Truck firm in Tesla crash had safety violations
 
Exact same accident situation my sister-in-law was in. The only difference was a large pickup cutting across the divided highway, turning left in front of her (racing to beat the oncoming traffic). Also, she was driving an Avalon. She didn't make it to the brake pedal. A few years of therapy and she walks again. The car went head first into the side of the pickup. (A Tesla may have provided a safer outcome.)

Anyway, even full attention may not have helped in the Tesla accident, human or otherwise. The timing would have been past the critical point of a saving decision. Look at the time it takes for a crossing (or turning) vehicle, which only has to travel a short distance, to breach your path. Now use that time to see how far you travel toward it at 65 mph, and work out the reaction time and stopping distance (rough numbers sketched below). Ain't gonna happen.

The truck driver's decision is what makes him liable.
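Here's a minimal sketch of those numbers, assuming a 1.5-second reaction time for an attentive driver and hard braking at 0.8 g (illustrative values, not from the investigation):

```python
# Rough numbers behind "ain't gonna happen": distance and time needed to
# stop from 65 mph vs. the few seconds a turning truck needs to block the
# lane. Assumed values (illustrative, not from the investigation): 1.5 s
# reaction time for an attentive driver, hard braking at 0.8 g.

MPH_TO_FPS = 5280 / 3600
G_FPS2 = 32.174

v = 65 * MPH_TO_FPS           # ~95 ft/s
reaction_s = 1.5
decel = 0.8 * G_FPS2          # ~26 ft/s^2

reaction_ft = v * reaction_s             # travel before the brakes go on
braking_ft = v ** 2 / (2 * decel)        # travel while braking to a stop
total_ft = reaction_ft + braking_ft
total_s = reaction_s + v / decel

print(f"Stop from 65 mph: ~{total_ft:.0f} ft over ~{total_s:.1f} s")
# ~320 ft and ~5.2 s. A truck can block the lane in 2-3 s, during which the
# car closes ~190-290 ft -- often already inside its stopping distance.
```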
 
If you are truly concerned about saving lives by delaying self-driving features, your world view is in for a world of hurt when you find out how much illness and death is really preventable. Focus on the real issues. For example, drug overdoses now kill more people than car accidents or guns, and there are many far more common conditions that are preventable. Sad to hear some have such a narrow focus on media drama and hype.
 
Autopilot in planes and trains means following a specified course and speed, automatically making corrections for wind, etc.; it does not include braking for sudden obstacles. Tesla's Autopilot (a combination of Autosteer and TACC, so not the same thing) includes braking for common obstacles and situations, but cannot replace the driver's judgment in other situations, like a truck turning across approaching traffic.
That said, I too have noticed that vehicles with empty space underneath can be a problem, as in using Autopark with a jacked-up truck in front of the parking space.

Airliners and many private planes (at least the more expensive ones) have a system called TCAS, with which planes in crowded airspace communicate with each other and can decide among themselves how to avoid one another, directing evasive action without pilot intervention. It's been standard equipment since the early 1990s.

The air is generally less crowded than a highway, and even around larger airports (anything busier than a rural strip with a few private planes a day) you have controllers telling the planes where to be to avoid one another, so TCAS doesn't kick in very often; but it has prevented crashes.

What preliminary report? They just opened the investigation three days ago...

I believe what started the media firestorm was a preliminary report by NHTSA on Wednesday or Thursday saying that the accident had happened and an investigation had been opened. Before that, the media was dead silent about this accident. It didn't even blip on any local person's blog as far as I know.
 
IMHO, Epic Fail on the driver's part: putting too much faith in new technology, believing his auto-steering and cruise control added up to an autopilot. We are not even close to fully autonomous driving; hence we are at Level 2 when Level 5 is full autonomy. We may get to 3 or 4 over the next 5 to 10 years, but only if we prevent the current group of owners from acting irresponsibly.

Sadly, it was bound to happen. Just because you can do something doesn't mean you should trust it with your life. Keeping one's hands on the wheel and staying aware of the surrounding situation at all times while in auto-driving mode is part of acknowledging that the technology still has many limitations in its current "Beta" state. Unfortunately, there are those who believe the technology can do far more and fail to read the fine print. Used properly, like any piece of technology, it can be beneficial to all humanity; used improperly, it will lead to fatal consequences. You wouldn't let a five-year-old handle a loaded, unlocked firearm, would you? So you should not take your eyes off the road, even in a Tesla.

This video gives greater detail on the incident in Florida back on May 7th.
1st Tesla Driver Killed In Crash While Using Autopilot; NHTSA Investigating
 
I believe what started the media firestorm was a preliminary report by NHTSA on Wednesday or Thursday saying that the accident had happened and an investigation had been opened. Before that, the media was dead silent about this accident. It didn't even blip on any local person's blog as far as I know.

If we're calling that a "preliminary report," then yes, that happened on the 28th. I cited the PDF in an earlier post, and it's also available from the NHTSA website, so we can check their progress.
 
I should post this from the Washington Post since I haven't seen it here yet.

The Latest: Truck firm in Tesla crash had safety violations

While I agree fault lies with both drivers, the truck was not the sole contributing factor in this crash; the driver of the Tesla should have known better. He could have prevented the accident had he been paying attention to the situation Autopilot was steering into, anticipating the truck turning, and slowing down. Reaction time when not paying attention is between 5 and 8 seconds, whereas reaction time while driving attentively is typically 2-3 seconds. That difference might just have saved the Tesla driver's life; sadly, we will never know exactly what he was doing at the time of the incident.
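As a rough sketch of what that reaction-time gap means at 65 mph (using the 5-8 s vs. 2-3 s figures quoted above; illustrative only):

```python
# Extra closing distance at 65 mph implied by the reaction times quoted
# above (5-8 s when not paying attention vs. 2-3 s when driving attentively).

MPH_TO_FPS = 5280 / 3600
v = 65 * MPH_TO_FPS   # ~95 ft/s

gap_min_s = 5 - 3     # best inattentive vs. worst attentive
gap_max_s = 8 - 2     # worst inattentive vs. best attentive

print(f"Extra travel before reacting: {gap_min_s * v:.0f}-{gap_max_s * v:.0f} ft")
# Roughly 190-570 ft of extra closing distance -- plausibly the difference
# between braking in time and not braking at all.
```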