Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Fatal autopilot crash, NHTSA investigating...

So this truck was perpendicular coming from the opposing side? And the Autopilot drove under the trailer portion as if it were a false positive such as an overhead sign?

It was my understanding that Autopilot does not stop the vehicle for stationary objects in the road above a certain speed. Is that not the case?

Not putting any blame anywhere, but I haven't liked the idea of any halfway autopilot (beta or otherwise) on roads. Not that I don't trust the hardware; I just don't trust the humans to adapt to it in a safe fashion.


The current Autopilot and collision system is not designed to recognize perpendicular or oncoming traffic reliably. It's not necessarily a hardware limitation, and maybe can be addressed in future software updates, but the fact of the matter is, do not rely on the current AP implementation to recognize complex traffic scenarios like oncoming traffic veering into your lane or a vehicle making an unsafe turn (or you making an unsafe turn).
 
  • Informative
Reactions: Lex
A design flaw that is going to cost Tesla a lot of money and tarnish their reputation for a long time to come. You simply do not release a system under these conditions. You just don't.
Are you considering the fact that this is the first fatality while using AP in the 9 months since it came out, and that the technology has prevented other serious accidents during that time? Also, I have no idea if this driver was using it responsibly or not, but most experienced AP users know how flaky it can be and drive accordingly. I'm not saying you don't make some valid points, but there is a benefit to having released AP in its current form beyond Tesla's own bank account and stock price, which, by the way, is tanking right now. In any case this is such a sad situation for the guy's family.
 
Those who say he would have been alive without AP could just as easily argue he would have been alive if he had not been driving at all.

Those kinds of generic accusations are meaningless.

A car, by the very nature that it can travel at high speed, can kill. But we all use one every day because, used correctly, it can do a lot of good. So the positives far outweigh the negatives.

Same with AP. For thousands of S owners it is the best stress reducer any manufacturer has ever come up with - if used with some basic attention it can save lives and it has done so.
 
A design flaw that is going to cost Tesla a lot of money and tarnish their reputation for a long time to come. You simply do not release a system under these conditions. You just don't.

I think you need to relax and pump the brakes a bit... Both of your posts are highly inflammatory and grossly opportunistic. This was not a failure of AP; this was a failure of both the truck driver and the Model S driver, more specifically the truck driver.

Can AP be improved over time to try to account for freak occurrences like this? Yes. But to suggest that the entire functionality should be abandoned because of one freak accident is absurd.

Tesla, and by extension TSLA, will be fine. While I'll agree it's a bit early for me to draw this conclusion, I don't see anything that would suggest the NHTSA would demand Tesla disable AP, which is the only thing that would really hurt Tesla/TSLA.

If anything this is a stark reminder to those of you who openly admit to driving hands off and not paying attention while AP is engaged that such inattentiveness could cost you your life.

Jeff
 
Mercedes' decision to go all-in with hardware seems like a very smart move. A more expensive and much more conservative and robust approach than Tesla's, for sure, but the limitations of Tesla's single radar and single camera are just too great to ignore.




[Attached image: 23-Mercedes-Benz-Intelligent-Drive.jpg]


It's worth pointing out that Mercedes has had hardware very similar to this on existing vehicles, and it did not detect oncoming cars or even side-swiping collisions. Left-turn assist (and oncoming-car collision avoidance) is only starting to be introduced in the 2017 E-Class as well as the latest iteration of the Audi Q7/A4 piloted-assist packages, so there's more to it than lacking hardware.
 
That is probably not the explanation. The ultrasonics can only detect objects out to about 16 ft, and their relatively slow response means that at highway speeds the AEB cannot engage in time to prevent a collision. It might have activated, but too late to prevent the crash.
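A rough sanity check on those numbers (a back-of-the-envelope sketch with assumed values, not Tesla specifications) shows why a 16 ft sensor range is hopeless for highway-speed AEB:

```python
# Back-of-the-envelope check with assumed values (65 mph, 0.8 g braking);
# the 16 ft ultrasonic range is the figure cited above, not a Tesla spec.

FT_PER_M = 3.281

speed_mph = 65                                # assumed highway speed
speed_fps = speed_mph * 5280 / 3600           # ~95 ft/s
detect_range_ft = 16                          # ultrasonic range cited above

# Time between first possible ultrasonic detection and impact:
time_to_impact_s = detect_range_ft / speed_fps

# Distance needed to stop from 65 mph at a hard 0.8 g deceleration:
speed_mps = speed_mph * 1609.344 / 3600       # ~29 m/s
decel_mps2 = 0.8 * 9.81
stop_dist_ft = speed_mps ** 2 / (2 * decel_mps2) * FT_PER_M

print(f"warning time from ultrasonics: {time_to_impact_s:.2f} s")   # ~0.17 s
print(f"distance needed to stop:       {stop_dist_ft:.0f} ft")      # ~176 ft
```

Even ignoring sensor latency entirely, roughly 0.17 s of warning against the ~176 ft needed to stop means ultrasonic-triggered braking at highway speed could at best scrub off a trivial amount of speed before impact.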

There are a number of reasons why the Tesla driver might have failed to notice the truck in time to prevent a collision. I do not think that the AP being activated is the reason for the crash.

@MorrisonHiker posted upthread, quote: "The FHP said the tractor-trailer was traveling west on US 27A in the left turn lane toward 140th Court. Brown’s car was headed east in the outside lane of U.S. 27A.
When the truck made a left turn onto NE 140th Court in front of the car, the car’s roof struck the underside of the trailer as it passed under the trailer. The car continued to travel east on U.S. 27A until it left the roadway on the south shoulder and struck a fence."

That description makes it sound like the truck driver did not see the oncoming Tesla and turned directly in front of it. We do not know if the Tesla driver attempted to brake or if by the time he saw the truck he did not have time to apply the brakes or if he never saw the truck at all. The car's logs will show if the brakes were applied by the driver or the AEB before impact.

Whatever the reason for the crash, all Tesla AP owners should know that they have to remain alert and aware at all times while using AP. Using AP on a road with cross traffic, even a divided road, has always seemed risky to me.
The investigation by the highway patrol will be telling; I think the length of, or lack of, skid marks will tell a lot about the driver's reaction or lack thereof.
 
As an engineer, I think blaming the Autopilot's failure to see the truck/trailer on its being white against a bright sky sounds more like an excuse to me. There is no way the entire truck/trailer is all white, and there are certainly components of the truck that aren't white, such as the tires. The Mobileye system should have detected it visually. Also, the long-range radar isn't doing the job here. It should have detected the truck coming into the car's path long before impact, unless the truck was going through the intersection perpendicularly at very high speed.
Whether or not the technology failed, while important, ignores the fact that the driver was apparently inattentive to what was happening.
 
Elon even Tweeted the video:

Elon Musk on Twitter
This really surprises me, as someone on Reddit pointed out that the video description says:
I actually wasn't watching that direction...

Which goes directly against...

Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.

Because that video certainly calls into question Tesla's statement that a driver could not see the truck, and raises the question of whether the driver was actually paying attention and whether the average driver driving down the road would have seen the truck.
 
I know that but summons relies on the same hardware and it didn't recognize the trailer without colour or bright lights being an issue, as far as I am aware. I'm just trying to figure out if Tesla's statement about colour and bright lights affecting AP is relevant here.

Wrong..... Summon uses ultrasonics only; Autopilot uses camera and radar mainly, with only very limited use of the ultrasonics (side to side).
 
  • Informative
Reactions: Canuck
Are you considering the fact that this is the first fatality while using AP in the 9 months since it came out, and that the technology has prevented other serious accidents during that time? Also, I have no idea if this driver was using it responsibly or not, but most experienced AP users know how flaky it can be and drive accordingly. I'm not saying you don't make some valid points, but there is a benefit to having released AP in its current form beyond Tesla's own bank account and stock price, which, by the way, is tanking right now. In any case this is such a sad situation for the guy's family.

Those are all good arguments for Tesla to make with NHTSA. I look forward to seeing their reaction, because they will undoubtedly need to figure out how to handle these new technologies from other manufacturers as well.
 
If NHTSA is investigating this, I hope they finally realize that the federal requirements for tractor trailers fall woefully short of those mandated by EU authorities. The anti-underride design features the EU has legislated for all tractor trailers there would likely have significantly mitigated both this and the previously mentioned 'summon' tractor-trailer incident. Heck, who knows, such an underride structure on this tractor trailer may even have caused the radar to respond differently. Personally, if Tesla can replicate this incident, I would like to see them try with both an underride-protected and a non-underride-protected tractor trailer.
Just thinking about the lack of underride protection on our tractor trailers scares the life out of me whenever I come up behind or overtake one on the freeway.

Get on it NHTSA!!! These tractor trailers need to come up to world class safety standards!
 
This really surprises me, as someone on Reddit pointed out that the video description says:


Which goes directly against...



Because that video certainly calls into question Tesla's statement that a driver could not see the truck, and raises the question of whether the driver was actually paying attention and whether the average driver driving down the road would have seen the truck.

That is not video of this incident.
 
The current Autopilot and collision system is not designed to recognize perpendicular or oncoming traffic reliably. It's not necessarily a hardware limitation, and maybe can be addressed in future software updates, but the fact of the matter is, do not rely on the current AP implementation to recognize complex traffic scenarios like oncoming traffic veering into your lane or a vehicle making an unsafe turn (or you making an unsafe turn).
That was my understanding after seeing the video of a Model S/X smacking into a stationary vehicle in the road while going 50ish. So if this truck were a brick wall I don't think the outcome would have been any different without driver intervention.

To be honest, that's the kind of thing I don't like about autopilot. Yes, the hardware can function wonderfully and save lives. But the false sense of security can be dangerous in and of itself. These are human beings that are easily confused and what we're dropping in their laps is an entirely new driving dynamic.

Is the number of lives saved being far greater than lives lost to new types of accidents going to be enough to keep this tech rolling out in this fashion? Doesn't seem likely in today's America.
 
  • Like
Reactions: neroden and ggnykk
I've said this from the very beginning that launching Autopilot as a "beta" software was a bad move. In fact, you can now hear Tesla's disclaimer being emphasized loud and clear in their blog post, which in my opinion is a cowardly move. I have bad news for Tesla - hiding behind a beta software disclaimer is not going to go far with NHTSA. I am saddened by what happened, but it was inevitable. The way Tesla released and implemented Autopilot was a mistake, in my humble opinion, and the NHTSA might force Tesla to pull the software which would be devastating for Tesla and Elon Musk's way of doing things.
Every time someone has a close call -- or worse -- with AP you post the same thing. Your opinion, fair enough. But Tesla has been very clear that while using AP the driver is fully responsible for the operation of their car. That is not my opinion, that is a fact. And that is what the NHTSA will conclude. In the crash being discussed, a semi-truck made a left turn in front of the Tesla driver, and the driver either failed to react or was unable to react in time. This sort of tragic crash happens many, many times each day in the USA and around the world.

The Tesla's emergency braking should have kicked in, but it did not. Emergency braking should depend on the forward radar, not the camera, to engage braking when it detects an object in the car's path. The driver should not have needed to see the semi, that's why emergency braking exists!
So your position is clearly that Tesla's AEB should be perfect, always work in any situation, and prevent all crashes. You do not live in the real world.

And apparently you are not aware that the radar does not act on objects ahead that appear to be stationary, such as the flat side of a truck sitting at a right angle to the car's direction of travel.
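To illustrate that point, here is a deliberately simplified sketch of a generic Doppler-based clutter filter (a hypothetical illustration, not Tesla's actual signal processing). A radar that discards returns whose inferred ground speed is near zero, to avoid phantom braking for bridges and overhead signs, discards a trailer broadside across the lane for exactly the same reason:

```python
# Simplified, hypothetical moving-target filter. Real automotive radar
# processing is far more involved; this only shows why a target that is
# stationary over the ground looks identical to roadside clutter.

EGO_SPEED = 29.0  # m/s, ~65 mph (assumed)

def ground_speed(radial_closing_speed_mps: float) -> float:
    """Target's own speed along the line of sight: the measured
    closing speed minus our own contribution (ego speed)."""
    return radial_closing_speed_mps - EGO_SPEED

def keep_target(radial_closing_speed_mps: float, threshold: float = 2.0) -> bool:
    """Keep only returns whose inferred ground speed exceeds the threshold."""
    return abs(ground_speed(radial_closing_speed_mps)) > threshold

# A slower car ahead (doing ~17 m/s) closes on us at ~12 m/s -> kept.
print(keep_target(12.0))   # True

# A bridge, an overhead sign -- or a trailer broadside across the lane --
# closes at exactly our own speed -> filtered out as stationary clutter.
print(keep_target(29.0))   # False
```

The design trade-off is the whole problem: lower the threshold and the car brakes for every sign and manhole cover; keep it and anything motionless across the lane is invisible to the radar track.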

(from a friend of @nikielizabeth) We're pretty certain, he had it on auto-pilot while working on his laptop and didn't see the semi that pulled out.
I hope that speculation does not in fact describe what happened. Tesla will be able to tell from the logs if either the driver or the car activated the brakes before impact.

And according to Tesla, quote: "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."

The driver, who is fully responsible for the operation of his car at all times, failed to apply the brakes. At all. He either failed to see the truck, or the truck did not see his car and turned directly in front of him leaving no time for him to even react.
 
  • Like
Reactions: pgiralt