Fatal autopilot crash, NHTSA investigating...

Are you people all blind, or just such big Tesla fanboys that the first thing you write has to be negative and question the company?

Who pays them? Really? Who pays any research company? Obviously, the people and companies that hire them to do statistical work.
 
  • Disagree
Reactions: imherkimer
Who are these people, are they legit, and who funds them? Not much info on their website, that I can see.

According to the Forbes article, the guy who runs this outfit (Randy Whitfield) often works with plaintiff's attorneys. Striking Assertion: Feds Say Tesla Autopilot Reduces Crashes, But Won't Provide Data

Edit: a quick Google search pulled up an opinion of a judge in Pennsylvania who is very critical of Whitfield's selective use of data in an expert report he submitted in a class action. http://www.courts.phila.gov/PDF/opinions/civiltrial/990603235.pdf If this is representative of his work, he sounds like a hack.
 
  • Like
Reactions: bhzmark and McRat
According to the Forbes article, the guy often works with plaintiff's attorneys. Striking Assertion: Feds Say Tesla Autopilot Reduces Crashes, But Won't Provide Data

Neither of the two owners has any formal statistical, mathematical, or engineering education or experience. They have very limited work experience, none of it in the automotive industry. Cross-examination in court would produce an epic fail the moment their education and experience came up. I've been on the stand as an expert witness for automotive products, and it's the first thing you get cross-examined on.
 
This was just posted on the Tesla forum

Major Auto Pilot Safety Issue
Submitted by Meliscu on June 29, 2017
All,

I have (had) a 2015 P90D. It is now totalled and I wanted to share my experience with Autopilot.

Many of you are aware of the death that occurred last year when a Tesla drove underneath a semi. It was explained as having to do with the angle of the car on a hill, the location of the sun . . . the car seeing under the truck . . .

Well, I have recently had two experiences, both called into Tesla, where the sensors did not detect enormous trucks that were stationary, perpendicular to the lane of travel, and entirely blocking the road. In the first instance, Autopilot would have driven me directly into a garbage truck. A GARBAGE truck! Autopilot simply did not realize it should react to a garbage truck blocking the road. In the second instance, a box truck misjudged its turning radius and stopped perpendicular to my lane. Again, Autopilot did not recognize that it should react to this enormous obstacle and nearly drove straight into the cab of the truck (I reacted at the last moment and managed to avoid the cab, driving into the cargo area instead).

Yes, as drivers of these vehicles we maintain responsibility for what they are doing. That said, we place some degree of confidence in the software that powers our vehicles being generally in good order and able to perform in a basic sense . . . which is to say, recognize major obstacles and slow to a stop.

If you have or use autopilot, please be aware that it appears to have a major flaw. It DOES NOT SEE perpendicular trucks blocking the lane of travel. Be careful. Do not put any reliance on the vehicle to be aware of these types of risks. I did. My vehicle is now totalled.
 
FACT #1: autopilot and nearly all AEB systems were trained to recognize the rear ends of vehicles.

Well, I have recently had two experiences, both called into Tesla, where the sensors did not detect enormous trucks that were stationary, perpendicular to the lane of travel, and entirely blocking the road.
See fact 1.

and stopped perpendicular to my lane.
See fact 1.

You don't need to call these into Tesla, because the system was behaving as designed.
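
To make fact 1 concrete, here is a toy sketch (made-up labels and thresholds, nothing from Tesla's actual software) of how an AEB pipeline that only recognizes vehicle rear ends ends up ignoring a truck parked across the lane:

```python
# Toy illustration only: an AEB chain that reacts solely to objects its vision
# system was trained on (vehicle rear ends) never brakes for a side-on truck.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # what the camera classifier thinks it is seeing
    confidence: float
    distance_m: float

# Hypothetical label set: the classifier only knows rear-end views.
BRAKEWORTHY_LABELS = {"car_rear", "truck_rear", "motorcycle_rear"}

def aeb_should_brake(detections, speed_mps):
    """Brake only for confident, close detections of known rear-end classes."""
    for d in detections:
        if d.label in BRAKEWORTHY_LABELS and d.confidence > 0.8:
            time_to_collision_s = d.distance_m / max(speed_mps, 0.1)
            if time_to_collision_s < 2.0:   # illustrative threshold
                return True
    return False

# A garbage truck parked across the lane shows up as an unfamiliar side view,
# so it is labeled "unknown" and the system never brakes.
crossing_truck = [Detection(label="unknown", confidence=0.3, distance_m=30.0)]
print(aeb_should_brake(crossing_truck, speed_mps=25.0))   # False -> no braking
```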

You should post my response... I'm not a member.
 
Wow, so that person had one event where the car didn't see a stopped truck, then hit another truck later, apparently still overly trusting AP. Fool me once....

I did finally get an AP1 loaner over the weekend. I loved TACC, but Autosteer was pretty squirrelly even on my straight well marked freeways to/from work. Hunting for the lane center over very gradual crests, diving for exits, etc. I had to take over a couple times when the car wandered closer to an adjacent vehicle than I was comfortable with. It amazes me that anyone is comfortable enough with it to take their hands off the wheel let alone eyes off the road.
 
I believe Tesla's warnings about Autopilot 1 say that it should not be used on roads with grade-level crossings. Autopilot does not see crossing traffic, among other things. I would never expect it to see an object crosswise because of this limitation. It does not see oncoming traffic either. This is why the disclaimer is in the manual.
 
  • Like
Reactions: bhzmark and JeffK
Wow, so that person had one event where the car didn't see a stopped truck, then hit another truck later, apparently still overly trusting AP. Fool me once....

I did finally get an AP1 loaner over the weekend. I loved TACC, but Autosteer was pretty squirrelly even on my straight well marked freeways to/from work. Hunting for the lane center over very gradual crests, diving for exits, etc. I had to take over a couple times when the car wandered closer to an adjacent vehicle than I was comfortable with. It amazes me that anyone is comfortable enough with it to take their hands off the wheel let alone eyes off the road.


Maybe you are driving a Mercedes DrivePilot? Because my S, in 25k+ miles of AP1 driving, hasn't done any of that, except that it has difficulty on crests, which is to be expected anyway. On any section of roadway with lane markings it can see properly (not Botts' dots, or markings that were erased and redone in construction zones, etc.), I can actually go to sleep if the damn thing doesn't nag me... it is that damn good.
 
I have had adaptive cruise since 2006 and gone through many iterations/generations of it, so I agree that certain stopped traffic, and especially perpendicular traffic, is outside the abilities of these Level 2 systems. Both radar technology in general and EyeQ3-based camera systems, which are very common in the current generation, are not optimized for that scenario. Specifically, EyeQ3 does not support it, even though it supports many other things with a high degree of confidence.
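
As a rough sketch of the kind of filtering I mean (made-up numbers and field names, not any manufacturer's actual code), radar-based systems typically deprioritize returns that have never been seen moving, precisely to avoid phantom braking on overhead signs and bridges:

```python
# Toy sketch: a radar target selector that drops never-seen-moving returns.
def select_acc_target(radar_targets, ego_speed_mps):
    """Pick the target ACC should follow or brake for, if any.

    Each target is a dict with 'range_m', 'relative_speed_mps', 'was_moving'.
    Absolute target speed = ego speed + relative speed.
    """
    candidates = []
    for t in radar_targets:
        target_abs_speed = ego_speed_mps + t["relative_speed_mps"]
        stationary = abs(target_abs_speed) < 0.5
        if stationary and not t["was_moving"]:
            # Never tracked as moving: could be a sign gantry, a parked car,
            # or a truck stopped across the lane -- it gets filtered out.
            continue
        candidates.append(t)
    return min(candidates, key=lambda t: t["range_m"]) if candidates else None

# A truck stopped perpendicular to the lane never registered as moving,
# so it is dropped and the system simply keeps the set speed.
truck_across_lane = {"range_m": 40.0, "relative_speed_mps": -27.0, "was_moving": False}
print(select_acc_target([truck_across_lane], ego_speed_mps=27.0))   # None
```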

I understand how that might not make sense to a layman, but that particular limitation is not something I would blame Tesla for. It is something the driver needs to keep an eye out for, as it is inherent to this stage of the technology.

That said, two things:

Would some other car on the market have better automatic emergency braking? Possibly.

But the bigger problem with Autopilot - one that MobilEye picked on too - is that, unlike many other driving aids, the earlier iterations of Autopilot especially could lull you into a false sense of security by driving the car for extended periods without driver attention. A system with such a limitation is simply dangerous when doing that, as evidenced by the fatal AP1 case. I think forcing the driver to keep an eye on the road in some manner is thus paramount with such a limited system. (I wish there were some other way than having to yank the steering wheel at times, though.)
 
The NHTSA's claim is ridiculous, that's why they won't release the data.
Absolutely not ridiculous. The NHTSA study mirrors the IIHS findings, which covered nearly every other auto manufacturer with AEB systems.
Front crash prevention cuts rear-enders

The 40% reduction was for front-end collisions. Tesla is using the same technology for AEB, just newer than what's referenced in the IIHS study.
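
For reference, the ~40% is just arithmetic on the crash rates in the ODI report (roughly 1.3 airbag-deployment crashes per million miles before Autosteer vs. 0.8 after, quoting those from memory):

```python
# Back-of-the-envelope check of the NHTSA figure.
before = 1.3   # airbag-deployment crashes per million miles, pre-Autosteer
after = 0.8    # airbag-deployment crashes per million miles, post-Autosteer
reduction = (before - after) / before
print(f"{reduction:.0%}")   # ~38%, reported as "almost 40%"
```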
 
Maybe you are driving a Mercedes DrivePilot? Because my S, in 25k+ miles of AP1 driving, hasn't done any of that, except that it has difficulty on crests, which is to be expected anyway. On any section of roadway with lane markings it can see properly (not Botts' dots, or markings that were erased and redone in construction zones, etc.), I can actually go to sleep if the damn thing doesn't nag me... it is that damn good.

Well, you must have different firmware than the P85D loaner I had. I have seen complaints here that AP1 has been getting worse, not better, in the later releases.

Here is the road I was using it on. I thought this would be a good enough road for it, but maybe I was wrong. This is my daily commute:

Screenshot_20170630-101048.png


Either way, I came to the conclusion that I probably won't be using a whole lot of Autosteer when we get our P85D delivered. Too stressful for me, personally.
 
I believe Tesla's warnings about Autopilot 1 say that it should not be used on roads with grade-level crossings. Autopilot does not see crossing traffic, among other things. I would never expect it to see an object crosswise because of this limitation. It does not see oncoming traffic either. This is why the disclaimer is in the manual.

I read through the entire manual in the car before using either AP or TACC on that loaner. I did not see any specific warnings about roads with grade-level crossings. It says you can use Autosteer on highways AND limited-access roads. The "and" is key, since I would consider a road like the one the Florida crash occurred on to be a highway. Highway 84 in TX is the road I drove on the most: entrance and exit ramps and overpasses, but grade-level crossings as well. It's a pretty common configuration in TX.

TACC does warn that it can't see stopped vehicles, especially when going over 50 mph.

20170625_145452.jpg


20170625_145251.jpg
 
The NHTSA's claim is ridiculous, that's why they won't release the data.

Most FCW/AEB systems reduce rear-end crashes by about 40%. When these findings were first released, I remember quite a few Tesla fans being upset/confused that Tesla's system (which at the time was one of the best around) was only on par with the accident-reduction rates of the other systems. Not sure why these lawyers think that 40% is outrageous or BS, when it is perfectly in line with other manufacturers' systems.
 
Most FCW/AEB systems reduce rear-end crashes by about 40%. When these findings were first released, I remember quite a few Tesla fans being upset/confused that Tesla's system (which at the time was one of the best around) was only on par with the accident-reduction rates of the other systems. Not sure why these lawyers think that 40% is outrageous or BS, when it is perfectly in line with other manufacturers' systems.

The point you are making above is incorrect, although the mistake is understandable since the same misinformation gets repeated over and over again.

The study that the NHTSA relied on to show that AEB plus FCW reduce a driver's chances of rear-ending another car by 40% actually found no statistically significant difference in overall accidents with AEB and FCW. The 40% number in that context covered only one type of rear-end collision: it excluded all accidents that are not rear-end collisions, and it counted only rear-end collisions in which the equipped driver struck another car, excluding accidents where the driver with FCW/AEB was rear-ended.

In fact, the study found that there was only a 6% decrease in overall accidents with AEB/FCW, which was so small it did not even reach the level of statistical significance. http://orfe.princeton.edu/~alaink/S...s/IIHS-CicchinoEffectivenessOfCWS-Jan2016.pdf (see pages 1 and 15.)
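
For anyone wondering what "not statistically significant" means in practice, here is a toy illustration with made-up counts (the study's actual exposure-adjusted numbers are in the linked paper):

```python
# Hypothetical counts only: two comparable groups, 500 crashes without the
# system vs. 470 with it, is a 6% reduction -- but well within chance.
import math

crashes_without = 500
crashes_with = 470          # 6% fewer

# Simple z-test for comparing two Poisson counts with equal exposure.
z = (crashes_without - crashes_with) / math.sqrt(crashes_without + crashes_with)
print(round(z, 2))          # ~0.96, far below the ~1.96 needed for p < 0.05
```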

In contrast, the 40% reduction in airbag-deployment accidents once Tesla enabled AP was across all such accidents, which makes it a massive reduction.

Comparing a rear-end-only reduction to a reduction in total collisions is apples to oranges -- totally inaccurate and confusing.
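
A quick back-of-the-envelope example (the 20% share below is made up, not taken from the Cicchino paper) shows how a headline 40% cut in one crash category can shrink to single digits overall:

```python
# Toy numbers: a 40% cut in one category is a small dent in total crashes
# when that category is only a modest share of the total.
total_crashes_baseline = 1000
striking_rear_end_share = 0.20                                        # assumed

rear_end = total_crashes_baseline * striking_rear_end_share           # 200
other    = total_crashes_baseline * (1 - striking_rear_end_share)     # 800

rear_end_with_aeb = rear_end * (1 - 0.40)        # 40% fewer of this one type
overall_with_aeb  = rear_end_with_aeb + other

overall_reduction = 1 - overall_with_aeb / total_crashes_baseline
print(f"{overall_reduction:.0%}")   # 8% overall, despite the headline 40%
```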

Side note: It turns out that when incorrect information like this gets repeated, people tend to believe it even when it is totally wrong. This is known as the illusory truth effect. Illusory truth effect - Wikipedia
 
  • Like
Reactions: bhzmark
This was just posted on the Tesla forum

Major Auto Pilot Safety Issue
Submitted by Meliscu on June 29, 2017

[...]

If you have or use autopilot, please be aware that it appears to have a major flaw. It DOES NOT SEE perpendicular trucks blocking the lane of travel. Be careful. Do not put any reliance on the vehicle to be aware of these types of risks. I did. My vehicle is now totalled.
People like this are the reason we can't have nice things.