
Tesla's Autopilot needs to be shut down and NHTSA needs to do their due diligence

@imisphgh, according to Elon, Tesla's data -- which was provided to the NHTSA -- shows that AP-equipped cars are significantly safer than non-AP cars. You are welcome to question his statement. No one is claiming that AP or EAP is perfect. It will improve over time (as is obvious from what has happened over the past year). AP does appear to add safety to the vehicle.
 
Have you seen the "Drunk tesla" video? That new version is not a net positive.

Also, my point is that lives are needlessly being put at risk by the process. Change that and all is well. If I cure cancer but in the process give you other, lesser diseases that kill you, and it was ALL AVOIDABLE, have I done my due diligence? If a family member died and you learned it was avoidable, would you feel good about them being sacrificed for the greater good?

I don't need a drunk Tesla video. I've had my drunk AP1 for almost a year now :)

I see your point, but if the first step in curing cancer is giving you a cold that you can prevent by taking precautions (which is equal to holding the steering wheel and making sure the car doesn't do something stupid), then I'm all for it. But the key here is that I am all for it. Somebody else might jump into a Tesla whose owner agreed to the same T&Cs and unknowingly cause an accident (though the message on the dash reminds you)...
 
But can you expect every (and not being racist or nationalist) Chinese, Indian, Malaysian, etc. owner to read the entire manual...? I know you can expect a fighter pilot to know every knob in their plane, but can you expect the same of the driver (and not necessarily the owner of the car)? Can you expect a person who rents a Tesla on Turo to have the same mindset about Autopilot as the owner? This is not only about owners and fans of the technology...

Of course Tesla can expect every owner -- wherever they are located and whatever nationality they are -- to read the manual, and when Tesla sells a car they should advise that. I am sure that in the countries Tesla sells in, the manual is written in the national language.

You actually don't have to read the entire manual. There is a warning message you have to click through before it allows Autosteer usage. And as long as you heed that message and use it as the assistive feature that it is, there really isn't a safety issue.
 
I came here to try to be helpful. If my approach were safer and faster, shouldn't that matter? If the current process is counter-productive, how is that helpful? Wouldn't it be more productive to directly address my points rather than shoot the messenger? How is my suggested approach not the better one? If folks really care about the technology and people's well-being, wouldn't the fastest and safest method be in everyone's best interest? I am not an autopilot hater.

(Note - When SpaceX had its first code base reviewed by NASA, they rejected it. There were no reported defects, which is not possible if something that large and complex had been properly tested, and there was poor exception handling. Same thing here.)
A deliberately abrasive first post is not you "trying to be helpful". You came here with one intent: to spread FUD.
 
"Tesla's Autopilot needs to be shut down and NHTSA needs to do their due diligence"

What makes you think Tesla is not doing their due diligence? Let's see:
  1. They make it quite clear that the onus is on you, the driver.
  2. They require you to have hands on the wheel, all the time
  3. They are late, to the oft-voiced ire of AP2 owners, and yet they are rolling it out as a delicate balance between safety and reliability.
  4. And how the hell do you propose we get 6bn miles of road experience without rolling it out?
The safest place for a bird is in a nest. But that is not what birds are designed to do. IMO, let her fly! Educate the owners. If the owner does not read the manual (which is not like a 6pt light gray credit card agreement), or ignores the "one liner" OK button you have to hit to enable AP, it's the owner's fault.

Question for OP - do you own an AP2 car? What prompted you to post this incendiary headline as literally your first post?
 
Hmm... A thousand posts doesn't make anyone right, and by the same token, day-one members can have very useful comments or thoughts.

The knee-jerk part is a different topic. How do you know it's a knee-jerk reaction and not something the OP has been contemplating for months? Maybe he owns a Tesla with AP?

Bear in mind, I'm not saying he is right or wrong. But shooting down somebody's opinion this way is not constructive, in my opinion.
I know FUD when I see it.
 
Hang on a second - I get the whole not trusting autonomous driving and all that, but justifying your point by using an accident for which there was no data or evidence linking it to autonomous driving is plain bad science and poor analysis.
On that basis, every car manufacturer should closely examine every single accident involving their cars just in case the driver might have engaged something in the car that caused the accident.
The fire is completely unconnected with your calls for disabling driver assistance features.
That just makes your arguments sound like the guy who tries to prove issues with wheels falling off by using salvage pics in bogus safety complaints.

I didn't justify any points using the McCarthy/Speckman tragedy. I asked for there to be an investigation. And if you check my LinkedIn article on that, I listed questions to ask. (I do, however, find the no-data thing troubling.) It is very possible there is no there there. I am asking for due diligence to be done through a proper investigation.

Yes the fire is separate from autopilot issues.
 
Also, my point is that lives are needlessly being put at risk by the process. Change that and all is well. If I cure cancer but in the process give you other, lesser diseases that kill you, and it was ALL AVOIDABLE, have I done my due diligence? If a family member died and you learned it was avoidable, would you feel good about them being sacrificed for the greater good?
The problem is that your initial post used only conjecture to make claims without basis.
There is no evidence to support the sensational accusations you make.
Where is the evidence that shows NHTSA have not done due diligence?
Where is the evidence that Tesla are not handing over telemetry data?
Without evidence all of this is wild conjecture with no basis in fact.
 
You actually don't have to read the entire manual. There is a warning message you have to click through before it allows Autosteer usage. And as long as you heed that message and use it as the assistive feature that it is, there really isn't a safety issue.

There is a warning message that you click through as an owner (or whoever enables the technology in the first place). The next driver doesn't...
 
BTW, to clarify my stance: when a rocket takes off in Florida, it could potentially explode and take out a chunk of Florida with it. There is a reason why those launchpads are built where they are. They may be safe today, but when the first rocket went to the moon, it packed the punch of a hydrogen bomb in a machine built by the lowest bidder, with really no way to test it.

If we never took risks, we would never be on the moon.

Seriously, I am willing and more than happy to take these risks, if it means
  • when I am old and have poor eyesight, that I don't lose my freedom.
  • that we can have a car sharing economy
  • that we don't trust our lives to a texting-and-driving idiot's judgement
  • that we gain 2 extra hours a day in our lives
  • that we don't have to dedicate large swaths of prime real estate to parking
  • or spend time finding parking, or fighting the parking attendant who wants your keys
  • that there are immense defense implications that keep your boys and girls safe
  • and I don't have to spend time teaching my 16-year-old how to drive.
  • and I can say, "Go home car, I'm drunk".
And AP2 is growing up, but it ain't that bad. Seriously get over it.

PS: Robot overlords, if ur reading this, please kill me last. TIA.
 
Thank you for the question. I have commented in many places. I am not favoring anyone. If you look at my articles on LinkedIn, you will see most of the text is aimed at the industry as a whole. I do, however, realize I post mostly about Tesla. The reason for that is mostly the press coverage and how egregious I believe Tesla's approach to be. That includes Elon Musk's video where he clearly takes his hands off the wheel and acts exactly how he says his customers should not act in his cars. I assure you that if and when articles etc. come up on those other companies, I will post.
You are being quite hypocritical, because there are more driver-assist Mercedes than AP Teslas just based on the sheer worldwide sales of the S and E Class, and yet you admit most of your articles target Tesla because of the "media coverage". Why should you care about media coverage? If you actually cared, shouldn't the actual number of people using this tech be your real concern? Then most of your articles should be targeting Mercedes, not Tesla. Very hypocritical.
 
I didn't justify any points using the McCarthy/Speckman tragedy. I asked for there to be an investigation. And if you check my LinkedIn article on that, I listed questions to ask. (I do, however, find the no-data thing troubling.) It is very possible there is no there there. I am asking for due diligence to be done through a proper investigation.

Yes the fire is separate from autopilot issues.
Then I'm confused: you call for Autopilot to be withdrawn and for an investigation into accidents where Autopilot was in use - and then call for an investigation into an accident just in case Autopilot was in use.
Which is it?
The first one happens now - so no need to call for something that already happens
or
We need to investigate every accident involving a car that has Autopilot fitted, just in case there might be an issue.

The first one is reasonable - which is why it's already happening
The second is not reasonable
 
Hi everyone, did anyone notice that there weren't two lines in this video? I want to point out two things!

1. AP uses two lines to navigate!!!
2. Where the OP was driving there weren't two lines!

Trying to use AP where there aren't two lines is crazy: you should expect the car to try and find the lines!!!
 
Hi everyone, did anyone notice that there weren't two lines in this video? I want to point out two things!

1. AP uses two lines to navigate!!!
2. Where the OP was driving there weren't two lines!

Trying to use AP where there aren't two lines is crazy: you should expect the car to try and find the lines!!!

Turn that around. If there aren't two lines and AP NEEDS two lines, why does it let you activate it?
 
But can you expect every (and not being racist or nationalist) Chinese, Indian, Malaysian, etc. owner to read the entire manual...?

not the whole thing - i'm sure everyone understands how windshield wipers work and so on, but i certainly expect anyone to read the manual on a piece of technology they've never used before, particularly if it's something as inherently dangerous as driving an automobile. and if they can't be bothered, they don't deserve to have a driver's license.

You are not actually addressing my points. Shooting the messenger is not an objective nor helpful response.

i'm not shooting the messenger, i'm shooting down your premise.

if the car begins to do something anomalous, the driver who is paying attention and has a hand on the wheel has zero difficulty taking over. engaging autopilot does NOT absolve the driver of responsibility for operating the vehicle.
 
http://www.thedrive.com/news/7915/watch-this-tesla-autopilot-2-0-fail-terribly-in-a-model-s

That video shows the car making a quick move across the oncoming lane. It is beyond clear that Tesla's design and testing approach is reckless. Imagine if the car had been going faster. How does this not get caught in simulation or using simulators? How does this not get caught on test tracks? Using your customers as guinea pigs is bad enough, but now you are using them to check for massive system regressions? This video clearly shows that these cars have regressed so far that Tesla's entire process needs to be investigated, especially around regression testing.

NHTSA needs to quickly reverse their stance on Tesla's Autopilot, at least long enough to actually do their homework, look into these issues, and drive toward a solution that protects the public and makes sure the right things are happening at these companies. They need to do their due diligence, go talk to actual experts in ALL of these areas, and not be so wowed by Mr. Musk. That fox owns the hen house and is going to get those hens killed. Musk's mantra that he is statistically saving lives is not only wrong, but his system is putting the public in danger.

The Solution

  • Create a Scenario Matrix that cars will be officially tested to. Ensure this matrix covers a minimum set of scenarios that ensures driver and public safety. Gather folks from these companies, automakers, the insurance industry, traffic engineering, NHTSA, academia, and people who actually know how to create, design and test to a massive exception-handling matrix like this (most likely from DoD, NASA or Boeing). Ensure these standards are met before releasing any updates.
  • Bring that systems engineering experience into these companies. Commercial IT has never adopted most engineering best practices. Yeah, I know they make tons of money and really cool apps, games and websites. The fact is that commercial IT rarely even looks into exception handling (cases where things do not go as planned), let alone a massive effort like this. That includes identifying the exceptions, designing to them and testing them. They lack the experience in doing this and their tools don't support it.
  • Stop this massively avoidable process of using customers and the public as guinea pigs. Musk says he needs 6 BILLION miles of driving to collect the data he needs. Look at what that means. Innocent and trusting people are used not only to gather the first sets of data, most of which is for ACCIDENTS, but also to regression test after a system change. The reason for the 6 BILLION miles is that most of the data collected is repeat data. They have to drive billions of miles because they are randomly stumbling on the scenarios. The solution here is to use the matrix described above with simulation and simulators to do most of the discovery and testing. That can be augmented with test tracks and controlled public driving. (Note - By guinea pigs I mean the folks driving cars with Autopilot engaged. Gathering data when they are in control is prudent.)
  • Ensure the black box data is updated often enough to gather all the data for any event (many times a second), or make sure the black box can withstand any crash. In the McCarthy/Speckman tragedy Tesla said they have no data on the crash. That is inexcusable. Also pass regulations that give the proper government organizations access to that data while ensuring it cannot be tampered with before they do so.
  • Investigate the McCarthy/Speckman crash. Determine if that car contributed to the accident. That includes any Autopilot use as well as why that battery exploded and caused so much damage so fast. https://www.linkedin.com/pulse/how-much-responsibility-does-tesla-have-tragedy-michael-dekort
I am a former systems engineer, program and engineering manager for Lockheed Martin. There I worked on aircraft simulation and the Aegis Weapon System, and I was Software Engineering Manager for all of NORAD. I was also the whistleblower who raised the Deepwater Program issues - IEEE Xplore Full-Text PDF:
A couple things.

1) I'm not sure if you are aware, but AP2 uses different hardware and software than AP1 (other than perhaps some shared libraries). Specifically, the Mobileye hardware/software used in AP1 was dropped after a public break between Tesla and Mobileye. Now Tesla is using Nvidia's PX2 and their own custom software. Thus there is no "regression" to speak of given it's using new software.

2) Although Tesla's goal is to eventually reach full self-driving, AP2 currently remains a purely Level 2 assistive feature. They reinforce this in the manual, in a warning message when you activate the features, and with nags for the driver to keep their hands on the steering wheel. These measures were deemed sufficient by the NHTSA (see next point).

3) I encourage you to read NHTSA's full report on the Florida accident before making claims that they and Tesla did not do their due diligence. It's not too long a read, and while it deals with AP1, a lot of the same principles apply to AP2. The main concern NHTSA had was whether Tesla did enough to address driver attention and driver misuse when using the system. They found that Tesla had taken that into consideration, and the mitigating strategies were to NHTSA's satisfaction (including the increased hands-on-steering-wheel nagging, which carries over to AP2). Tesla even went beyond that and implemented a strike-out system for drivers who continually ignore nags.
https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.PDF

4) This was already pointed out by others, but other similar systems (even the latest state-of-the-art Mercedes one) will also veer into oncoming lanes when left to their own devices (see the quotes below and the linked article for details). This is not considered a safety issue by the NHTSA because, as an assistive feature, the driver is expected to intervene in such situations.

The Mercedes lets go of the yellow lane marking [crossing into the oncoming traffic lane], and we have to take control because the car makes no attempt to do so. The Tesla stays close to the yellow lane marker into the turn and handles the situation on its own. (Photo: Jamieson Pothecary / Autofil)


The Mercedes slides over the white lane markings and makes no attempt to fix the situation. There are plenty of lane markings available. The Tesla easily stays within the lane. (Photo: Jamieson Pothecary / Autofil)


The Mercedes yet again loses the lane and slides into oncoming traffic. The Tesla system is so trustworthy that we miss it immediately after returning the car. (Photo: Jamieson Pothecary / Autofil)


We come in fast. In this turn we let the Mercedes do what it wanted. It switched lanes and stayed in the lane for oncoming traffic. The Tesla is a little surprised, but pulls in and reduces speed. We do not need to grab the steering wheel. (Photo: Jamieson Pothecary / Autofil)
 
http://www.thedrive.com/news/7915/watch-this-tesla-autopilot-2-0-fail-terribly-in-a-model-s

That video shows the car making a quick move across the oncoming lane. It is beyond clear that Tesla's design and testing approach is reckless. Imagine if the car had been going faster. How does this not get caught in simulation or using simulators? How does this not get caught on test tracks? Using your customers as guinea pigs is bad enough, but now you are using them to check for massive system regressions? This video clearly shows that these cars have regressed so far that Tesla's entire process needs to be investigated, especially around regression testing.

This video is of a clearly labeled BETA version of AP running on AP 2.0 hardware. The issue here appears to be that when taking tight turns, the yellow line falls outside the headlights and the car is unable to see it. This is a combination of using Autopilot for a purpose it's not intended for (sharp turns and winding roads), using it at night, and using it on a vehicle with the AP 2.0 hardware.

NHTSA needs to quickly reverse their stance on Tesla's Autopilot, at least long enough to actually do their homework, look into these issues, and drive toward a solution that protects the public and makes sure the right things are happening at these companies. They need to do their due diligence, go talk to actual experts in ALL of these areas, and not be so wowed by Mr. Musk. That fox owns the hen house and is going to get those hens killed. Musk's mantra that he is statistically saving lives is not only wrong, but his system is putting the public in danger.

The Solution

  • Create a Scenario Matrix that cars will be officially tested to. Ensure this matrix covers a minimum set of scenarios that ensures driver and public safety. Gather folks from these companies, automakers, the insurance industry, traffic engineering, NHTSA, academia, and people who actually know how to create, design and test to a massive exception-handling matrix like this (most likely from DoD, NASA or Boeing). Ensure these standards are met before releasing any updates.
    Testing for something like this on a closed course is extremely difficult. There are hundreds of thousands of variables in the real world, roads that aren't built to spec, and drivers that don't behave as expected. Exhaustively testing this in simulation is just as hard, to the point that it would take years, if not decades, to do successfully. (A toy sketch of what a scenario-matrix release gate might look like appears at the end of this post.)
  • Bring that systems engineering experience into these companies. Commercial IT has never adopted most engineering best practices. Yeah, I know they make tons of money and really cool apps, games and websites. The fact is that commercial IT rarely even looks into exception handling (cases where things do not go as planned), let alone a massive effort like this. That includes identifying the exceptions, designing to them and testing them. They lack the experience in doing this and their tools don't support it.
    Tesla is not an IT company. They are an energy company, and they have a fantastic team working on this project. What you're seeing right now, this "regression," is due to new hardware being used, which will in the long run make AP much, much better. AP1 cars are still driving just as well as (if not better than) they always have, and AP2 cars will meet and exceed that capability over time.
  • Stop this massively avoidable process of using customers and the public as guinea pigs. Musk says he needs 6 BILLION miles of driving to collect the data he needs. Look at what that means. Innocent and trusting people are used not only to gather the first sets of data, most of which is for ACCIDENTS, but also to regression test after a system change. The reason for the 6 BILLION miles is that most of the data collected is repeat data. They have to drive billions of miles because they are randomly stumbling on the scenarios. The solution here is to use the matrix described above with simulation and simulators to do most of the discovery and testing. That can be augmented with test tracks and controlled public driving. (Note - By guinea pigs I mean the folks driving cars with Autopilot engaged. Gathering data when they are in control is prudent.)
    The majority of AP data collection happens from vehicles that are in "shadow" mode... They learn from the people driving them, and when the action performed by the driver is different from what AP would have done, a note is made and Tesla then reviews it to decide which was right (a rough sketch of that kind of disagreement logging appears at the end of this post). This is easily the best and fastest way to do this. No one is saying "You must use Autopilot for it to get better," and it is entirely up to the driver whether they want to participate. There are multiple warnings that must be agreed to before AP can even be enabled in a Tesla.
  • Ensure the black box data is updated often enough to gather all the data for any event (many times a second), or make sure the black box can withstand any crash. In the McCarthy/Speckman tragedy Tesla said they have no data on the crash. That is inexcusable. Also pass regulations that give the proper government organizations access to that data while ensuring it cannot be tampered with before they do so.
    I agree with this; data collection is necessary and needs to be done in such a way that the data cannot be lost (see the event-recorder sketch at the end of this post).
  • Investigate the McCarthy/Speckman crash. Determine if that car contributed to the accident. That includes any Autopilot use as well as why that battery exploded and caused so much damage so fast. https://www.linkedin.com/pulse/how-much-responsibility-does-tesla-have-tragedy-michael-dekort
I am a former systems engineer, program and engineering manager for Lockheed Martin. There I worked on aircraft simulation and the Aegis Weapon System, and I was Software Engineering Manager for all of NORAD. I was also the whistleblower who raised the Deepwater Program issues - IEEE Xplore Full-Text PDF:
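
To make the Scenario Matrix proposal above a bit more concrete, here is a minimal, hypothetical Python sketch of a release gate driven by such a matrix. The scenario names, fields, and pass criteria are invented for illustration only; they are not any regulator's or manufacturer's actual test set.

```python
# Hypothetical sketch of a release gate driven by a scenario matrix.
# Scenarios, fields, and pass criteria here are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    road_type: str   # e.g. "divided highway", "rural two-lane"
    conditions: str  # e.g. "night", "rain", "faded markings"
    must_pass: bool  # safety-critical scenarios block a release

@dataclass
class Result:
    scenario: Scenario
    passed: bool

SCENARIO_MATRIX = [
    Scenario("cross traffic at uncontrolled intersection", "rural two-lane", "day", True),
    Scenario("faded lane markings in a tight curve", "rural two-lane", "night", True),
    Scenario("construction zone with shifted lanes", "divided highway", "day", True),
]

def release_allowed(results):
    """Block a software update if any must-pass scenario failed or was never run."""
    outcome = {r.scenario.name: r.passed for r in results}
    ok = True
    for s in SCENARIO_MATRIX:
        if s.must_pass and not outcome.get(s.name, False):
            print(f"BLOCKED by scenario: {s.name}")
            ok = False
    return ok

if __name__ == "__main__":
    runs = [Result(s, True) for s in SCENARIO_MATRIX[:2]]  # third scenario never run
    print("release allowed:", release_allowed(runs))
```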
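
Likewise, for the "shadow mode" point: a rough, hypothetical sketch of disagreement logging, where a silently running planner is compared against what the human driver actually did and only the divergent samples are kept for review. The field names and thresholds are assumptions for illustration, not Tesla's actual pipeline.

```python
# Hypothetical sketch of "shadow mode" disagreement logging. This is not
# Tesla's real pipeline; it only illustrates the idea described above: the
# planner runs silently, the human drives, and only divergences are kept.

from dataclasses import dataclass

@dataclass
class Sample:
    t: float                # seconds
    human_steer_deg: float  # what the driver actually commanded
    ap_steer_deg: float     # what the inactive planner would have commanded
    human_brake: bool
    ap_brake: bool

STEER_DISAGREE_DEG = 5.0    # illustrative threshold, not a real calibration

def is_disagreement(s):
    """A sample is interesting if the silent planner and the driver diverge."""
    return (abs(s.human_steer_deg - s.ap_steer_deg) > STEER_DISAGREE_DEG
            or s.human_brake != s.ap_brake)

def collect(samples):
    """Keep only divergent samples for upload/review, instead of raw miles."""
    return [s for s in samples if is_disagreement(s)]

if __name__ == "__main__":
    log = [
        Sample(0.0, 1.0, 1.2, False, False),  # agreement: discarded
        Sample(0.1, 1.0, 9.5, False, False),  # planner would have swerved: kept
        Sample(0.2, 0.0, 0.0, True, False),   # driver braked, planner would not: kept
    ]
    for s in collect(log):
        print(f"disagreement at t={s.t:.1f}s")
```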
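
And for the black-box point: a minimal sketch of the usual event-data-recorder pattern, sampling at a fixed high rate into a ring buffer and freezing it when a crash trigger fires. The sampling rate, buffer length, and trigger are illustrative assumptions only, not any vehicle's actual recorder design.

```python
# Hypothetical event-data-recorder sketch: sample at a fixed high rate into a
# ring buffer and freeze a copy when a crash trigger fires, so the seconds
# leading up to the event survive. Rates and durations are illustrative only.

from collections import deque

SAMPLE_HZ = 50           # "many times a second"; real units vary
PRE_EVENT_SECONDS = 30   # how much history to retain before a trigger

class EventRecorder:
    def __init__(self):
        self.buffer = deque(maxlen=SAMPLE_HZ * PRE_EVENT_SECONDS)

    def record(self, sample):
        """Called every 1/SAMPLE_HZ seconds with speed, steering, braking, etc."""
        self.buffer.append(sample)

    def on_trigger(self, reason):
        """Freeze a copy of the pre-event history the moment a crash is detected."""
        snapshot = list(self.buffer)
        # A real unit would write this to crash-survivable storage here.
        print(f"trigger: {reason}; froze {len(snapshot)} samples")
        return snapshot

if __name__ == "__main__":
    edr = EventRecorder()
    for i in range(SAMPLE_HZ * 5):  # five seconds of driving
        edr.record({"t": i / SAMPLE_HZ, "speed_mph": 40})
    edr.on_trigger("airbag deployment")
```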