
Tesla's Autopilot needs to be shut down and NHTSA needs to do their due diligence


That video shows the car making a quick move across the oncoming lane. It is beyond clear that Tesla's design and testing approach is reckless. Imagine if the car had been going faster. How does this not get caught in simulation or with simulators? How does this not get caught on test tracks? Using your customers as guinea pigs is bad enough, but now you are using them to check for massive system regressions? This video clearly shows that these cars have regressed so far that Tesla's entire process needs to be investigated, especially around regression testing.

NHTSA needs to quickly reverse its stance on Tesla's Autopilot, at least long enough to actually do its homework, look into these issues and drive toward a solution that protects the public and makes sure the right things are happening at these companies. They need to do their due diligence, go talk to actual experts in ALL of these areas and not be so wowed by Mr. Musk. That fox owns the hen house and is going to get those hens killed. Musk's mantra that he is statistically saving lives is not only wrong; his system is putting the public in danger.

The Solution

  • Create a scenario matrix that cars will be officially tested against. Ensure this matrix covers a minimum set of scenarios that protect driver and public safety. Gather folks from these companies, automakers, the insurance industry, traffic engineering, NHTSA, academia and people who actually know how to create, design and test to a massive exception-handling matrix like this, most likely from the DoD, NASA or Boeing. Ensure these standards are met before releasing any updates. (A rough sketch of what such a release gate could look like follows this list.)
  • Bring that systems engineering experience into these companies. Commercial IT has never adopted most engineering best practices. Yeah, I know they make tons of money and really cool apps, games and websites. The fact is that commercial IT rarely even looks into exception handling (cases where things do not go as planned), let alone a massive effort like this. That includes identifying the exceptions, designing to them and testing them. They lack the experience in doing this and their tools don't support it.
  • Stop this massively avoidable process of using customers and the public as guinea pigs. Musk says he needs 6 BILLION miles of it to collect the data he needs. Look at what that means. Innocent and trusting people are used not only to gather the first sets of data, most of which is for ACCIDENTS, but then also to regression test after every system change. The reason for the 6 BILLION miles is that most of the data collected is repeat data. They have to drive billions of miles because they are randomly stumbling on the scenarios. The solution here is to use the matrix described above with simulation and simulators to do most of the discovery and testing. That can be augmented with test tracks and controlled public driving. (Note: by guinea pigs I mean the folks driving cars with Autopilot engaged. Gathering data when they are in control is prudent.)
  • Ensure the black box data is recorded often enough to capture all the data for any event (many times a second), or make sure the black box can withstand any crash. In the McCarthy/Speckman tragedy Tesla said they have no data on the crash. That is inexcusable. Also pass regulations that give the proper government organizations access to that data while ensuring it cannot be tampered with before they do so.
  • Investigate the McCarthy/Speckman crash. Determine whether that car contributed to the accident. That includes any Autopilot use as well as why the battery exploded and caused so much damage so fast. https://www.linkedin.com/pulse/how-much-responsibility-does-tesla-have-tragedy-michael-dekort
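To make the first bullet concrete, here is a minimal sketch of what a scenario-matrix release gate could look like in code. Everything in it, the scenario names, the thresholds, the run_scenario stub, is a hypothetical illustration and not anything Tesla or NHTSA actually uses; the point is only that every software update would have to pass every scenario in simulation before it ships.

# Hypothetical sketch of a scenario-matrix release gate (illustrative only;
# none of these scenarios, thresholds or functions come from a real program).
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str                     # human-readable description of the case
    max_lane_deviation_m: float   # pass/fail threshold in metres
    speed_kph: float              # speed the scenario is run at

SCENARIO_MATRIX = [
    Scenario("two-lane road, oncoming traffic, curve", 0.3, 30),
    Scenario("faded lane markings, gentle curve", 0.3, 50),
    Scenario("construction zone, shifted lanes", 0.5, 40),
    # ...thousands more exception cases would live here
]

def run_scenario(build_id: str, scenario: Scenario) -> float:
    """Stub: run the candidate software build against one simulated scenario
    and return the worst lane deviation observed, in metres."""
    raise NotImplementedError("hook this up to the simulation environment")

def release_gate(build_id: str) -> bool:
    """Allow a release only if the build passes every scenario in the matrix."""
    failures = []
    for sc in SCENARIO_MATRIX:
        deviation = run_scenario(build_id, sc)
        if deviation > sc.max_lane_deviation_m:
            failures.append((sc.name, deviation))
    for name, dev in failures:
        print(f"REGRESSION: {name}: deviated {dev:.2f} m")
    return not failures

Because the same matrix is re-run after every change, a regression like the one in the "Drunk Tesla" video would be caught in simulation rather than discovered by a customer.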
I am a former systems engineer and program and engineering manager for Lockheed Martin. There I worked on aircraft simulation and the Aegis Weapon System, and was Software Engineering Manager for all of NORAD. I was also the whistleblower who raised the Deepwater Program issues (IEEE Xplore full-text PDF).
 
As long as human drivers cause more accidents, both absolutely and relatively, I don't think it is in the interest of humans to shut down Autopilot.
My point is that it needs to be shut down now due to the massive regression the latest version caused. Have you seen the video? Something is very, very wrong, especially in the regression testing protocol. If it comes back up, neither Tesla nor anyone else should be using human guinea pigs to drive 6 billion miles on Autopilot to gather data. That wastes time and lives. As I said, there is a far better way that other industries have been using for decades. Data collected from folks driving without Autopilot engaged is great and should be gathered. Once a minimum set of scenarios is proven safe, Autopilot should be gradually reintroduced.
 
OP: Agreed on most of what you said. But why are you not as vocal in clamoring for the same shutdown of the driver-assist lane-keeping systems from Mercedes, Volvo and such?
Thank you for the question. I have commented in many places and I am not favoring anyone. If you look at my articles on LinkedIn you will see most of the text is aimed at the industry as a whole. I do, however, realize I post mostly about Tesla. The reason for that is mostly press coverage and how egregious I believe Tesla's approach to be. That includes Elon Musk's video where he clearly takes his hands off the wheel and acts exactly how he says his customers should not act in his cars. I assure you that if and when articles about those other companies come up, I will post on them too.
 
Very interesting thread and I can't agree or disagree with OP.

Having driven thousands of miles on highways using AP1 in many cars, I feel the technology is great and helpful, and to an extent very safe; it can potentially be better than a human driver.

Having driven some miles on local roads (just like in the video, where Tesla doesn't even suggest you should use it), I wouldn't trust AP1 to keep me in a clearly visible lane, even with 2 lane markings, even with the latest software version.

I feel that in certain situations AP1 or AP2 can be reasonably safe, as long as you don't fully trust it and are ALWAYS ready to take over at any time. Having said that, I also feel that a new owner of an AP2 car might have heard only the good side of AP, and they just enable AP2 to show their friends how cool their brand new car is and accidentally kill somebody in the process. Perhaps the solution would be to disable Autosteer on non-highway roads for now.
 
Keep at least one hand on the wheel and pay attention to the road like the manual says, and it is safer than any other method of driving.

Your outrage is 100% due to you failing to RTFM. Sorry excuse for a systems engineer, if you ask me.

But can you expect every (and I'm not being racist or nationalist) Chinese, Indian, Malaysian, etc. owner to read the entire manual...? I know you can expect a fighter pilot to know every knob on their plane, but can you expect the same of the driver (who is not necessarily the owner of the car)? Can you expect a person who rents a Tesla on Turo to have the same mindset about Autopilot as the owner? This is not only about owners and fans of the technology...
 
If you don't like AP... don't use it. :)
It's not that simple. I believe most folks are being lulled into false confidence by Elon Musk. Watch his video where he takes his hands off the wheel, looks all around, goofs around with the passengers, etc. Then he says he won't change the name from Autopilot to driver assist. The public is assuming the car is safe enough for them to be used as guinea pigs to gather what they think are the final bits of data. Elon himself said he needs the public to put up with 6 BILLION miles of this. Look at that "Drunk Tesla" video. The system is nowhere near ready for the public to be guinea pigs in that updated version. It regressed so much that someone needs to take a hard look at what is going on. I guarantee you that if the public saw the scenario matrix and the massive gaps in it, this would not be happening. And unfortunately NHTSA is deferring to Elon. I believe this is because they have no exposure to or experience in other industries, so they buy into the mantra that this is the best way to do it. It is not. Not even close.

The reason 6 billion miles need to be driven is that most of the data collected is repeat data. To get unique scenario data, accident data, with a method that requires folks to stumble on it, that many miles are needed. If the public driving around to gather data, ACCIDENT data, is the best method and Tesla needs 6 billion miles of it, what happens with upgrades or changes? Redo 6 billion or so miles to re-stumble on the regression scenarios?
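As a rough back-of-the-envelope illustration of why random exposure is so inefficient (the event rate and sample count below are assumed purely for illustration, not real statistics):

# Assumed numbers for illustration only, not real statistics:
# suppose one particular dangerous scenario occurs about once per
# 10 million miles of ordinary driving, and you want to observe it
# 100 times to characterize it and regression test against it.
event_rate_per_mile = 1 / 10_000_000   # assumed rarity of one scenario
observations_needed = 100              # assumed sample size per scenario

miles_required = observations_needed / event_rate_per_mile
print(f"{miles_required:,.0f} miles")  # 1,000,000,000 miles for ONE rare scenario
# Multiply by the number of rare scenarios in a full matrix and the total
# climbs into the billions, while in simulation each scenario can simply
# be constructed and run directly.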

Again I am for this technology and realize nothing is perfect. However there is a far better way that is actually much faster and puts far fewer lives at risk.
 
Keep at least one hand on the wheel and pay attention to the road like the manual says, and it is safer than any other method of driving.

Your outrage is 100% due to you failing to RTFM. Sorry excuse for a systems engineer, if you ask me.

You are not actually addressing my points. Shooting the messenger is neither an objective nor a helpful response.

Have you seen the "Drunk Tesla" video? Tell me how that is a safer method of driving. That car darts across the other lane. (Keep in mind the car is going under 30 mph.)

Finally my point was there is a better way. Why not use it?
 
Knee-jerk reactions from one-day members are not what we need here.
I came here to try to be helpful. If my approach is safer and faster, shouldn't that matter? If the current process is counterproductive, how is that helpful? Wouldn't it be more productive to directly address my points rather than shoot the messenger, etc.? How is my suggested approach not the better one? If folks really care about the technology and people's well-being, wouldn't the fastest and safest method be in everyone's best interest? I am not an Autopilot hater.

(Note: when SpaceX had its first code base reviewed by NASA, NASA rejected it. No defects had been found, which is not possible if something that large and complex was properly tested, and the exception handling was poor. Same thing here.)
 
Knee-jerk reactions from one-day members are not what we need here.

Hmm... 1,000 posts don't make anyone right, and by the same token, one-day members can have very useful comments or thoughts.

The knee-jerk part is a different topic. How do you know it's a knee-jerk reaction and not something the OP has been contemplating for months? Maybe he owns a Tesla with AP.

Bear in mind, I'm not saying he is right or wrong. But shooting down somebody's opinion this way is not constructive, in my opinion.
 
I came here to try to be helpful. If my approach is safer and faster, shouldn't that matter? If the current process is counterproductive, how is that helpful? Wouldn't it be more productive to directly address my points rather than shoot the messenger, etc.? How is my suggested approach not the better one? If folks really care about the technology and people's well-being, wouldn't the fastest and safest method be in everyone's best interest? I am not an Autopilot hater.

(Note: when SpaceX had its first code base reviewed by NASA, NASA rejected it. No defects had been found, which is not possible if something that large and complex was properly tested, and the exception handling was poor. Same thing here.)

Interesting points. But what about the fact that the technology has already saved drivers from many accidents? How many fatalities did it prevent? I'm not sure. Did it directly cause more or fewer fatalities than it prevented? But I see the point that even though it is not a fully functioning feature, in certain situations it can be much better than a human driver.
 
I guess what I'm trying to say is that TACC (traffic-aware cruise control) is very solid, already saves drivers, and should be encouraged, used and, where possible, improved.

Autosteer should probably be restricted to highway roads for now.

How do you know TACC is fine? In the Molthan accident the car took off after the airbags went off. While that may not be a TACC issue per se but rather a system command-and-control issue, the symptom shows up on the TACC side.

Tesla owner who crashed on Autopilot has a warning for other drivers
 
But can you expect every (and I'm not being racist or nationalist) Chinese, Indian, Malaysian, etc. owner to read the entire manual...?
Of course Tesla can expect every owner, wherever they are located and whatever nationality they are, to read the manual, and when Tesla sells a car they should advise that. I am sure that in the countries Tesla sells in, the manual is written in the national language.
 
Interesting points. But what about the fact that the technology has already saved drivers from many accidents? How many fatalities did it prevent? I'm not sure. Did it directly cause more or fewer fatalities than it prevented? But I see the point that even though it is not a fully functioning feature, in certain situations it can be much better than a human driver.
Have you seen the "Drunk Tesla" video? That new version is not a net positive.

Also, my point is that lives are needlessly being put at risk by the process. Change that and all is well. If I cure your cancer but in the process give you other diseases that kill you, and it was ALL AVOIDABLE, have I done my due diligence? If a family member died and you learned it was avoidable, would you feel good about them being sacrificed for the greater good?
 
  • Ensure the black box data is recorded often enough to capture all the data for any event (many times a second), or make sure the black box can withstand any crash. In the McCarthy/Speckman tragedy Tesla said they have no data on the crash. That is inexcusable. Also pass regulations that give the proper government organizations access to that data while ensuring it cannot be tampered with before they do so.
Hang on a second. I get the whole not trusting autonomous driving and all that, but justifying your point by using an accident for which there was no data or evidence linking it to autonomous driving is plain bad science and poor analysis.
On that basis, every car manufacturer should closely examine every single accident involving their cars just in case the driver might have engaged something in the car that caused the accident.
The fire is completely unconnected with your calls for disabling driver-assistance features.
That just makes your arguments sound like the guy who tries to prove issues with wheels falling off by using salvage pics in bogus safety complaints.