Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla's Autopilot needs to be shut down and NHTSA needs to do their due diligence

You have no idea how bad AP2 is since you have never used it and therefore you clearly don't know what you are talking about. How could you?
I'm not claiming that I have any more information than you, but you are trying to make it seem like everything is fine with AP2 when everybody who has it and who posts here is saying that it is not even close to OK.
Again, you're reading too much between the lines, and not reading what I actually said.

What I did say was that AP1 behaved drunk in the early days too. What I did imply was that when used with 2 lane markings, it's pretty decent; just like AP1 was in early days. In the context it was a comparison of AP1 when it came out and people using it in the city (where it's not meant to be used) to people using AP2 in places where it's meant to be used but does poorly.

What I didn't say is that AP2 is "OK". I never actually rated AP2, as that would require several hundred miles of me driving it on various roads, which I have not done.


And this was all in context [based on your quote] of using AP on an undivided road, which may have crests, sharp curves, oncoming traffic, faded lane markings, lane markings only on one side, etc. Not on the highway, where AP is supposed to be most useful.


If you want to pick an argument, go somewhere else. I'm done replying to you on this topic.
 
I have AP2 and I find that it does a pretty OK job on the highway (currently <50mph). Unlike AP1, it IS intended for "local roads", and that's where its performance is currently troubling. I still use it, but in a very careful manner. Basically it does a great job when the road goes straight. It can even handle unmarked roads as long as there is a curb. I don't like that it will cross double yellow lines and that it will hunt lane lines, especially when first engaged, but it's not inherently dangerous. You can't tell a knife company to stop selling knives because they might result in injury. Tesla can and should do a better job with AP2, but the OP misses Tesla's true defects, which are in communication and honesty.
 
http://www.thedrive.com/news/7915/watch-this-tesla-autopilot-2-0-fail-terribly-in-a-model-s

That video shows the car making a quick move across the oncoming lane. It is beyond clear that Tesla's design and testing approach is reckless. Imagine if the car had been going faster. How does this not get caught in simulation or on simulators? How does this not get caught on test tracks? Using your customers as guinea pigs is bad enough, but now you are using them to check for massive system regressions? This video clearly shows that these cars have regressed so far that Tesla's entire process needs to be investigated, especially around regression testing.

NHTSA needs to quickly reverse their stance on Tesla's Autopilot, at least long enough to actually do their homework, look into these issues and drive toward a solution that protects the public and makes sure the right things are happening at these companies. They need to do their due diligence, go talk to actual experts in ALL of these areas and not be so wowed by Mr. Musk. That fox owns the hen house and is going to get those hens killed. Musk's mantra that he is statistically saving lives is not only wrong; his system is putting the public in danger.

The Solution

  • Create a Scenario Matrix that cars will be officially tested against. Ensure this matrix covers a minimum set of scenarios that ensure driver and public safety. Gather folks from these companies, automakers, the insurance industry, traffic engineering, NHTSA and academia, plus people who actually know how to create, design and test against a massive exception-handling matrix like this, most likely from DoD, NASA or Boeing. Ensure these standards are met before releasing any updates.
  • Bring that systems engineering experience into these companies. Commercial IT has never adopted most engineering best practices. Yeah, I know they make tons of money and really cool apps, games and websites. The fact is that Commercial IT rarely even looks into exception handling (cases where things do not go as planned), let alone a massive effort like this. That includes identifying the exceptions, designing to them and testing them. These companies lack the experience in doing this, and their tools don't support it.
  • Stop this massively avoidable process of using customers and the public as guinea pigs. Musk says he needs 6 BILLION miles of driving to collect the data he needs. Look at what that means. Innocent and trusting people are being used not only to gather the first sets of data, most of which is for ACCIDENTS; they are then used to regression test after every system change. The reason for the 6 BILLION miles is that most of the data collected is repeated. They have to drive billions of miles because they are randomly stumbling on the scenarios. The solution here is to use the matrix described above with simulation and simulators to do most of the discovery and testing. That can be augmented with test tracks and controlled public driving. (Note - by guinea pigs I mean the folks driving cars with autopilots engaged. Gathering data when they are in control is prudent.)
  • Ensure the black box data is updated often enough to gather all the data for any event (many times a second) or make sure the black box can withstand any crash. In the McCarthy/Speckman tragedy Tesla said they have no data on the crash. That is inexcusable. Also pass regulations that give the proper government organizations access to that data while ensuring it cannot be tampered with before they do so.
  • Investigate the McCarthy/Speckman crash. Determine if that car contributed to the accident. That includes any autopilot use, as well as why that battery exploded and caused so much damage so fast. https://www.linkedin.com/pulse/how-much-responsibility-does-tesla-have-tragedy-michael-dekort
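To make the scenario-matrix proposal above concrete, here is a minimal sketch of what a release gate driven by such a matrix might look like. Everything here is invented for illustration: the scenario names, the `simulate()` stub (a real gate would drive a high-fidelity simulator), and the build identifier.

```python
# Hedged sketch of a "scenario matrix" release gate.
# Scenario names and the simulate() stub are hypothetical.

REQUIRED_SCENARIOS = [
    ("undivided_road", "faded_lane_markings"),
    ("undivided_road", "oncoming_traffic_on_crest"),
    ("undivided_road", "markings_on_one_side_only"),
    ("highway", "lane_merge"),
]

def simulate(road, condition):
    """Stand-in for a simulator run; returns True if the car stayed
    in its lane for the whole scenario."""
    return True  # placeholder: every scenario "passes" in this sketch

def release_gate(build_id):
    """Run every required scenario and block the release on any failure."""
    failures = [s for s in REQUIRED_SCENARIOS if not simulate(*s)]
    if failures:
        raise RuntimeError(f"build {build_id} blocked: {failures}")
    return f"build {build_id}: {len(REQUIRED_SCENARIOS)} scenarios passed"

print(release_gate("8.1-demo"))  # build 8.1-demo: 4 scenarios passed
```

The point of the design is that the matrix is explicit and versioned: a software update cannot ship until every enumerated scenario passes, rather than relying on fleet miles to stumble onto regressions.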
I am a former systems engineer, program and engineering manager for Lockheed Martin. There I worked on aircraft simulation, the Aegis Weapon System and was Software Engineering Manager for all of NORAD. I was also the whistleblower who raised the Deepwater Program issues (IEEE Xplore full-text PDF).

This post is amazingly ignorant of lots of relevant facts already posted in many threads on this forum such as the disclosure of current AP2 status, and projections for future updates and functionality.

Mr. DeKort -- do your own due diligence and inform yourself. You can start by test driving a Tesla and experiencing for yourself what you complain of.

You can also read the NHTSA report on AP1 and review how AP1 was rolled out incrementally as well.

In other words, do not write such outlandish claims while still ignorant of such basic information.

And since you aren't even a Tesla owner, disclose whether you have any conflict of interest in any capacity. Such as a paid shill for a short-seller -- who is suffering mightily about now at 260 and might be interested in having someone pretend to be qualified and post ignorant ramblings.

More informed discussion of AP2, its progress and lack thereof, is contained in other threads.
 
I would think that would be one of the easier things to program if not the easiest and that's what's worrying.

This is the first iteration. It will improve quickly. I'm not concerned; I just adjust how I use it as I establish a baseline for using that feature. Since initial reviews were poor I was very cautious, and I ended my initial testing pleasantly surprised.

I am very selective in the roads where I use autosteer, but I find that what it does well it will do regardless of conditions (time of day or weather). I just used it in both rain and snow in one day, thanks to Chicago weather. It worked consistently, at least. Just don't use it on turns, at big intersections, where lanes divide or merge, where people speed, or where bicyclists or pedestrians are around (though my HW2 recognizes cyclists as cars, it is not good enough to trust).

Despite the laundry list of restrictions, I have used local autosteer for five miles each way on my commute every day without incident since I got the update. It is useful, it makes everyone safer, and it will only get better. No need to be an alarmist once you understand and appreciate what it is and isn't currently, and adjust that as it improves.
 
To the OP: Thank you for your seemingly honest concern for my and others' safety WRT AP2.

As an intelligent person, even if I had not read the entire manual on the proper operation of one specific aspect of the vehicle, there's this little thing called personal responsibility. Whoever made the video, which I've not wasted my time watching based on your characterization of it, allowed the car to drive the way it did.

Why they allowed it seems to be to make a point, and they were reckless in that they did not take control of the situation. As a licensed driver, it's your responsibility to maintain control of your vehicle at all times. They did not.

Your choosing to point to their reckless antic speaks volumes as to your intent here...

Thanks for playing. You may put your whistle away now and move on; There's nothing to see here.
 
I'll just chime in with a quick note. I've utilized AP1 since day 1 of public release and have driven tens of thousands of miles with AP1 since that time. It's not perfect, but it's mostly predictable and I generally know where it's going to have trouble.

I recently did ~100 miles with AP2 in an X... And you know what? IMO, this should NOT be publicly utilized nor have even been released yet. Seriously. AP2 was all over the place and completely unpredictable on routes where I've used AP1 without incident over a hundred times. AP2 was nearly unusable. The initial release of AP1 was 20x more reliable, despite AP2 having a significant sensor advantage.

I don't know who in their right mind thought it was a good idea to start rolling out AP2 hardware before they even had basic feature parity with AP1, but those involved should be sacked.
 
While I sort of see the OP's sentiment about "using Tesla owners as test subjects" being questioned, I don't think it's as bad as you make it out to be. I do think learning from the real world is A LOT better than testing in a lab. As with any software, there are staged/phased releases that come out with limitations and caveats, and the users report customer-found defects and issues. Yes, there are lives at stake here, but only if it is used recklessly and irresponsibly.

You need to have your hands and feet at the ready when you enable Auto-steer, and only enable it when you are confident it can handle the scenario. Just one trial should give you a sense of its confidence level on the streets you are attempting it on, and the user is expected to "beta test" it, per se, with utmost care and responsibility.

When AP2 goes primetime, and you see an issue with AP2 allowing you to enable auto-steer and stay engaged when it is dangerous, then there is reason to question the quality of the product. But even at that point it is the individual's responsibility to gauge the quality of the product and use it accordingly, when they feel confident that it can be productive and not counter-productive.

At this stage I, for one, like that all the users are actively trying it out in the real world and providing real data, as opposed to simulations and tests. To me that will ensure a more solid product when it is released, as opposed to them saying AP2 is ready to go because they have run a million miles of "simulation" on it.

As with anything that has a danger element associated with it "Use with caution" applies here as well. It can be mitigated by being responsible, careful and attentive.

I'd like to add that although I made the statements above, I sort of agree with the notion that AP2 auto-steer is terrible at this point and that it should've been tested more internally before using live subjects. There is a fine line to draw here between releasing a beta product and releasing something really not stable and expecting to learn from "incidents" as opposed to safe miles. Good discussion overall.
 
Is it just me, or am I the only one who doesn't really care that AP2 doesn't work yet? I'll tell you what I miss: rain-sensing wipers. I spent 20 minutes in a light mist wondering why my wipers were being stupid before I remembered that "feature" was not working yet!

I guess I am the stupid one...
 
You lost me at 'we should get the DoD involved'...

Can someone more knowledgeable than I verify that the software can be, or was, running in the background and learning prior to being active for the end user to deploy? That would put a massive hole in the OP's 6bn-miles argument.
 
In some of the fail videos the car crosses into oncoming traffic very rapidly. Given that you won't override the car until you see that it's screwing up, even if you are paying attention, there's a worryingly high chance that you would nick the car in the other lane or cause them to swerve.

Completely disagree. If you keep your hands on the wheel, like you're supposed to, you can feel the wheel beginning to move, and if you're paying attention, like you're supposed to, you will know that the wheel is moving in a way it's not supposed to and can immediately correct it. What idiot waits to see the car cross the lane into opposing traffic before doing something about it?