That video shows the car making a quick move across the oncoming lane. It is beyond clear that Tesla's design and testing approach is reckless. Imagine if the car had been going faster. How does this not get caught in simulation or in simulators? How does this not get caught on test tracks? Using your customers as guinea pigs is bad enough, but now you are using them to check for massive system regressions? This video clearly shows these cars have regressed so far that Tesla's entire process, especially its regression testing, needs to be investigated.
NHTSA needs to quickly reverse its stance on Tesla's Autopilot, at least long enough to actually do its homework, look into these issues and drive toward a solution that protects the public and makes sure the right things are happening at these companies. It needs to do its due diligence and talk to actual experts in ALL of these areas, not be so wowed by Mr. Musk. That fox owns the hen house and is going to get those hens killed. Musk's mantra that he is statistically saving lives is not only wrong; his system is putting the public in danger.
The Solution
- Create a Scenario Matrix that cars will be officially tested against. Ensure this matrix covers a minimum set of scenarios that guarantees driver and public safety. Gather folks from these companies, automakers, the insurance industry, traffic engineering, NHTSA and academia, along with people who actually know how to create, design and test to a massive exception-handling matrix like this, most likely from DoD, NASA or Boeing. Ensure these standards are met before releasing any updates.
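To make the idea concrete, here is a minimal sketch of what a scenario matrix and a release gate could look like in code. The dimensions, values and function names are all my own illustrative assumptions, not anything from Tesla or NHTSA; a real matrix would have far more dimensions plus hand-curated exception cases.

```python
from itertools import product

# Hypothetical driving dimensions; the matrix is their cross product.
DIMENSIONS = {
    "maneuver": ["left_turn", "lane_change", "merge"],
    "weather": ["clear", "rain", "fog"],
    "oncoming_traffic": [True, False],
}

def scenario_matrix(dimensions):
    """Enumerate every combination of the dimension values."""
    keys = list(dimensions)
    for values in product(*(dimensions[k] for k in keys)):
        yield dict(zip(keys, values))

def release_gate(passed_scenarios, dimensions):
    """Block a software update until every matrix scenario has a passing test."""
    required = list(scenario_matrix(dimensions))
    missing = [s for s in required if s not in passed_scenarios]
    return len(missing) == 0, missing

# Example: only 10 of the 3*3*2 = 18 scenarios have passed, so the gate stays shut.
tested = list(scenario_matrix(DIMENSIONS))[:10]
ok, missing = release_gate(tested, DIMENSIONS)
print(ok, len(missing))  # False 8
```

The point of the gate is that an update simply cannot ship while any required scenario lacks a passing result, which is the opposite of discovering regressions on public roads.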
- Bring that systems engineering experience into these companies. Commercial IT has never adopted most engineering best practices. Yes, I know they make tons of money and really cool apps, games and websites. The fact is that commercial IT rarely even looks into exception handling (cases where things do not go as planned), let alone a massive effort like this. That includes identifying the exceptions, designing to them and testing them. These companies lack the experience, and their tools don't support it.
- Stop this massively avoidable practice of using customers and the public as guinea pigs. Musk says he needs 6 BILLION miles of driving to collect the data he needs. Look at what that means: innocent and trusting people are being used not only to gather the first sets of data, most of which is for ACCIDENTS, but also to regression test after every system change. The reason it takes 6 BILLION miles is that most of the data collected is repetitive; they have to drive billions of miles because they are randomly stumbling onto the scenarios. The solution is to use the matrix described above with simulation and simulators to do most of the discovery and testing, augmented with test tracks and controlled public driving. (Note: by guinea pigs I mean the folks driving cars with autopilot engaged. Gathering data when drivers are in control is prudent.)
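The "repetitive data" argument is the classic coupon-collector problem, and a toy simulation shows why random miles are so wasteful. The scenario count below is an arbitrary assumption of mine purely for illustration; the real figure would be vastly larger.

```python
import random

random.seed(0)  # fixed seed so the toy run is repeatable

def miles_to_cover_all(num_scenarios):
    """Toy model: each mile of random driving hits one scenario uniformly
    at random.  Count the miles needed before every scenario is seen once."""
    seen = set()
    miles = 0
    while len(seen) < num_scenarios:
        seen.add(random.randrange(num_scenarios))
        miles += 1
    return miles

n = 1000  # hypothetical number of distinct rare scenarios
miles = miles_to_cover_all(n)
print(miles)  # far more than n: expectation is n*ln(n), roughly 6,900 here
```

Targeted simulation against a scenario matrix needs on the order of n runs, one per scenario; random driving needs roughly n times the natural log of n, and almost all of those extra miles re-collect scenarios already seen. That is the structural reason behind the billions-of-miles figure.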
- Ensure the black box data is updated often enough to gather all the data for any event (many times a second), or make sure the black box can withstand any crash. In the McCarthy/Speckman tragedy, Tesla said it has no data on the crash. That is inexcusable. Also pass regulations that give the proper government organizations access to that data while ensuring it cannot be tampered with before they get it.
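A recorder that samples many times a second need not be exotic. Here is a minimal sketch of one common pattern, a fixed-size ring buffer flushed to durable storage when a trigger fires; the sample rate, window length and class names are my own assumptions, not any vendor's design.

```python
from collections import deque

SAMPLE_HZ = 100       # "many times a second" (assumed rate)
WINDOW_SECONDS = 30   # retain the most recent 30 s of telemetry (assumed)

class BlackBox:
    def __init__(self):
        # deque with maxlen acts as a ring buffer: oldest samples drop off.
        self.buffer = deque(maxlen=SAMPLE_HZ * WINDOW_SECONDS)

    def record(self, sample):
        self.buffer.append(sample)

    def flush_on_event(self):
        """Snapshot to be written to crash-hardened storage when a trigger
        (e.g. airbag deployment) fires."""
        return list(self.buffer)

box = BlackBox()
for t in range(10 * SAMPLE_HZ):  # 10 s of simulated telemetry
    box.record({"t": t / SAMPLE_HZ, "speed_mph": 35.0})
snapshot = box.flush_on_event()
print(len(snapshot))  # 1000 samples: 10 s * 100 Hz, within the 3000-sample cap
```

With a design like this there is no excuse for "no data": either the buffer survives the crash, or it is flushed continuously enough that the final seconds before impact are always recoverable.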
- Investigate the McCarthy/Speckman crash. Determine whether that car contributed to the accident, including any autopilot use as well as why the battery exploded and caused so much damage so fast. https://www.linkedin.com/pulse/how-much-responsibility-does-tesla-have-tragedy-michael-dekort