Welcome to Tesla Motors Club

Tesla needs a better software test plan

Back in my medical software development days, we spent quite a bit of time developing and evolving "test plans." These were a written testing procedure we would use with every update, specifying a series of actions we would do to be sure that each update not only did what it was supposed to do, but didn't break anything else.
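The "test plan" idea above is what's now usually automated as a regression suite: every release candidate re-runs the whole checklist, so a fix to one feature that silently breaks another gets caught before shipping. A minimal sketch in Python (the `FakeSeat` model and its behavior are hypothetical illustrations, not Tesla's actual code):

```python
# Toy regression "test plan": every release candidate re-runs ALL checks,
# not just the ones for the feature that changed.
# FakeSeat is a hypothetical stand-in, not real vehicle software.

class FakeSeat:
    """Toy model of a power front seat: 0 = driver's position, 80 = slid forward."""
    def __init__(self):
        self.position = 0

    def easy_entry(self):
        self.position = 80   # slide forward so rear passengers can get in

    def restore(self):
        self.position = 0    # return to the driver's saved position

def test_easy_entry_moves_seat():
    seat = FakeSeat()
    seat.easy_entry()
    assert seat.position == 80

def test_seat_restores_after_entry():
    # The kind of check that catches "fixed entry, broke restore" regressions.
    seat = FakeSeat()
    seat.easy_entry()
    seat.restore()
    assert seat.position == 0

# A release candidate passes only if every test in the plan passes.
for test in (test_easy_entry_moves_seat, test_seat_restores_after_entry):
    test()
```

The point isn't the toy seat model; it's that the whole plan runs on every update, so the "whack-a-mole" pattern (one fix, one new break) becomes visible before customers see it.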

Based on my own experience, and on the recurring issues people report with many of the Model X's automated features, the company is failing to do this adequately. It seems that almost every time one problem is fixed, another function breaks. Classic "whack-a-mole." Case in point: the easy-entry behavior for the rear seats. I've had it work perfectly at times, but with various updates it breaks again, and in some unique new way! I'd rather have less frequent updates and more testing of each.

The problem is that as software matures, it often becomes more complex, and a rigorous test plan becomes that much more critical.

That said, these problems seem confined to doors, seats, and other peripheral functions. The core of the car (drivetrain, Autopilot, etc.) has been rock solid. Maybe the door/seat team should spend a little more time with the Autopilot team.
 
It seems that almost every time one problem is fixed, another function becomes broken. Classic "whack a mole."
You are so dating yourself. Nowadays "whack-a-mole" (aka Agile) is de rigueur. :)
 
I'm a software engineer in the auto industry at one of the top tier 1 OEM suppliers, so I'm a little familiar with the coding standards and certifications (Automotive SPICE, AUTOSAR, MISRA) we have to adhere to. That being said, any programmer knows there will always be edge cases you can't unit-test or sim-test out. You can throw as much CC at the code as you want, but with so many different body control modules, ECUs, and gateways, some errant behavior is bound to get through. As big a deal as these user bugs are, think of all the things that work well. That's why I believe what we're seeing are edge-case bugs.
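A back-of-envelope sketch of why those module interactions can't be exhaustively tested: with N modules each having K possible states, the number of combinations is the product of the state counts. The module names and state counts below are made-up illustrative numbers, not a real vehicle network:

```python
# Hypothetical toy network: each module's number of distinct states.
# Real vehicles have dozens of ECUs with far more states each, so the
# combinatorial state space dwarfs what any test plan can enumerate.
from math import prod

modules = {"BCM": 4, "door_ECU": 3, "seat_ECU": 3, "gateway": 2}

total = prod(modules.values())
print(total)  # 4 * 3 * 3 * 2 = 72 combinations, even in this tiny example
```

Add a dozen more modules and the count explodes multiplicatively, which is why some edge cases only surface in the field.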
 
Agile may be de rigueur, but Tesla doesn't seem to be using it. I agree with PGeer; as a (former) IT Project Manager, we lived and died by the strength of our test plans.
Progression test data wins. It's as important as any tool: it takes a little more time up front and saves mountains of time and damage down the road. I'm a dinosaur, and I work with contractors that hired fresh meat and treated our workplace as a clearing house for them. Teaching those pups was a thankless task, with little thanks.
 
I agree with Dazureus. With automotive control programming it is impossible to achieve software quality by "testing quality into the code." No matter how good your test plan is, there is always some "7 microsecond window" where an event will lead the software down the wrong path.
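That kind of timing window is easy to sketch in any language with threads (illustrative Python here, not automotive code): a read-modify-write on shared state leaves a window between the read and the write where another thread's update can be lost. No test plan reliably hits the window; the fix has to be designed in, e.g. with a lock:

```python
# A "timing window" bug and its by-design fix (illustrative sketch).
import threading

counter = 0
lock = threading.Lock()

def unsafe_increment(n):
    # "counter += 1" is really read, add, write: between the read and the
    # write there is a window where another thread's update can be lost.
    global counter
    for _ in range(n):
        counter += 1

def safe_increment(n):
    # The lock closes the window by construction, not by testing.
    global counter
    for _ in range(n):
        with lock:
            counter += 1

threads = [threading.Thread(target=safe_increment, args=(100_000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

assert counter == 400_000  # correct by design; the unsafe version may lose updates
```

The unsafe version might pass a test run thousands of times and still fail in the field, which is exactly the point GSP is making: quality has to start in the design.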

Quality has to start with carefully designed code, with thorough analysis of every proposed change for the undesirable interactions it might cause.

Once that is done, I also agree with PGeer and FarmerDave: rigorous test plans, including interaction testing for every change and regression testing for release candidates, are a must.

Best wishes to the Tesla software teams as they "live the dream" to do all this, and also aggressively take on new features and improvements that their customers will love!

GSP
 