Mr H
Active Member
Tesla should allow other cars to charge just as IONITY do, but they should be billed an extortionate price too: "No, Tesla, sir? That will be £2 per kWh, then."
I agree. If it weren't for the pathetic state of public fast chargers, I'd certainly be looking at other marques if I were in the market. This could be a game changer for my next car: I may well choose another brand if the Supercharger network were freely available, because much as I love the cars, I cannot approve of the cavalier way so many customers are treated, or of the erratic pricing model.
Without one of Tesla's main USPs, there won't be such a compelling reason to remain loyal.
Whilst I wholeheartedly agree with your sentiment (and have voiced similar opinions in the past), in all fairness it doesn't appear that any software update has ever broken any of the safety-of-life systems, so it may well be that there is a core software set that is subjected to a much more robust QA process.

Things like this don't do much for Tesla's reputation when it comes to software. It's not as if this is a one-off: we know there was a software bug that made all Model 3s non-compliant with IEC 61851 (the international technical standard for charge points) for months, in direct contradiction of the approval certification Tesla claimed to have. I think most of us have probably seen evidence of software updates that have broken things that previously worked OK, too.
The circumstantial evidence suggests that Tesla's software QA is pretty poor, and as a lot of their software is safety-critical, that leaves a bit of a question mark over how trustworthy any of it is. The "move fast and break things" approach shortens development time, but unless it's combined with a robust QA system, it's prone to letting bugs out into released code. Some of the software failings we've seen suggest that code is barely given even rudimentary testing. Both the Model 3 non-compliance with IEC 61851 and the Supercharger bug were obvious defects that a basic functional test would have picked up. Heck, the Model 3 charging bug could be found in 30 seconds by any electrician able to simulate all the control pilot states: dead easy to do with just the UMC plugged into a 13 A outlet, no test equipment needed at all.
Whilst I wholeheartedly agree with your sentiment (and have voiced similar opinions in the past), in all fairness it doesn’t appear that any software update has ever broken any of the safety-of-life systems so it may well be that there is a core software that is subjected to a much more robust QA process.
That would make sense. Breaking charging is inconvenient but not A Huge Deal (never made it to the papers). A software update that would break steering or braking would certainly be all over the press and cause incalculable reputational damage.
Apple has massive issues. The days of “it just works” are well and truly gone.
So you work for Boeing by any chance?

I hope you're right, although that suggests that Tesla have two separate software QA systems: one for safety-critical code and one for non-safety-critical code. In turn, that suggests that someone, or some team, has to determine which QA approach is used for any particular code branch, and that in itself isn't a particularly robust approach, IMHO.
I have experience of managing an aircraft procurement a few years ago, where code had been developed without the team doing the work fully realising that it was flight-safety critical. It was code for an engine control system, a FADEC, for an engine that had been developed for a wide range of uses. The code had been written in C and compiled on a popular commercial compiler. When we (as in the UK) wanted to independently verify the code, so that it could be certified as safe to fly, it was discovered that the compiler used wasn't certified. When a certified compiler was used, the code didn't behave as expected. A close look at the source, with a static code walk-through, showed that it was inherently unsafe. It caused a bit of a stir with the team in the US that had written the code, but it remains firmly lodged in my memory as an example of the hidden risks that can exist in complex code (not that the code in question was particularly complex).