Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Open Letter to Elon Musk

I think the OP does understand, and is advocating a different approach than basing things on human drivers (the OP's reference to "aerospace-level simulation"). It seems like a controversial position, but it doesn't follow that he misunderstands the current approach...

All that said, I did read some sentiment from the OP that he is also fearful of Tesla's agile progress on driver aids and self-driving, which is of course part of the fleet-learning process...

I was going to give the OP the benefit of the doubt on that, except that it was followed up in subsequent posts with statements that Tesla is being reckless by allowing people to drive their cars with shadow driving enabled. If we take him at his word on that, and believe that he knows what shadow driving is, then his beef shouldn't just be with Tesla. It should go double for every single other car currently available on the market. By that logic, all current forms of transportation except, maybe, airplanes should be banned entirely.
 
I agree with the comments above, but my question is what leads the OP to believe simulation is not being used? There seems to be evidence they are using simulation, such as this job posting for an Autopilot Simulation Engineer.

Furthermore, I think it's dangerous NOT to include shadow driving as part of the validation, and the more public the better. Simulation is just that: a simulation. Shadow driving is testing the software in the real world...you know, the one where lives actually matter. By the same logic the OP applies to simulation, more miles of shadow driving mean a safer Autopilot.
 
Although I like Tesla taking the lead in pushing semi-autonomous systems so that it can use accurate real-world data from its fleet, this needs to come with an assurance that it will stay active in constantly improving the system toward FSD. The way things are right now, AP2 progress seems stagnant and, in some areas, has even regressed. If you're going to be aggressive in furthering autonomy, this stagnation needs to be avoided at all costs, since it risks putting someone in harm's way by leaving them with unreliable software.

Good point. What do you think Tesla should be doing to not "stagnate" on the autopilot front that they are not already doing? Do you have any statistical data to show they are stagnating? Do you have some statistical data to show that autopilot driving in its current state is more dangerous than human driving?
 
Tesla, ever since the early AP1 days, has decided to be more agile and do more with less: less development time, fewer sensors, etc. I mean, it isn't completely unfathomable that this approach is one disaster away from collapse, even though I hope it never comes to that. The state of AP2, even with the driver responsible, feels in part downright dangerous.

Exactly. I would argue they are doing less with less at the moment. Sadly, all this nonsense ends when the minivan or bus load of kids goes up in flames, and the entire media looks at NHTSA and screams “wtf!”

Right now they are allowing Tesla to release a half-baked AP2 to the public, thinking it is AP1 and therefore assumed OK. After ‘the big one’, they will force us all to give up AP1, thinking it is the same as AP2. Pity...this year as a Tesla test pilot has been a great improvement in quality of life. 40k+ miles, mostly on AP.

This wasn’t the plan; just like every other big mistake they make, they just assumed the best possible outcome with zero contingency. Negligent from a management perspective...but entertaining.
 
Good point. What do you think Tesla should be doing to not "stagnate" on the autopilot front that they are not already doing? Do you have any statistical data to show they are stagnating? Do you have some statistical data to show that autopilot driving in its current state is more dangerous than human driving?

I've watched a few videos where the Summon feature was not detecting objects and continued to proceed towards them. There's the auto-park feature, which failed to avoid hitting the curb for me. Also, the sudden drastic drop in speed while using Autopilot has happened to some people when it loses the markings on the road. There are the missing features, such as automatic windshield wipers and on-ramp-to-off-ramp. These are just some of the things I can think of off the top of my head. At least Autopilot ensures the driver is constantly staying alert and interactive, but then Tesla regresses by disabling key safety features like AEB in newer models.

I'm still grateful to have AP and do use it briefly but I wish Tesla would push software updates sooner to improve some of these issues.
 
Exactly. I would argue they are doing less with less at the moment. Sadly, all this nonsense ends when the minivan or bus load of kids goes up in flames, and the entire media looks at NHTSA and screams “wtf!”

This claim has been made dozens of times on TMC with no evidence whatsoever to back it up. The only evidence we do have is that there has been at least one confirmed death in the U.S. while using autopilot and yet autopilot continues, development continues, other cars are becoming more automated etc. There is absolutely no evidence to indicate that bad PR from a single highly publicized event will outweigh the millions of accident-free autopilot miles driven in the minds of the driving public or government agencies and somehow set back the development of autonomous cars. If anything the evidence is the opposite - despite hand-wringing and fear-mongering in publications like Consumer Reports - Tesla sales continue to grow and autopilot is doing just fine.

Right now they are allowing Tesla to release a half-baked AP2 to the public, thinking it is AP1 and therefore assumed OK. After ‘the big one’, they will force us all to give up AP1, thinking it is the same as AP2.

Again - a totally data-free assertion that makes two unsupported claims: that a huge PR-disaster accident is coming that will be Autopilot's fault, AND that this imagined disaster will be the end of assisted driving.

This wasn’t the plan; just like every other big mistake they make, they just assumed the best possible outcome with zero contingency. Negligent from a management perspective...but entertaining.

Again - a claim flying in the face of all available evidence - Tesla's sales growth and its stock price.
 
It's the everyday heroes among us that give me hope.



Never thought about this way - a lot is at stake!



Please stop with the amazing writing. You are hurting my heart - Foster Wallace and Joyce look out.



The right thing is hard - but it's the right thing.



Get out. You mean you write, you're an engineer AND you have ethical credibility? Did I say Joyce needs to watch his back? Scratch that - Leonardo da Vinci is the one who should be nervous.




Dude please - please. You've done enough. We need you - we need your mind. Don't put yourself in danger - for us. The best thing you can do for us is to stay safe and keep writing.



Well - DUH! WTF Elon!



There's a punchline here - but it's ELON.



Well damn - that's all the case I need to hear. Get me outta this damn Tesla and somebody find me a Cadillac.



Wait. Wait. What if you - @imispgh - can police Elon? Like - you could be a special rep of TMC who can sit on Tesla's board, perform surprise inspections of Tesla's simulation labs - and then report back to the forum for a feedback roundtable with Elon and the rest of us? This could be a win-win. Don't say I'm crazy until you think it over. PM me - I have some high level contacts inside Tesla who might consider this idea (seriously).



Concise writing is for the weak and ADHD challenged. Give us everything you've got, please.



Did you ever see that Frank Capra movie - "Mr. Imispgh Goes to Washington?" Seriously - you inspire me that way. PM me for my Tesla contacts.




And THIS is why Michael DeKort is a BOSS - he uses his real name. Bravo. Bravo. Bravo.

@calisnow HOLY BALLS I can't stop laughing at your post.
 
Joshua loved technology and was a successful entrepreneur. He developed several database applications widely used by the Navy. In 2010, he started his own technology company, Nexu Innovations. The company primarily focused on developing and installing Wi-Fi and surveillance systems, but also developed other technology-driven applications.

This fatal and tragic collision was the first documented crash involving the use of driver assist autopilot technology. ...
Well, a statement from Joshua's family isn't necessarily fact. The first Tesla AP death happened in China five months before that :( Who knows, there could be a few more that the world will never know.
I also find it very curious that Joshua's family made their statement the day before the NTSB came out with its conclusion on the case.

Besides, there were many crashes before these two fatal ones. That part of the statement is wrong.
 
...The first Tesla AP death happened in China 5 months before that :( Who knows, there could be few more that the world will never know...

Gao Yaning's family is suing Tesla, claiming AP1 was active at the time.

Tesla has been working with the family to access the car's logs, but the family has been uncooperative.

Thus, the case remains unverifiable, and it still cannot be confirmed as the first DOCUMENTED Autopilot death.

Joshua's family's statement is true. I can rearrange the words to make the meaning clear:

"first documented fatal autopilot crash"

They are not talking about first injury crash either.
 
Please consider changing your approach in creating autonomous vehicles. People’s lives, possibly the industry, your company and your legacy depend on it.


Elon

When you submitted your SpaceX code to NASA for review I believe they failed it for not being properly tested and for having exception handling issues. NASA intervened, forced you to do the right things and saved SpaceX and you. As NHTSA and others have abdicated their positions and deferred to you and the rest of the AV industry, I do not believe there is anyone with enough power to force you to do the right things and save Tesla or you here. I am making the attempt to help you save yourself as well as the many people who will lose their lives, needlessly, as you pursue an untenable process to build your autonomous vehicles. What is at stake here, in addition to the lives of these innocent people, are your employees, shareholders, most of the AV industry because they follow you, your company and you personally.

I believe you are in the process of crossing the line from being famous to being infamous. From saving lives to needlessly costing lives. From ethical to unethical. From being seen as one of the greatest visionaries, humanists and verbs of all time to someone who will be remembered for squandering all of that and becoming a monumental hypocrite. A person who becomes or does the things they say they rail against. As I have almost zero power here, the best I can do is make my case, ask you to take a deep breath, try to put your ego and pride to the side, evaluate the information presented and find the courage to do the right thing. (I say almost zero power because I have done some things that have earned me some ethical and engineering credibility. Should I not convince you to do the right thing here, I will do my best to find another way to force a change.)

The issue is your use of public shadow driving as a primary or significant means of creating your autonomous technology, versus the use of aerospace-level simulation. Setting aside the deplorable and selfish use of your paying customers and their families as your guinea pigs, let’s focus on the objective viability of that approach from a cost, time and engineering point of view. I will start with the punch line – you will never reach autonomy using this approach. You will never save lives using it. You will also make the lives of those people you have stated are necessary casualties, completely unnecessary casualties. When the public, press, insurance companies and lawmakers figure this out, after the first child or family is killed needlessly in one of your vehicles, they will feel betrayed and rightfully determine that you, and the majority of the industry using this approach, are not competent or ethical enough to continue. At least not without far more regulation and delays than you would have had if you had self-policed. You will also never reach the end state because neither you nor anyone else can drive the one trillion (not 6B) miles needed to get there, nor spend the $300B+ to do so.
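For what it's worth, the letter's mileage claim can be sanity-checked with simple arithmetic. In the sketch below, only the one-trillion-mile and $300B figures come from the letter; the fleet size and per-car daily mileage are purely hypothetical assumptions chosen for illustration, not anyone's actual numbers:

```python
# Back-of-envelope check of the letter's validation-mileage claim.
# Source figures: ~1 trillion miles, $300B+ total cost (from the letter).
# Fleet size and daily mileage below are ILLUSTRATIVE ASSUMPTIONS only.

MILES_NEEDED = 1e12          # letter's estimate of validation miles
TOTAL_COST = 300e9           # letter's $300B+ cost estimate

fleet_size = 500_000             # hypothetical shadow-driving fleet
miles_per_car_per_day = 40       # hypothetical average daily mileage

fleet_miles_per_year = fleet_size * miles_per_car_per_day * 365
years_needed = MILES_NEEDED / fleet_miles_per_year
implied_cost_per_mile = TOTAL_COST / MILES_NEEDED

print(f"Fleet miles per year:  {fleet_miles_per_year:.3g}")
print(f"Years to 1T miles:     {years_needed:.0f}")
print(f"Implied cost per mile: ${implied_cost_per_mile:.2f}")
```

Under those assumptions a half-million-car fleet would need on the order of a century to log a trillion miles, which is the kind of timescale argument the letter is making; with different fleet-size assumptions the answer shifts proportionally.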

In an effort to keep this letter concise, please find links to two of my articles. In them I explain all of the issues, their root causes, and how to resolve them in detail. I also provide links to the references I cite.

Letter to Congress - Handling of minimum standards for Autonomous industry

· https://www.linkedin.com/pulse/letter-congress-handling-minimum-standards-industry-michael-dekort/

Who will get to Autonomous Level 5 First and Why

· https://www.linkedin.com/pulse/who-get-autonomous-level-5-first-why-michael-dekort


In closing I ask that you provide me the courtesy of reading the material I provided and endeavor to think about it as separately from your ego and pride as possible. Think about it from the point of view of a human being, an engineer, a business owner and a mentor of a vast number of people. If this weren’t you, and someone else was going through this and asked for your counsel, what would you suggest they do? I am pretty sure that if you made an announcement tomorrow where you explained and owned the mistakes you made, and laid out an ethical and workable plan to resolve them, even if it takes another decade, most would understand, be grateful and supportive. And even if my prognostication is wrong, you would have actually saved the lives you intended to save, and many more. You would get back to being the person you were a short time ago, before your ego and pride led you down that slippery slope. You would bounce back and be just fine.


Respectfully


Michael DeKort

This open letter is what happens when Marques Brownlee and Mark David Chapman simultaneously pee into a fountain as lightning strikes.
 
2) Audi's approach - take responsibility for self-driving, but create gradual, piecemeal solutions to get there faster, yet still in a responsible manner (first a Level 3 traffic-jam pilot, then a Level 4 highway pilot and a Level 4 parking-lot system)...

Is Audi indemnifying owners and drivers while the vehicle is in L3 or L4 mode? If not, I really don't see much difference between what they are doing and what Tesla is doing.