
A Public Letter to Mr. Musk and Tesla For The Sake Of All Tesla Drivers' Safety

There's a thread in another automotive forum I'm a member of talking about how young people these days don't have a clue about how cars work and how to look after them. Not knowing what an oil pressure warning light means (it was referred to as the 'genie lamp' or 'gravy boat' by a couple of clueless drivers), not being able to carry out the simplest maintenance tasks such as checking tyre pressures, topping up the washer bottle, all that sort of thing.

There were a few anecdotes where drivers had accidents and in every case it was the car's fault, or because there was mysterious disappearing 'oil' on the road, or in one case due to rain causing the car to end up on its roof (that must have been one hell of a shower!)

It's debatable whether these issues only apply to young drivers, but on the whole it does seem as though a lot of people these days have no interest in how a car works, no idea what actually happens when something goes wrong and no idea how to correct it. They just treat cars like any other 'device', so the owner's manual is never opened and it's up to the garage or a more responsible adult to put things right for them.

That may be fine for devices such as phones, computers and washing machines but cars (at their current stage of development) require a more responsible attitude, so I agree with others here that more driver education is needed. That goes for all cars, not just Teslas and especially for those with new advanced driving aids.

An example - ABS is now fitted to almost every new car, yet most drivers don't ever press the brake pedal hard enough to utilise it in an emergency and even if they do, they don't understand that you can still steer the car when it operates. Most drivers haven't been shown how it works and explaining the principle of it goes over most people's heads anyway. It's a safety feature which needs to be demonstrated for a driver to use it properly.

Some very high performance road cars come with driver training. There seems to be no shortage of wealthy but inexperienced motorists out there buying Ferraris, McLarens, Porsches, Bugattis etc. and some of these supercars end up being totalled or in a fatal crash very soon after leaving the showroom. Rarely (unless a celebrity is involved) do those stories make national news. It's taken for granted that anyone with enough money should be able to buy a car with massive performance and drive it on the road with a regular driver's license, so when those accidents happen, there's no big backlash. There's always lots of debate about gun control, but we don't hear much debate about car control.

So, we do seem to be experiencing a backlash against Tesla at the moment which I feel is mainly media-driven and I'm sure will subside, but we should also look objectively at what's happening here and see if we can learn from the experience.

I personally would welcome some one-to-one Autopilot training if I choose to have it when I get my Model 3 (I'm by no means convinced I actually NEED AP yet). What I mean is more training than you would get on a short test drive on normal roads. Something on a test track or skid pan where the limits of the car and its driver aids could be explored and real life scenarios tested out before the driver gets into a real situation they don't know how to get out of.

There's more of this sort of training being offered by car manufacturers these days, partly because the cars and their safety systems are becoming more complex and partly because it promotes them as being concerned about their customers' wellbeing. It can only help.

I've been involved in high performance driver training myself and one thing it taught me very early on is that there is a very high proportion of drivers on the roads, some of them with many years of experience, who just don't understand the basic concepts of car control, accident avoidance, hazard perception and planning ahead.

I believe more intensive driver training should at least be offered when a Tesla is purchased (I'm assuming that doesn't happen now, but correct me if I'm wrong). There's no need to disable or restrict the use of Autopilot as long as the driver knows how to use it properly. We can see from cases like the OP that not all Tesla drivers do and currently they don't have enough motivation or opportunity to find out.
 
A Public Letter to Mr. Musk and Tesla For The Sake Of All Tesla Drivers' Safety
From my friend, Mr. Pang, a survivor of the Montana Tesla autopilot crash

My name is Pang. On July 8, 2016, I drove my Tesla Model X from Seattle heading to Yellowstone National Park, with a friend, Mr. Huang, in the passenger seat. When we were on highway I-90, I turned on autopilot and drove for about 600 miles. I switched autopilot off as we exited I-90 in Montana onto State Route 2. After about 1 mile, we saw that the road condition was good and turned on autopilot again. The speed setting was between 55 and 60 mph. After we drove about another mile on State Route 2, the car suddenly veered right and crashed into a safety barrier post. It happened so fast, and we did not hear any warning beep. Autopilot did not slow down at all after the crash, but kept going at the original speed setting and continued to crash into more barrier posts at high speed. I managed to step on the brake, turn the car left and stop it after it had hit 12 barrier posts.

After we stopped, we heard the car making an abnormally loud sound. Afraid that the battery was broken or short-circuited, we got out and ran away as fast as we could. After we ran about 50 feet, we realized the sound was the motor still spinning at high speed. I returned to the car and put it in park, and that is when the loud sound stopped. Our cellphones did not have coverage, so we asked a lady passing by to call 911 on hers. After the police arrived, we found the right side of the car was totally damaged. The right front wheel, suspension and headlight had flown off some distance, and the right rear wheel was crushed out of shape. We noticed that the barrier posts are about 2 feet from the white line. On the other side of the barrier is a 50-foot drop, with a railroad at the bottom and a river beyond. If the car had rolled down that steep slope, it would have been really bad.
Concerning this crash, we want to make several things clear:
1. We know that when Tesla autopilot is on and the driver's hands are not on the steering wheel, the system will issue a warning beep after a while. If the driver's hands remain off the steering wheel, autopilot will slow down until the driver takes over both the steering wheel and the accelerator. But we did not hear any warning beep before the crash, and the car did not slow down either. It suddenly veered right and crashed into the barrier posts. Apparently the autopilot system malfunctioned and caused the crash. The car was running between 55 and 60 mph, and the barrier posts were just 3 or 4 feet away. Less than 1/10 of a second passed from the drift to the crash. It is impossible for a normal driver to avoid that in such a short time.
2. I was horrified by the fact that the Tesla autopilot did not slow the car down at all after the initial crash. After we hit the first barrier post, autopilot continued to drive the car at 55 to 60 mph and crashed into another 11 posts. Even after I stopped the car, it was still trying to accelerate, spinning the motor at high speed. What if it had not been barrier posts on the right side, but a crowd?
3. Tesla never contacted me after the accident. Tesla simply issued a conclusion without a thorough investigation, blaming me for the crash. Tesla was trying to cover up the lack of dependability of the autopilot system by blaming everything on my hands not being on the steering wheel. Tesla was not interested in why the car veered right suddenly, nor why the car did not slow down during the crash. It is clear that Tesla is selling a beta product with bugs to consumers, and asking consumers to bear the liability for the buggy autopilot system. Tesla is using all Tesla drivers as lab rats. We are willing to talk to Tesla about the accident anytime, anywhere, in front of the public.
4. CNN's later article about the accident quoted our interview out of context. I did not say that I do not know whether Tesla or I should be responsible for the accident. I might consider buying another Tesla only if they can iron out the instability problems in their system.
As a survivor of such a bad accident, and a past fan of Tesla technology, I now realize that life is the most precious thing in this world. Any advance in technology should be based on the prerequisite of protecting life to the maximum extent. In matters of life and death, no technology has the right to ignore life; any pursuit of or dream about technology should first show respect for life. For the sake of the safety of all Tesla drivers and passengers, and all other people sharing the road, Mr. Musk should stand up, face the challenge of thoroughly investigating the cause of the accident, and take responsibility for the mistakes of Tesla's product. We are willing to talk to you publicly, face to face, at any time to give you all the details of what happened. Mr. Musk, you should immediately stop trying to cover up the problems with the Tesla autopilot system and blaming consumers.

This guy has just provided Tesla with a clear admission of negligence. And no, Pang, there will be no compensation, unless it's you compensating Tesla for wasting their time.
 
Let me start off by saying that I agree that many people don't know how to drive, and need to be in situations so they know what to do.

When I was just learning how to drive, my dad took me out onto a parking lot covered in ice and snow. He told me to hit the gas, then the brakes. Do that while turning. Let go of the brakes while turning. Do this, do that, etc. I was like, "OK, this is cool, but I'll never need this." A few months later I was driving down the road, going for a left turn, and it was snowing. I applied the brakes to slow down, mid-turn, only to realize that I had locked the wheels and my car was headed directly at a bus stop with people at it. Turning the wheel did nothing. In that split second, I remembered what I was taught, let go of the brakes, and avoided a very serious accident.
So I 100% agree that training helps in those corner situations that you don't encounter everyday. And "knowing" what to do, and actually trying to do it are two very different things.

With that being said, there are 2 issues with your idea:
1. Who's going to pay for it? If it's included in the price of the car, the price of the car just went up -- and people will complain. You'll also have the crowd of people who think they're driving experts, and don't need some silly class to teach them how to be better.
Also, you'd need a dedicated track/driving area for where to test. In large cities that's going to be hard and expensive.
2. A driver's license is given to pretty much anyone in the US; you just need to not kill your instructor (hyperbole of course, but not too far from the truth). I think in the UK and other European countries more is required than just a test to get your license, no? I wouldn't want someone who barely knows how to drive to be on a race course testing the limits of the car, even with an instructor.
 
Agree, timing is important. Even if your hands are on the wheel, and the car swerves, you may not have enough time at freeway speeds for your brain to instruct your muscles to take control.

My '65 Ford had enough play in the steering that it felt like there was always a surprise, especially with bias-ply tires. I never had a problem reacting in time: hands on wheel, "feeling" the road, brain engaged.
 
1. Who's going to pay for it? If it's included in the price of the car, the price of the car just went up -- and people will complain. You'll also have the crowd of people who think they're driving experts, and don't need some silly class to teach them how to be better.
Also, you'd need a dedicated track/driving area for where to test. In large cities that's going to be hard and expensive.

I'm not suggesting Tesla should pay for it, although maybe they could subsidise the cost. Very few manufacturers include any driver training in the price of the car, but it's not beyond the realms of possibility if it's organised in the right way to keep the cost per head to a minimum.
Unless it's made mandatory (which I think is unrealistic) you'll always get the driving gods who know it all. They can sign a disclaimer when they buy the car to say they have read the owner's manual from cover to cover and fully understand how to properly use all the features of the car :rolleyes:

You'd be surprised how many venues are around which could be suitable for some basic driver training. It doesn't need to be a track as such; even a large parking lot could be used. Motorsport events are held all over the place, even in a small country like the UK with limited facilities. Driver training is less of a nuisance than racing/rallying, particularly with EVs as they make less noise. You just need a large, flat piece of tarmac and some traffic cones. I know, I've done it!

2. A driver's license is given to pretty much anyone in the US; you just need to not kill your instructor (hyperbole of course, but not too far from the truth). I think in the UK and other European countries more is required than just a test to get your license, no? I wouldn't want someone who barely knows how to drive to be on a race course testing the limits of the car, even with an instructor.

I did the Florida test many years ago: a multiple-choice test followed by a drive around the block and a parallel parking test. No wonder Florida's roads are considered dangerous! TBH, the driving test in the UK is not a lot better. You just need to see new drivers on the roads to realise that.

As I said above, I'm not really talking about putting novices on a race track as such but if you get them early enough before they develop bad habits, you would be surprised how well young drivers do in a controlled environment being taught proper car control. I've been a race instructor and I've sat next to 17 year olds driving Porsches around race tracks, so I do know what I'm talking about in this respect.
 
Why all the hate? This person should be allowed to present his side of the story without being personally attacked.

It's surprising Tesla didn't even reach out to the driver. Doesn't Tesla want the full story to find out what went wrong so they can improve the system?
The first thing the person should have done is create their own account, not go through some proxy to convey their message. My bet is that Tesla will say they attempted to contact the owner but never heard back, similar to the art gallery owner who also claimed AutoPilot crashed his car. I believe Tesla's protocol is to contact the owner immediately when logs indicate there has been a crash, to make sure they are unharmed.
 
There's a little hate, but really not much more than you find in any internet discussion. What there is is a lot of skepticism, and, IMO, well-deserved skepticism. Questioning someone's story doesn't, by itself, constitute a personal attack. There are many things in his story that just don't appear to add up, so at the very least he needs to provide some clarification and elaboration if he wants this community to take him seriously.
 
Let me posit a thought experiment: assume, for a moment, that Autopilot *is* beta (it is) software, and it has bugs (I guarantee you, as a software engineering veteran, that it has hundreds, if not thousands, of bugs): is it not possible that the OP is stating exactly what happened?

If the autopilot got stuck in a loop, or had a memory error (say, from a random alpha particle strike that clobbered just a few more bits than the ECC can handle), or hit any other of a practically infinite set of failure modes the brilliant programmers at Tesla didn't handle, then it is entirely plausible that his car jerked to the side, and entirely plausible that it happened without warnings and too quickly to react to. It's even plausible that his motor kept running after the accident. It's a software-driven car; all bets are off when you hit a major bug.

I'm a fan, and a believer: autonomous driving will come, sooner than we expect. But it will have bugs, especially as Tesla (and society in general) moves towards stochastic AI-based programming models. Meaning: (a) we don't know how the code actually works, and (b) it's *stochastic*, which is a fancy word for probabilistic, which means it usually works. You can get close to working 100% of the time, but you can't quite get there.
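To put a rough number on that "can't quite get there" point, here's a toy back-of-the-envelope sketch. Every figure in it is invented purely for illustration; the per-decision reliability and decisions-per-mile values are made-up parameters, not anything from Tesla:

```python
# Toy illustration with invented numbers: even a per-decision success
# rate of 99.999% leaves at least one failure near-certain over enough
# decisions, because the small error probabilities compound.
per_decision_reliability = 0.99999  # hypothetical success rate per decision
decisions_per_mile = 100            # hypothetical control decisions per mile
miles = 10_000                      # miles driven

decisions = decisions_per_mile * miles           # 1,000,000 decisions
p_all_ok = per_decision_reliability ** decisions # chance every one is fine
print(f"Probability of zero bad decisions over {miles} miles: {p_all_ok:.6f}")
# With these made-up numbers, that probability is well under 0.01%.
```

The exact numbers don't matter; the point is that "usually works" multiplied over millions of decisions is very different from "always works".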

I really worry about Tesla's current approach to handling these situations: "we looked at the telemetry, and it proves X". If there is a bug, then the telemetry can't be trusted. When Tesla says "look, the telemetry shows the driver wasn't steering", that just means that's what the car thought the driver was doing, not what was actually happening.
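As a contrived sketch of why that matters (all names and values here are invented; this is not Tesla's code or data format): if a bug sits upstream of both the control logic and the logger, the log faithfully records the wrong value.

```python
# Contrived example: a hypothetical calibration bug zeroes out the torque
# reading, so the control logic AND the telemetry both see "hands off".
def read_driver_torque(raw_adc: int) -> float:
    scale = 0.0  # bug: should be some nonzero calibration factor
    return raw_adc * scale

raw = 512  # the driver really is gripping the wheel (nonzero sensor input)
telemetry = {"driver_torque": read_driver_torque(raw)}
print(telemetry)  # the log dutifully "proves" hands-off: {'driver_torque': 0.0}
```

The log is internally consistent and still wrong, because it records the computed value, not ground truth.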

When we, as a community, jump to defend Tesla, it's understandable (we love that company!), but sometimes we should pause and consider that maybe there *is* something there.
Probably because there are known limitations and likely bugs (or better put, unaccounted for/untested/unintended use cases), users are guided directly to NOT use it in specific situations like this one. The driver chose to ignore this guidance.
 
Just to follow on from my earlier posts, it's made me think about something I saw a few weeks ago. I was driving on a local road (fairly narrow with two way traffic) and came up behind a Mini Cooper with a 'P' plate on, which is a voluntary sign new drivers can put on the back of their car to signify they've just passed their test. The idea is so that other drivers will give them a bit more room and not get impatient if they make small mistakes. That's the idea, anyway...

Now first off, a Mini Cooper is a fairly quick car to have if you've only just passed your test, but that's a different conversation.

What I then witnessed was the Mini being driven quite erratically and specifically driving way too close to the nearside kerb, at some points clipping hedges by the side of the road and dropping a wheel into the gutter and almost up onto the sidewalk. This continued for a good few miles through towns and on country roads. I kept my distance but made a note to pass as soon as it was safe to do so as I really didn't want to be following a liability.

Naturally, as soon as we came to a nice straight bit of road, the driver would put their foot down so I couldn't safely get past. I was pretty sure this was deliberate, although some people do drive like that naturally.

I eventually picked my moment and made it past, feeling a lot safer with that Mini behind me.

So, in the context of this discussion, here's a driver with a fairly quick car but not a lot of experience who is obviously either not completely confident driving in narrow lanes or just can't judge the width of their car very well. That driver is going to have some sort of accident at some point. Maybe they'll just scuff their rims, but they could hit a pedestrian with their side mirror or mount the sidewalk and lose control completely. Something is going to happen if they continue to drive like that with no-one there to help them. But they obviously did well enough on their test to pass.

If that driver had a system like Autopilot in their car, they could benefit from it not by enabling it and paying LESS attention to the road, but by paying MORE attention to the way the system steers, controls speed, brakes etc. and actually learning how to drive better without having an instructor or more experienced driver in the car with them. So Autopilot could proactively improve driving standards if used in the right way. Just a thought and maybe another way to look at all of this.
 
The problem with your logic is: Tesla only has certain information to work with. Everything else is fluff, or unknowns. There can be no guesses here. As in a criminal trial, one must follow the evidence and not suppose what could have happened; that's the defense attorney's job. So Tesla's only course of action in these cases is to take the logs as evidence and follow them. The rest of us can throw out infinite possibilities.
 
Why all the hate? This person should be allowed to present his side of the story without being personally attacked.

If that's all they did, I don't think there would be much hate. But they not only presented their side of the story, they accused Tesla of covering up problems with Autopilot. Combine that with the fact that the crash was obviously caused by criminally stupid behavior on the part of the driver, and strong criticism is both expected and justified.
 
Now first off, a Mini Cooper is a fairly quick car to have if you've only just passed your test, but that's a different conversation.
Nimble, yes. Quick? That's debatable, even for the S.

What I then witnessed was the Mini being driven quite erratically and specifically driving way too close to the nearside kerb, at some points clipping hedges by the side of the road and dropping a wheel into the gutter and almost up onto the sidewalk. This continued for a good few miles through towns and on country roads. I kept my distance but made a note to pass as soon as it was safe to do so as I really didn't want to be following a liability.
I don't know if they have addressed this in newer models but I still have a 2006 S and they're notorious for needing their motor mounts, strut mounts and struts replaced. I had two rims replaced due to potholes. Most likely the person will need alignment done too.

Back on topic, imagine if this person were to drive a big car like Model S.
 
"Why all the hate?"

Doing stupid things that are expressly prohibited by Tesla (driving on AP in conditions not suitable for it), then blaming Tesla for the crash, demanding compensation, and taking it to the media to do further damage? What love do you expect from this crowd?

Sure, they can go to Seeking Alpha, the WSJ or any number of rags fueled by Koch money and keen on seeing Tesla fail, and make their case, and I'm sure they will be welcomed with open arms.
 
If that's all they did, I don't think there would be much hate. But they not only presented their side of the story, they accused Tesla of covering up problems with Autopilot. Combine that with the fact that the crash was obviously caused by criminally stupid behavior on the part of the driver, and strong criticism is both expected and justified.

"criminally stupid behavior?"

Well, that's not good, because by that standard YouTube is filled with videos of AutoPilot being used in a "criminally stupid manner". This has been the case for almost a year now without an aggressive response by Tesla to stop it, either through education or through software updates.
 
Yes, using Autopilot hands-off on a narrow, twisty road with a barrier at the edge is criminally stupid. Do you disagree?

YouTube is full of criminally stupid behavior. Just search "burnout crash" and enjoy an endless parade of examples. People have been irresponsibly crashing their Mustangs for ages now without an aggressive response by Ford to stop it. Where's all the outrage there?

It's not Tesla's responsibility to save us from ourselves. If they can, it would be great, but they are not to blame when someone uses the system so badly.
 

Maybe Tesla should add that warning in the fine print.

If you don't have your hands on the wheel of a product that is marketed as "auto piloting" and "almost twice as safe as a person", and anything bad happens, you will be considered a criminal. Nice.