Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

A Public Letter to Mr. Musk and Tesla For The Sake Of All Tesla Drivers' Safety

If what he says is true, that would be very disconcerting, to say the least. AP suddenly turns right, strikes several posts, and continues attempting to accelerate. It almost sounds like someone hacked the X and remotely drove it. This makes no sense to me, Mr. Pang, and I'm not calling you a liar. Would Tesla cover this up if true? Damn right! I hope there is a better explanation for this. From what we already know, it sounded like the driver had been reckless, which is why he was issued a summons.

I've been debating whether or not I wanted to jump into this thread (or any thread) with my experience, which is quite similar to this description...

About 3 weeks ago, my P90DL made an almost identical manoeuvre to the one described in the OP here... driving along (I was on a divided highway, median in the middle, 2 lanes going in each direction, center lines and stripes on the sides, bright, sunny day around noon, etc. - it was perfect Autopilot conditions), my car crossed a 6-foot section of replaced road that had no lines and suddenly veered right, straight into traffic cones, knocking my passenger-side mirror loose. As soon as the car started veering right I jerked the wheel to the left, but it was so sudden and unexpected that the car struck one of the cones before I could pull it back.

Yes, I have full dashcam video of the incident. You can hear the bong-bing of my taking it out of Autopilot as soon as it starts to dive for the cones and hits one, and everyone in the car asking what the F@#$% just happened. I have submitted it to Tesla, but their response has been less than desirable: it's been basically non-existent beyond "Yes, it's a known problem."

I've been hesitant to post the video or say anything publicly, due to all the bad press that's going around currently, and my local SC has always been very good to me. I don't necessarily want to add fuel to that fire or throw anyone under the bus. However, the longer I go without hearing anything useful out of Tesla, the more I reconsider posting this description and video publicly. I'm not saying it was a bad accident or a serious incident - it was minor in the grand scheme of things - but it really made me reconsider the reliability of Autopilot. Even a note from Tesla saying "Oh, the car thought there was an elk about to jump out and veered to the right" is better than "Known problem. Sorry."
 
As road conditions became increasingly uncertain, the vehicle again alerted you to put your hands on the wheel. No steering torque was then detected until Autosteer was disabled with an abrupt steering action.

it was perfect Autopilot conditions), my car crossed a 6-foot section of replaced road that had no lines and suddenly veered right, straight into traffic cones, knocking my passenger-side mirror loose. As soon as the car started veering right I jerked the wheel to the left,

I'm no expert, but the way I read it, Tesla are saying that Pang disabled Autosteer with a steering input. You are saying Autosteer directed your car into traffic cones before you took over and corrected.

Have I got that right?

In both cases, from what I have so far read about Autopilot and Autosteer, the driver should be paying attention at all times and be prepared to take over/correct anything that Autopilot/Autosteer does.

Please correct me if I have misunderstood.
 
I've been debating whether or not I wanted to jump into this thread (or any thread) with my experience, which is quite similar to this description...

About 3 weeks ago, my P90DL made an almost identical manoeuvre to the one described in the OP here... driving along (I was on a divided highway, median in the middle, 2 lanes going in each direction, center lines and stripes on the sides, bright, sunny day around noon, etc. - it was perfect Autopilot conditions), my car crossed a 6-foot section of replaced road that had no lines and suddenly veered right, straight into traffic cones, knocking my passenger-side mirror loose. As soon as the car started veering right I jerked the wheel to the left, but it was so sudden and unexpected that the car struck one of the cones before I could pull it back.

Yes, I have full dashcam video of the incident. You can hear the bong-bing of my taking it out of Autopilot as soon as it starts to dive for the cones and hits one, and everyone in the car asking what the F@#$% just happened. I have submitted it to Tesla, but their response has been less than desirable: it's been basically non-existent beyond "Yes, it's a known problem."

I've been hesitant to post the video or say anything publicly, due to all the bad press that's going around currently, and my local SC has always been very good to me. I don't necessarily want to add fuel to that fire or throw anyone under the bus. However, the longer I go without hearing anything useful out of Tesla, the more I reconsider posting this description and video publicly. I'm not saying it was a bad accident or a serious incident - it was minor in the grand scheme of things - but it really made me reconsider the reliability of Autopilot. Even a note from Tesla saying "Oh, the car thought there was an elk about to jump out and veered to the right" is better than "Known problem. Sorry."
You have made what sounds to me like a very reasonable post and have already received a response saying "driver should be paying attention at all times and be prepared to take over ...", a clear suggestion that you were sloppy or negligent. I think that is quite unfair. I think you should be commended for letting us know about this "known problem".
 
Couldn't agree more and I hope it doesn't continue.
 
However, the longer I go without hearing anything useful out of Tesla, the more I reconsider posting this description and video publicly. I'm not saying it was a bad accident or a serious incident - it was minor in the grand scheme of things - but it really made me reconsider the reliability of Autopilot. Even a note from Tesla saying "Oh, the car thought there was an elk about to jump out and veered to the right" is better than "Known problem. Sorry."

First, thanks for posting... the media will find a way to shiat on Tesla whether you post or not. However, I'm confused as to what you expect to hear. If it's for them to pay for the damage, I would have to say they can't know the exact conditions surrounding the "accident", so I don't think it's reasonable to expect them to just "pay up". If it's that you expect to hear they are actively working on making sure the vehicle is no longer confused by abnormal road conditions... well, that's occurring every second of every day.

So, while I sympathize with the damage... I don't really understand the concern.
 
By saying it is a "known issue", it says to me it's something they know about and are working on. Nowhere in his post did I see any inference of blame. I am happy when people report things like this to Tesla, because it benefits all of us.
 
I was paying attention at all times and I was prepared to take over, which is why I was able to react and only hit the mirror. The dive for the cones was sudden: it goes from my lane to about 3 feet into the other lane in the space of two and a half dashed lines. In the video, as soon as the car starts to dive you can hear the bong-bing of Autopilot being disabled, but by then the car was already across the lines and headed for a cone.



Well, I would like them to replace the mirror, or fix it if it's fixable. As I said, it's not a big deal as far as damage goes... but to address "they can't know the exact conditions surrounding the accident": they most certainly can. I have 1080p dashcam footage front and rear, plus whatever logging Tesla does for the car. I'm sure they can know the exact conditions better than I did at that specific point in time. They should easily be able to correlate their logging data with the camera data to determine exactly what happened.

As for what I would like, it's that they acknowledge it's a problem and work on a fix, since it can potentially be a serious issue. Privately, I've heard that it's a known problem, but publicly all I've heard is that Autopilot is working as intended, which is ... not really correct. In this instance, a dive to the right isn't a huge deal, but I can easily imagine a more serious consequence from such a bug. If it's a known and acknowledged problem, I'd want a fix to be a high priority - something I've seen no evidence of up to this point. Heck, if they just gave me a detailed account of what happened and why, I'd be satisfied with that.

Either way, I'm content to leave it be for now and await 8.0 and/or an AP update that adds significant new functionality (such as radar distance ranging, etc.). But at some point, if there's no imminent fix, they need to warn people about this issue. I'd definitely have been on my guard if I had known about it prior to this.
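As a sketch of the kind of correlation being described, matching vehicle log events to dashcam frames is essentially a nearest-timestamp lookup. All field names and sample values below are invented for illustration; this is not Tesla's actual log format:

```python
# Sketch: match each vehicle log event to the nearest dashcam frame
# by timestamp. Field names and sample data are invented for illustration.
from bisect import bisect_left

def nearest_frame(frame_times: list[float], t: float) -> float:
    """Return the dashcam frame timestamp closest to event time t."""
    i = bisect_left(frame_times, t)
    candidates = frame_times[max(0, i - 1):i + 1]
    return min(candidates, key=lambda f: abs(f - t))

# Dashcam at 30 fps -> one frame every ~0.033 s; 10 seconds of video
frame_times = [round(n / 30, 3) for n in range(300)]

# Hypothetical log events (seconds from video start, event name)
log_events = [(2.410, "steering_angle_spike"), (2.520, "autosteer_disengaged")]
for t, event in log_events:
    print(event, "->", nearest_frame(frame_times, t))
```

With both sources on a common clock, each logged steering event lands on a specific video frame, which is exactly the kind of reconstruction the poster is suggesting Tesla could do.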

My experience in using AP is that the most important thing is choosing where to use it, even on divided highways. I am always extra careful in construction zones, because lanes may disappear, and I apply the brake early when I see traffic suddenly stopping ahead. So for me, AP has made my trips safer. The technology is not perfect and can't handle challenging conditions; in the meantime, human drivers are imperfect too. That's why we should use AP as an assistant, with our own good judgement.
 
I've been debating whether or not I wanted to jump into this thread (or any thread) with my experience, which is quite similar to this description...

About 3 weeks ago, my P90DL made an almost identical manoeuvre to the one described in the OP here... driving along (I was on a divided highway, median in the middle, 2 lanes going in each direction, center lines and stripes on the sides, bright, sunny day around noon, etc. - it was perfect Autopilot conditions), my car crossed a 6-foot section of replaced road that had no lines and suddenly veered right, straight into traffic cones, knocking my passenger-side mirror loose. As soon as the car started veering right I jerked the wheel to the left, but it was so sudden and unexpected that the car struck one of the cones before I could pull it back.

Yes, I have full dashcam video of the incident. You can hear the bong-bing of my taking it out of Autopilot as soon as it starts to dive for the cones and hits one, and everyone in the car asking what the F@#$% just happened. I have submitted it to Tesla, but their response has been less than desirable: it's been basically non-existent beyond "Yes, it's a known problem."

I've been hesitant to post the video or say anything publicly, due to all the bad press that's going around currently, and my local SC has always been very good to me. I don't necessarily want to add fuel to that fire or throw anyone under the bus. However, the longer I go without hearing anything useful out of Tesla, the more I reconsider posting this description and video publicly. I'm not saying it was a bad accident or a serious incident - it was minor in the grand scheme of things - but it really made me reconsider the reliability of Autopilot. Even a note from Tesla saying "Oh, the car thought there was an elk about to jump out and veered to the right" is better than "Known problem. Sorry."

It was quite serious - what if those cones had been humans? Or steel girders? Or telephone poles? Or baby carriages?

Please post the video. That is exactly what this forum is for.
 
Reading this - I'm here because of Pang's post - is disturbing. Because:
1. Racism - too many posts talking about Pang's possible lack of English acumen, down to his not being able to 'understand the manuals'.
2. Victim blaming - in Pang's case, and other Tesla crashes, it seems most Tesla fans blame the driver in 'Autopilot' cases.
3. Don't report - as in the case of Naonak above, Tesla fans would rather not report incidents than see Tesla admit to and correct mistakes.

The lack of rationality seen after the major Autopilot fatality - people saying 'it's just one fatality' - has continued. One fatality for a company that makes fewer than 100,000 cars a year is too many. Failing to see a large truck in broad daylight and crashing under it is a material event with massive consequences.

I was the dude telling people about Tesla and Elon Musk for the past 2-3 years over drinks, parties, meetings, etc. Now there's a lot about this company and its followers that's disturbing.
 
To be clear on Naonak's post: it's very insightful and needs to be reported, and the video made public.
 
The topic of English comprehension was actually raised by Pang in one of the media interviews he did following the accident.
Thank you for addressing one point out of the three issues raised, with a vague defence of said point, which I will in turn counter by saying there's no verifiable proof that he was not English-proficient, and there's a public letter that he wrote in good English in this very forum. Thank you for your insight.
 
That (the open letter) was apparently written by the original poster of this thread on behalf of his friend, Mr. Pang, and not by Mr. Pang himself. In Tesla's response to the open letter (post #92 above), they said they had called Mr. Pang on the day of the accident and talked to him through his interpreter. A couple of days later, Tesla had a Mandarin-speaking employee call his house again, and this time they spoke to his wife. So, yes, there is evidence that Mr. Pang is not fluent in English.

Whether or not that lack of language proficiency contributed to the accident is another matter.
 
Dear Mr. Pang,

We were sorry to hear about your accident, but we were very pleased to learn both you and your friend were ok when we spoke through your translator on the morning of the crash (July 9). On Monday immediately following the crash (July 11), we found a member of the Tesla team fluent in Mandarin and called to follow up. When we were able to make contact with your wife the following day, we expressed our concern and gathered more information regarding the incident. We have since made multiple attempts (one Wednesday, one Thursday, and one Friday) to reach you to discuss the incident, review detailed logs, and address any further concerns and have not received a call back.

As is our standard procedure with all incidents experienced in our vehicles, we have conducted a thorough investigation of the diagnostic log data transmitted by the vehicle. Given your stated preference to air your concerns in a public forum, we are happy to provide a brief analysis here and welcome a return call from you. From this data, we learned that after you engaged Autosteer, your hands were not detected on the steering wheel for over two minutes. This is contrary to the terms of use when first enabling the feature and the visual alert presented to you every time Autosteer is activated. As road conditions became increasingly uncertain, the vehicle again alerted you to put your hands on the wheel. No steering torque was then detected until Autosteer was disabled with an abrupt steering action. Immediately following detection of the first impact, adaptive cruise control was also disabled, the vehicle began to slow, and you applied the brake pedal.

Following the crash, and once the vehicle had come to rest, the passenger door was opened but the driver door remained closed and the key remained in the vehicle. Since the vehicle had been left in Drive with Creep Mode enabled, the motor continued to rotate. The diagnostic data shows that the driver door was later opened from the outside and the vehicle was shifted to park. We understand that at night following a collision the rotating motors may have been disconcerting, even though they were only powered by minimal levels of creep torque. We always seek to learn from customer concerns, and we are looking into this behavior to see if it can be improved. We are also continually studying means of better encouraging drivers to adhere to the terms of use for our driver assistance features.

We are still seeking to speak with you. Please contact Tesla service so that we can answer any further questions you may have.

Sincerely,
The Tesla team

The driver is primarily at fault for engaging Autopilot in an inappropriate situation. But it seems they don't speak English - and the person who wrote their public statement is also pretty poor at English (I count about 30 grammatical errors). In fact it's so poor that it's causing more confusion! "after it crashed 12 barrier posts" - you waited until it crashed into 12 posts? Or do you mean "by the time I'd corrected the steering it had already hit 12 posts"?
They probably clicked through the warnings, a bit like people clicking through Microsoft (or anyone else's) licence / user agreements - clicking "yeah, fine - let me use it, let me use it!!!" They had a new car with a new gimmick and wanted it to drive them home.

Presumably, if these are posts down the side of the road and the car is doing 55-60 mph, the proximity sensors around the car are going to be very confused by posts flashing up (if the range is approx 12 ft / 4 m). The camera might not work well at night, so it won't see the edge of the road properly either. It might not see the white lines at night, and they may not be complete. The forward radar isn't seeing a continuous barrier at the sides.

They were doing 55-60 mph and the barrier posts were 10 feet apart. Physics tells you that if you hit something on the right-hand side, the impact applies a turning force that steers you into the next post, and the next, and the next. At 55 mph you are covering roughly 80 feet per second, so with posts 10 feet apart you're going to be hitting about 8 a second - if you hit one, you're going to hit lots until the car comes to a stop!!!
What caused the car to hit the first one is you using Autopilot on an inappropriate road, at night!
It came to a stop in 2 seconds - yeah, that sounds right. Given the late-night drive, the fact you'd been driving for 500 hours - how was your attention? You obviously didn't want to be driving yourself.
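Those numbers can be sanity-checked with a few lines of arithmetic (a sketch, assuming the 55-60 mph speed, 10 ft post spacing, and 2-second stop described above):

```python
# Back-of-the-envelope check of the post-impact rate and stopping
# deceleration described above (55 mph, posts 10 ft apart, 2 s stop).

MPH_TO_FPS = 5280 / 3600  # feet per second per mph

speed_fps = 55 * MPH_TO_FPS          # ~80.7 ft/s at 55 mph
posts_per_second = speed_fps / 10    # posts spaced 10 ft apart

# Stopping from ~57.5 mph (midpoint of 55-60) in 2 seconds:
decel_fps2 = 57.5 * MPH_TO_FPS / 2   # ft/s^2
decel_g = decel_fps2 / 32.2          # as a multiple of gravity

print(f"{posts_per_second:.1f} posts/s, {decel_g:.2f} g average deceleration")
```

The roughly 1.3 g average deceleration is consistent with a car shedding speed through repeated impacts plus braking, so the 2-second figure is at least plausible.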

One way Tesla is at fault: calling it Autopilot when it's not. An autopilot that requires the driver's hands on the wheel is by definition not aeroplane-style autopilot, beta or not. Tesla has an interest in selling software ($2,000, at zero incremental cost to them per sale) to as many car buyers as possible. "Autopilot" implies too much. A plane can fly on autopilot for 12 hours from just after take-off - but the sky at 37,000 ft does not have children playing ball, cyclists, joggers, fire engines, old ladies, curbs, trees, road markings, etc. to cause a crash in under a second. "Autopilot" helps sell it.

Personally, I think that with "autopilot" people will become lazier behind the wheel and less attentive - start playing Pokémon Go, reading their phones, watching the clouds, etc. - so contrary to causing fewer accidents, "autopilot" (to start with) might cause more. Some types of hazard only people will perceive, for now and possibly forever. Damp road, a bit of fog, parked cars everywhere - will autopilot be looking for children playing ball in the side street? There's a cycle event going on - will autopilot become more cautious?

Some people claim seatbelts + airbags + SUVs give people a false sense of safety - they're in their safe cocoon and will drive with less caution, and faster, in less safe conditions.

Also, one thing I don't understand: I thought that if no pressure is felt on the steering wheel, the car will bring itself to a halt and apply the emergency flashers? I thought this was a driver aid in case the person has passed out or is suffering a medical emergency. Two minutes is a long time - why didn't the car slow itself down, thinking the driver was unconscious?
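The escalating behaviour the poster expects might look something like the following sketch. To be clear, every threshold and response here is invented for illustration; this is not Tesla's actual hands-off logic:

```python
# Illustrative sketch of an escalating hands-off-wheel watchdog.
# All thresholds and responses are invented; NOT Tesla's actual logic.

def hands_off_response(seconds_hands_off: float) -> str:
    """Map continuous hands-off time to an escalating response."""
    if seconds_hands_off < 15:
        return "no action"
    if seconds_hands_off < 30:
        return "visual alert"
    if seconds_hands_off < 60:
        return "audible alert"
    # Beyond this point the driver may be incapacitated:
    # slow the car, engage hazard flashers, and come to a stop.
    return "slow to stop with hazard flashers"

# Two minutes hands-off, as in the logs quoted above, would be deep
# into the final stage under this sketch.
print(hands_off_response(120))
```

Under a scheme like this, the two minutes reported in Tesla's letter would have triggered the slow-to-stop fallback long before the crash, which is exactly the gap the poster is asking about.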
 
Reading this - I'm here because of Pang's post - is disturbing. Because:
1. Racism - too many posts talking about Pang's possible lack of English acumen, down to his not being able to 'understand the manuals'.
2. Victim blaming - in Pang's case, and other Tesla crashes, it seems most Tesla fans blame the driver in 'Autopilot' cases.
3. Don't report - as in the case of Naonak above, Tesla fans would rather not report incidents than see Tesla admit to and correct mistakes.

The lack of rationality seen after the major Autopilot fatality - people saying 'it's just one fatality' - has continued. One fatality for a company that makes fewer than 100,000 cars a year is too many. Failing to see a large truck in broad daylight and crashing under it is a material event with massive consequences.

I was the dude telling people about Tesla and Elon Musk for the past 2-3 years over drinks, parties, meetings, etc. Now there's a lot about this company and its followers that's disturbing.

I hadn't posted - but to defend this...
1a. If the driver is being asked (presumably in English) to accept "user acceptance limitations" before enabling it, and is ignoring them / using the feature inappropriately, then that is their fault. Did the driver have Malay as the default language setting in the car so they could read the warnings?
1b. Their English "spokesperson" is not very good at English and is confusing things with their lack of English. Having a second person relay the account adds more confusion through Chinese-whispers issues anyway.
Some examples from the first paragraph:
"The speed setting was between 55 and 60 mph" - "We had set the speed to ..." / "the speed was set to"
"and continued to crash into more barrier posts in high speed" - "at high speed" (speed is not a gear)
"after it crashed 12 barrier posts" - what??? You waited until it crashed into 12 posts? Or do you mean "by the time I'd corrected the steering it had already hit 12 posts"?
"short circuited" - "had short-circuited"
"After we ran about 50 feet, we found the sound was the engine" - you did not "find" (unless you'd examined the car); you meant "realised"
"I returned to the car and put it in parking" - "put it in park"
 
I've been debating whether or not I wanted to jump into this thread (or any thread) with my experience, which is quite similar to this description...

About 3 weeks ago, my P90DL made an almost identical manoeuvre to the one described in the OP here... driving along (I was on a divided highway, median in the middle, 2 lanes going in each direction, center lines and stripes on the sides, bright, sunny day around noon, etc. - it was perfect Autopilot conditions), my car crossed a 6-foot section of replaced road that had no lines and suddenly veered right, straight into traffic cones, knocking my passenger-side mirror loose. As soon as the car started veering right I jerked the wheel to the left, but it was so sudden and unexpected that the car struck one of the cones before I could pull it back.

Yes, I have full dashcam video of the incident. You can hear the bong-bing of my taking it out of Autopilot as soon as it starts to dive for the cones and hits one, and everyone in the car asking what the F@#$% just happened. I have submitted it to Tesla, but their response has been less than desirable: it's been basically non-existent beyond "Yes, it's a known problem."

I've been hesitant to post the video or say anything publicly, due to all the bad press that's going around currently, and my local SC has always been very good to me. I don't necessarily want to add fuel to that fire or throw anyone under the bus. However, the longer I go without hearing anything useful out of Tesla, the more I reconsider posting this description and video publicly. I'm not saying it was a bad accident or a serious incident - it was minor in the grand scheme of things - but it really made me reconsider the reliability of Autopilot. Even a note from Tesla saying "Oh, the car thought there was an elk about to jump out and veered to the right" is better than "Known problem. Sorry."

Of course, the above could read like an attack ad from the Koch brothers trying to attack Tesla in the forums. :)
I haven't seen your other posts - but without evidence, anybody who:
1. offers other evidence of a crash (albeit minor) in similar circumstances,
2. offers hearsay of Tesla calling it a "known problem", and
3. says they're a Tesla fan but is "scared of the truth"
... panders to that side as much as it helps.
It would be interesting to see the footage. Precede it with a screen simply stating what you were doing and that there were no driver inputs, and add that no one was hurt and the damage was minor.
 
You have made what sounds to me like a very reasonable post and have already received a response saying "driver should be paying attention at all times and be prepared to take over ...", a clear suggestion that you were sloppy or negligent. I think that is quite unfair. I think you should be commended for letting us know about this "known problem".

If my post came across like that, it wasn't meant to and I apologise.

What I was trying to get at was that the onus is always on the driver to pay attention and be ready to take control if something unexpected happens. In my mind, that could be because:

1. Autosteer is operating correctly but misinterprets the layout of the road ahead and steers the car in an unexpected direction.
2. Autosteer develops a fault and, despite the road layout being optimal, steers the car in an unexpected direction.
3. Autosteer has been enabled on a road layout not recommended for its use and as a result steers the car in an unexpected direction.

I think most of us are clear that condition 3 is negligent and the driver shouldn't even have enabled Autosteer in that situation.

Let's assume at this point that condition 2 is the least likely, as I haven't yet read any reports of Autosteer actually failing in service and causing an accident.

So, that leaves 1, and I think there are going to be quite a few situations cropping up where Autosteer does something unexpected and, as a result, the driver feels they have been put into a dangerous situation - or could have been, if they hadn't manually corrected in time.

So for me, the real question is whether it is obvious to drivers currently using Autosteer what constitutes a road layout which might lead to the software getting confused. Because if it is obvious and the driver is paying attention, they will be able to anticipate the problem and take action before Autosteer gets it wrong.

If it isn't obvious and lots of these cases crop up, then something needs to be done about it; otherwise, the more Teslas are sold with AP, the more issues we're going to get, even if the software is continually being improved. I can't see a situation where one release of software has bugs and in the next release they're all gone. "Beta" means it's going to go wrong at some point. But if, by going wrong, it's putting drivers into dangerous situations they don't have time to react to, then that is not acceptable in my mind.

Anyway, I think the best thing to do is for Naonak to post the video, so we can all see what happened and judge for ourselves whether the sudden change in direction could have been anticipated and prevented somehow.
 