A Public Letter to Mr. Musk and Tesla For The Sake Of All Tesla Driver's Safety

But it doesn't shift to park if the seat belt is buckled, does it? Maybe Pang wasn't actually in the driver's seat?

I just tested and you are correct. If the seatbelt is buckled and you lift out of the seat then it won't go into park.

So perhaps Pang was sitting on top of his seatbelt in the driver's seat, then climbed out of the car via the passenger door, then came back and opened the door and put it into park? :) I don't see that Tesla argued against his getting out of the vehicle; instead, they explained creep mode. But they never explained how the driver could have witnessed creep mode in action from outside the vehicle.
 
Unless you can cite a case where someone was holding the wheel while on AutoPilot, it veered violently, and they were not able to regain control of the wheel, it's still the user's fault. People do stupid things; it's a fact of life. Is AutoPilot perfect? Far from it, but when used the way it was intended, the pros far outweigh the cons.
Even if your hands are on the wheel, you sometimes still need to force a necessary correction, which can itself be quite dangerous.
 
I'm in the camp that the driver is at fault, period. It wasn't the right road for it, and at night even more so. I do like, however, that he appeared to get 600 miles on one charge... I want that formula for my next FL to NC trip.

Having said that, AP is not a mature product, and if you are using it, please be vigilant. I've driven over 7K miles with it with great success, but there have been a few hiccups. Just yesterday, on the I-595/I-75 express lanes heading from Dania Beach, FL toward Naples, my X veered violently to the left toward the barrier and then violently to the right into the right lane before I took control. There were no cars near me, and I was in the left of two well-defined lanes.
 
Ok, I am going to go ahead and link to my video. I know people are going to dissect it and tell me what an idiot I am: that I shouldn't have been using it in a construction zone (which was just a very brief stretch of roadway; it was doing fine and I was keeping a watchful eye when the zone started), that the contact cone was moved over more than the others (it wasn't; go back and review the other cones along the entire length), that it was all my fault, etc... so: in before the mind-numbing BS of the apologists and conspiracy theorists.

That said, just watching the video, it doesn't seem quite as sudden as it actually was. You can get a sense of this if you count the stripes and see how far it drifts to the right in just 2.5 stripes @ 75 MPH. The video makes it look like a sedate little drift, but consider that it moves about 4-6 feet to the right in the span of less than 100'. My hands were on the wheel (because it was a construction zone) and I was ready to take over, and it still caught me by surprise; it was a very sudden dive. If anyone wants to complain about my reaction time being crappy, I'll be happy to post my 1/4-mile tickets from my last track day three weeks ago and you can judge my reaction-time abilities from there.
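
For anyone who wants to sanity-check that, here's the rough math, using my own estimates above (so treat the numbers as approximate):

```python
# Rough math using the figures above: ~4-6 ft of lateral drift
# over less than 100 ft of forward travel at 75 MPH.
speed_mph = 75
speed_fps = speed_mph * 5280 / 3600   # = 110 ft/s
distance_ft = 100                     # forward distance over which the drift happened
drift_low_ft, drift_high_ft = 4, 6    # estimated lateral movement

time_s = distance_ft / speed_fps      # ~0.91 s to cover 100 ft
print(f"Time to cover {distance_ft} ft at {speed_mph} MPH: {time_s:.2f} s")
print(f"Implied lateral rate: {drift_low_ft / time_s:.1f}-{drift_high_ft / time_s:.1f} ft/s")
```

In other words, the whole move happened in well under a second, which is why it felt far more abrupt in the car than it looks on video.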

Part 1: This is the lead in to the cone contact - the Blackvue will separate the videos when there is a jarring impact, so it separated the actual contact into a different video, which is why it stops right before it hits the cone.

Part 2: This is the part where it actually hits the cone and several seconds before and after. Blackvue separated these.

I would have combined the videos, but I don't want anyone to think that I was trying to pull a fast one and combining two unrelated videos.

Again, just to be clear, I am not blaming Tesla or complaining about AP being unsafe or shitty, etc... It is what it is, and in the grand scheme of things this was a very minor incident. It could have been more serious, sure - but if it had been something other than plastic traffic cones, I probably wouldn't have allowed AP to have control at all. But the fact remains: AP failed fairly spectacularly in this instance. There was ZERO reason to drift so far to the right so quickly. It's fortunate that the damage was minor and there were no injuries (except to poor Mr. T's right wing)... but what if there had been a car there or something? It would, at the very least, have scared the bejeebers out of that driver and possibly caused an accident, even without contact. This behavior is very similar to what the OP posted and is really the only reason I bring it up - to point out that it can, does, and has happened. Or at least part of what the OP describes does.
I refrained from commenting after your brief description, but having seen the video, I must say that a short while into it, I was mentally trying to slow the car down as the cones started getting closer and closer and things started looking iffy. Personally, I wouldn't be driving that fast through a construction zone, especially with how narrow the lane becomes given how the cones were laid out.

As others noted, the cones were intruding into your lane (and by the end pretty much all of them were). It is unclear if you hit the cone because of the change in road surface (as in, it physically unsettled the car and caused it to drift a bit) or if it was Autopilot that got confused by the change in road surface. I would have to get home and look at it in slow motion or by frame-stepping, but it also looks like the car moved over more after the impact (so perhaps most of the move came from that).
 
Even if your hands are on the wheel, you sometimes still need to force a necessary correction, which can itself be quite dangerous.
More or less dangerous than someone texting while driving, or playing Pokémon Go while driving? I've never felt, when AutoPilot has plotted a course to my demise, that the course correction was dangerous. Maybe I'm in the minority. Then again, I drive with my hand on the wheel but near my knee. At times AP nags me to keep my hands on the wheel even though I am still lightly touching it.
 
I just tested and you are correct. If the seatbelt is buckled and you lift out of the seat then it won't go into park.

So perhaps Pang was sitting on top of his seatbelt in the driver's seat, then climbed out of the car via the passenger door, then came back and opened the door and put it into park? :) I don't see that Tesla argued against his getting out of the vehicle; instead, they explained creep mode. But they never explained how the driver could have witnessed creep mode in action from outside the vehicle.
I realize this is a different thing, but you don't have to be in the car or in your seat for the motor to be on and the car to move; Summon, for example.
 
Thanks for posting the video. I watched it a couple of times. I don't think the cause of the drift was the barrier. There is a stretch of road before the impact where the paint on both sides of the lane disappears for one section. That caused a gradual drift of the vehicle over 4 or 5 lane markers before it hit the cone, which had its base inside the lane. From my experience, the AP camera recognizes lanes by painted dashed lines. On I-15 outside of Las Vegas there are miles of highway with lane markers but no paint. Although the lanes are perfectly visible by eye, the camera can't recognize them, so AP is disabled. I hope Tesla can improve their lane-recognition algorithms. In the meantime, when you use AP, leave some margin for error.
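
Purely as a sketch of the behavior I'm describing (not Tesla's actual implementation; the names and thresholds here are invented for illustration), the engagement logic presumably amounts to something like this:

```python
from dataclasses import dataclass

@dataclass
class LaneDetection:
    """Hypothetical per-frame output of a lane-marking detector."""
    left_confidence: float   # 0.0-1.0, confidence that the left marking was found
    right_confidence: float  # 0.0-1.0, confidence that the right marking was found

MIN_CONFIDENCE = 0.6  # invented threshold, purely for illustration

def autosteer_available(detection: LaneDetection) -> bool:
    """If the camera can't see painted markings on both sides (reflectors only,
    worn paint, a repaved section), autosteer is simply not offered."""
    return (detection.left_confidence >= MIN_CONFIDENCE
            and detection.right_confidence >= MIN_CONFIDENCE)

# Clearly painted lane vs. the unpainted I-15 stretch described above:
print(autosteer_available(LaneDetection(0.9, 0.85)))  # True
print(autosteer_available(LaneDetection(0.2, 0.9)))   # False
```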
 
Just a stray thought: the Autopilot AI apparently has to make many decisions across different situations, and in general, the problem of correctly choosing an action in a new, previously unknown situation is quite hard.

Recently, though, an approach was developed for combat-aircraft AI:
http://www.omicsgroup.org/journals/...ted-air-combat-missions-2167-0374-1000144.pdf

It seems well suited to the Autopilot task of figuring out which action is 'more correct' in a given situation. Currently the Autopilot AI looks a bit 'scripted', and so not flexible enough to distinguish between situations (the move to the right in the cone-accident video is presumably a rule 'trained' to solve one particular problem, but applied by the AI in a situation with a different context). To distinguish situations, a fast decision-making mechanism is needed, and the fuzzy-tree approach in the paper might be useful for Tesla's Autopilot.
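
For what it's worth, here's a toy sketch of the kind of fuzzy rule blending the paper describes; this is not the paper's actual algorithm and certainly not Tesla's code, and the memberships, distances, and rules below are made up purely to illustrate firing rules by degree instead of running one hard-coded script:

```python
def membership(x, lo, hi):
    """Simple ramp membership: 0 below lo, 1 above hi, linear in between."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def fuzzy_steer(obstacle_left_ft, obstacle_right_ft, lane_confidence):
    """Toy fuzzy blend of two competing rules:
       - 'obstacle close on the left'  -> steer right
       - 'obstacle close on the right' -> steer left
       Each rule fires by degree, scaled down when lane confidence is low,
       instead of one rule taking over completely."""
    left_close = membership(10 - obstacle_left_ft, 0, 10)    # 1.0 when very close
    right_close = membership(10 - obstacle_right_ft, 0, 10)
    raw = left_close - right_close        # positive = steer right, negative = steer left
    return raw * lane_confidence          # trust the correction less when lanes are unclear

# A cone encroaching from the right, with good lane visibility, nudges the car left
# rather than triggering a full hard-coded avoidance maneuver.
print(fuzzy_steer(obstacle_left_ft=12, obstacle_right_ft=3, lane_confidence=0.9))
```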
 
I disagree. You're making an assumption about the implementation - that sudden AP disengagement would occur. I see at the end of your post that you suggest differently, but why wouldn't that be the original implementation?

Well, I did point out (in the text you snipped) that their initial implementation of autosteer speed restriction (their only geofencing-like restriction thus far) was too simplistic and got it wrong. In multiple ways, actually; I just pointed out the one that was most germane.

As an engineer, my focus is always to anticipate as many problems as I can, and to be prepared to reject the first several solutions that come to mind as appealing-but-wrong once I think about them for a while. As Einstein said, "as simple as possible, but no simpler". I do understand that many people practice engineering -- very well -- with a different, more optimistic point of view and approach. ;-) But my suspicion is that after not getting the speed restrictions quite right (by taking too simplistic an approach), the highly competent and well-intentioned folks at Tesla may be proceeding my way; thus the lack of sudden changes to algorithms or behavior, even "obvious" ones (including "obvious" ones I'm personally advocating for).

Here's a simpler way to start - don't allow autosteer to be engaged on prohibited roadways. That's what I was suggesting anyway, and it would have kept the original issue in this thread from occurring.
I like that suggestion. I have a nagging, not-fully-formed worry about parallel service roads (there are a lot of those around here) but I can't quite put my finger on why.

Let me hazard a guess that someone at Tesla has spent more than a few minutes thinking this through. I agree that the time for action is just about now, and I hope they have a good plan.
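
Just to make the "don't allow engagement on prohibited roadways" suggestion above concrete, here is a minimal sketch of such a gate. It is entirely hypothetical (the road classes, threshold, and function are invented), and the parallel-service-road worry is exactly where map matching like this tends to go wrong:

```python
from enum import Enum, auto

class RoadClass(Enum):
    """Hypothetical road classification from map data."""
    DIVIDED_HIGHWAY = auto()
    UNDIVIDED_HIGHWAY = auto()
    SERVICE_ROAD = auto()
    RESIDENTIAL = auto()

# Invented policy: only allow engagement on the road types the manual intends.
AUTOSTEER_ALLOWED = {RoadClass.DIVIDED_HIGHWAY}

def may_engage_autosteer(road_class: RoadClass, map_confidence: float) -> bool:
    """Refuse to engage (rather than disengage later) on prohibited roadways.
    If the map match itself is uncertain, e.g. a service road running parallel
    to the highway, err on the side of not engaging."""
    return road_class in AUTOSTEER_ALLOWED and map_confidence >= 0.8

print(may_engage_autosteer(RoadClass.DIVIDED_HIGHWAY, 0.95))  # True
print(may_engage_autosteer(RoadClass.SERVICE_ROAD, 0.95))     # False
print(may_engage_autosteer(RoadClass.DIVIDED_HIGHWAY, 0.5))   # False (ambiguous map match)
```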
 
I hope Naonak doesn't mind, but I took his files, joined them, and added some markers to help analyse what happened; I thought it might be useful to everyone in seeing what might have happened (slo-mo of the critical part, too).

So that one slab of road that was repaved: did it appear to dip the car once the front tires passed over it, or was that the point where it veered to the right and caused the dip?
 
From my experience, the AP camera recognizes lanes by painted dashed lines. On I-15 outside of Las Vegas there are miles of highway with lane markers but no paint. Although the lanes are perfectly visible by eye, the camera can't recognize them, so AP is disabled.

Oooh! That! Yeah, I've seen that. There's a section of beautiful, clearly marked new road I can think of where the lane markers (so far) are reflectors only - no painted lines. Autosteer definitely has trouble there, and it's very surprising.

I think it is getting to be high time I updated that list of situations where AP has trouble. It is out of date both because Tesla has made improvements, and because we've collectively learned more about its limitations.
 
"Following the crash, and once the vehicle had come to rest, the passenger door was opened but the driver door remained closed and the key remained in the vehicle. Since the vehicle had been left in Drive with Creep Mode enabled, the motor continued to rotate. The diagnostic data shows that the driver door was later opened from the outside and the vehicle was shifted to park."

I found the above statement by Tesla curious, or maybe I am missing something.
It states that the driver door was never opened from the inside, only from the outside, and the car was then put into park. Mr. Pang's letter states that THEY ran away from the car and returned to put the car in park.
The fact that the driver door was never opened from the inside is curious. So did the driver ever get out of the car?
It seems odd that Tesla would not mention the driver door being opened from the inside, given the other information they provided.
Also, doesn't the car put itself in park when it detects that the driver gets off the seat? If so, creep would have turned off, no?
I have a 3-year-old Tesla without AP, so some of the newer features are unknown to me.

Excellent point. The car would have shifted to park when the driver left the seat. More likely the driver remained in the seat until the passenger went around to the driver's side and opened the door, and then someone shifted the car to park. Was the driver passed out or otherwise unconscious? But no injury? Hmmm.
 
Conclusion: Auto-pilot is not up to the task of avoiding traffic cones, in the driving lane, at 75 MPH. That doesn't appear to be within the specs as they currently exist, and it shouldn't be asked to do so.

The shift to the right is sudden and coincides with a change in surface; my opinion is that it was caused by that change in surface, not by a decision by autosteer to move to the right (low confidence in that conclusion).

I agree on both counts. In construction zones steer yourself and slow down.
 
Excellent point. The car would have shifted to park when the driver left the seat. More likely the driver remained in the seat until the passenger went around to the driver's side and opened the door, and then someone shifted the car to park. Was the driver passed out or otherwise unconscious? But no injury? Hmmm.
Someone speculated that Tesla's latest algorithm might not do the auto-park unless the seat sensor is empty AND the driver's door is open. That would address the issue where, during parking, shifting your weight on the seat (for example, to see behind you) keeps putting the car into park. Anyone want to test this theory (in an X with the latest update)?
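
Spelling out that speculation (plus the seatbelt observation from earlier in the thread) as a hedged sketch; this is guesswork about the behavior, not Tesla's actual code, and every name below is invented:

```python
def should_auto_shift_to_park(seat_occupied: bool,
                              seatbelt_buckled: bool,
                              driver_door_open: bool) -> bool:
    """Hedged sketch of the auto-park behavior discussed in this thread.
    Observed earlier: with the seatbelt buckled, lifting out of the seat
    does NOT shift to park. Speculated here: the shift may also require
    the driver's door to be open, so weight shifts alone don't trigger it."""
    if seatbelt_buckled:
        return False          # buckled belt suppresses the auto-shift (per the test above)
    if seat_occupied:
        return False          # driver still in the seat
    return driver_door_open   # the speculated extra condition from this post

# The scenario debated in the thread: driver out of the seat, belt unbuckled,
# but the driver's door never opened from the inside -> car stays in Drive/creep.
print(should_auto_shift_to_park(seat_occupied=False,
                                seatbelt_buckled=False,
                                driver_door_open=False))  # False
```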
 
Excellent point. The car would have shifted to park when the driver left the seat. More likely the driver remained in the seat until the passenger went around to the driver's side and opened the door, and then someone shifted the car to park. Was the driver passed out or otherwise unconscious? But no injury? Hmmm.

The car beeps if you open a door when it's not in Park (for example, if it's in Vehicle Hold, if you have your foot on the brake to keep it stopped, or if it's actually moving). I don't know what exactly is wrong with the benign, generous assumption that the driver was sitting in his seat, very shaken up, and then he or someone else opened the door; then he heard the beeps and shifted into Park.

I'm a little disturbed by the amount of scorn and innuendo heaped on Pang throughout this. Do I think he made a number of mistakes and poor choices, ultimately causing this crash? I think that is probably the case, yes. But I also think of how I would have felt if I had ended up sitting by the side of the road next to the wreck of my beautiful new car like he did, and I find his letter and his point of view understandable. I think you're implying he was drunk. But I see no actual evidence for that, and I for one have driven around Montana very late at night, too fast, quite a few times in my life without being drunk. Ever run your car off the road? I have, and I did in fact sit there like a statue for at least a good solid minute wondering what the heck just happened.

I am also very, very aware of the vast difference between what we rationally decide we should do if some emergency happens in the future, and what we're able to actually do once the emergency happens.

So when I see people dumping on Pang for not instantly braking the vehicle to a halt as soon as the first impact occurred (and before swiping any more posts), I can only say, "You know what? Go take a dirt-track day in a beater car, let the passenger steer you into some bales, and see whether you can really do what you think he should have done." Because I bet you can't. Most people can't summon the right reaction in an unexpected crisis. You're not a fighter pilot or a race-car driver (apologies to any pilots or F1 drivers reading this), and you probably couldn't do what they do even if you trained like they did -- which neither you nor Pang has. This is in fact one of the great promises of automation -- once it gets there. Which today it's not.

My basic take on this is that the guy probably made a number of bad choices which led to the incident, but that once it started, he didn't really do any better or worse than most of us would have. And speculation about why he was on that road at that time of night is just that -- speculation, and probably wrong besides. Certainly it's unfair and unhelpful, and it is the kind of thing that alienates people who might otherwise be on our side of questions about Autopilot (where by "our" I mean "reasonable, responsible Tesla owners who want to keep using the automation features of their cars in a largely unimpeded way").

Again, put yourself in his shoes. How would you feel if you'd just wrecked your new car? Please, let's have a constructive discussion about what we can learn from this incident but keep that in mind.
 
I'm not saying this in any way to try to call Naonak out, but is there any absolute confirmation that autosteer was engaged? The vehicle speed accelerates from 73 to 75 during the short clip, which is certainly possible but seems odd. I don't have AP, so I'm not that familiar with the audible sounds, but I believe the sound heard in the clip is from the camera's G-sensor.