
A Public Letter to Mr. Musk and Tesla For The Sake Of All Tesla Drivers' Safety

What's so crazy about that? Drivers already have to look out for construction zones, faded markings, and much more. Autopilot reduces the amount of stuff you need to watch out for, it doesn't increase it.
Help me out here. Regarding AP, what Autosteer is doing for me is centering the car (absent the corner cases), and TACC is locked on to the car in front of me and controlling speed (again with the exception of edge cases). In addition to everything else, I have to think about disengaging AP when I see one of these situations coming up, which happens often because I can see further than AP or the camera can. When am I supposed to relax? Maybe this is a regional issue, but in my part of California we don't have perfect roads and conditions.
 
So as expected, the car showed the "take over immediately" notification, beeped at me continuously, and started to slow down. I jiggled the steering wheel to disable Auto Steer, and the warning went away. Then all of a sudden my Model X, which had slowed to about 60 mph, took off without warning, accelerating to 75 mph!

This ridiculous bit of illogic was because Traffic-Aware Cruise Control (TACC) was set to 75 mph, so when Auto Steer became disabled after it slowed the car, the Model X tried to quickly accelerate back to the TACC speed. Surprised the heck out of me, and I felt lucky there weren't any other cars around me as I tried to understand what was going on.
This was discussed very recently either earlier in this thread or in another thread. Tl;dr I agree this behavior is just wrong.
That said, the TACC radar, etc., is still functioning so it's not like it's going to blindly run you into the car in front of you.
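If I had to guess at the logic behind that surprise, it's something like the sketch below. To be clear, this is just my mental model written out in Python, with made-up names and numbers, not anything Tesla has published:

from dataclasses import dataclass

@dataclass
class CruiseState:
    tacc_engaged: bool = True        # TACC stays on when Autosteer drops out
    autosteer_engaged: bool = True
    set_speed_mph: float = 75.0      # the driver's TACC setpoint
    current_speed_mph: float = 60.0  # speed after the takeover warning bled it off

def target_speed(state: CruiseState) -> float:
    """Speed the car appears to aim for on each control cycle (my guess)."""
    if state.autosteer_engaged:
        # During a "take over immediately" event, Autosteer holds the reduced speed.
        return min(state.current_speed_mph, state.set_speed_mph)
    if state.tacc_engaged:
        # Autosteer gone but TACC still engaged: chase the original setpoint again,
        # which is the sudden 60 -> 75 mph acceleration described above.
        return state.set_speed_mph
    return state.current_speed_mph   # both off: the driver controls speed

state = CruiseState()
state.autosteer_engaged = False      # a wheel jiggle cancels Autosteer only
print(target_speed(state))           # 75.0

A gentler design would have TACC hold the reduced speed (or drop out too) when Autosteer cancels itself mid-warning, rather than resuming the setpoint.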

And why didn't the Model X apply emergency braking after the first collision?
Perhaps because there was no "collision". It sounds like he side-swiped the posts, rather than colliding with them.
I don't see how you can collide with something and continue at full speed. In a collision, the air bags would have deployed, which they did not.
 
On an S test drive in June the DS had me put AP on and we drove right through a construction zone with cones, curves, traffic, etc.
Looking back on it, I don't think I would use it in those situations myself. Of course, as a test driver I had my hands at 10 and 2 o'clock, nervous and waiting to take over :) The car did just fine, including stopping for traffic from 40 mph and then stop-and-go, albeit for only about 1/4 mile. I disagree that Tesla should disengage AP in these situations, as the car had it under control. Of course, I have about 2 minutes of AP experience and cannot wait to use it every day.
I had a test drive at the end of October last year and they wouldn't even allow me to use AP. In order to have it demonstrated, an approved driver had to demonstrate its features. The person I had an appointment with was new and wasn't too familiar with the car, so we had 3 people on this drive. As others can attest, when AP first came out it didn't do a very good job of staying in the middle of the lane, especially on curves. This was later remedied through updated firmware.
 
If I'm not using AP I'm never going to have to override it, but if AP might do something I would never do on my own then I take on
an additional burden of not only driving safely but also ensuring that the (other) nut "at the wheel" doesn't do anything dangerous.
If you are holding the steering wheel, then if Autopilot tries to steer in an undesired direction, you don't "correct" it; instead, you don't allow it. What happens is that you feel a torque (pull) on the steering wheel, which you resist (by holding your hands steady), and then Autosteer turns itself off. There is no additional burden. Personally, I find that I am frequently torquing the wheel slightly, resisting the Autopilot but not enough to shut it off. I can't be sure, but I think in these cases it actually alters the steering to be what I would do. You could argue that this is just like driving yourself, except that in the event you are briefly distracted, the Autopilot is there to help keep you safe.
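My mental model of the override, for what it's worth, is roughly the sketch below. The threshold number and the blending are assumptions for illustration only, not published Tesla parameters:

AUTOSTEER_RELEASE_TORQUE = 2.0  # assumed threshold, arbitrary units

def wheel_response(autosteer_torque: float, driver_torque: float):
    """Return (autosteer_stays_on, net_torque) for one control step (a guess)."""
    if abs(driver_torque) >= AUTOSTEER_RELEASE_TORQUE:
        # Firm, steady resistance: Autosteer gives up and the driver has the wheel.
        return False, driver_torque
    # Light resistance: the inputs effectively blend, which is why a slight
    # counter-torque can nudge the path without shutting Autosteer off.
    return True, autosteer_torque + driver_torque

print(wheel_response(1.0, -0.5))  # (True, 0.5)   gentle correction, AP stays on
print(wheel_response(1.0, -2.5))  # (False, -2.5) firm hold, Autosteer drops out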
 

I do this all the time: AP gets too close to the line, road edge, or curb and I correct, but not enough to disable AP, and the car seems to go in the direction I want.

Different people have different levels of risk tolerance. Why should I be restricted from using AP, just because someone else is afraid to, under those conditions? My risk tolerance is controlled by the level of my wife's screams.
 
If you are holding the steering wheel, then if Autopilot tries to steer in an undesired direction, you don't "correct" it; instead, you don't allow it.
That requires a considerably greater degree of "holding" than what the AP system requires to continue operating, so clearly they cannot
be expecting the typical user to hold to that degree. Maybe it's just because I'm a control freak, but I find if I have to "correct" something
that's trying to "help" me too often (read: hardly ever) I'd usually rather just do the thing myself. So I guess this all boils down to a matter
of taste and how much you're willing to assist your assistance before it becomes annoying. But when we're talking about safety the
question is really what expectations the system will reasonably create in any significant number of its users.
 
About 3 weeks ago, my P90DL did an almost identical manoeuvre to the one described in the OP... driving along (I was on a divided highway, median in the middle, 2 lanes going each direction, center lines and stripes on the sides, bright, sunny day around noon, etc.... it was perfect Autopilot conditions.)

A bright sunny day around noon is absolutely NOT perfect autopilot conditions. A bright day at noon can dramatically reduce contrast between painted lines and the roadway, particularly with concrete and white stripes, or old asphalt (which has faded toward white) and white stripes.

A cloudy day, nighttime, or evening with the sun behind you are all far superior conditions.
 
... Maybe it's just because I'm a control freak, but I find if I have to "correct" something
that's trying to "help" me too often (read: hardly ever) I'd usually rather just do the thing myself. So I guess this all boils down to a matter
of taste and how much you're willing to assist your assistance before it becomes annoying. But when we're talking about safety the question is really what expectations the system will reasonably create in any significant number of its users.
Well, certainly you should neither buy nor use AP. As for "expectations", traditional cruise control requires plenty of "correction" and user attention, but there doesn't seem to be any groundswell to ban it or demand that it do more than it does. For some reason people want to thrust absurd "expectations" on AP, and when it can't meet them, that's a problem. Imagine if those same people had comparable expectations for traditional cruise control! Even Tesla's TACC will accelerate toward a stopped vehicle in front of you and then do something akin to a panic stop when it gets close, even though it was obvious (to a normal driver) back when it was accelerating that it would need to stop, and it would be far more sensible to slow gradually in the first place. To be fair, there are some human drivers that like to drive like that.
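Just to put rough numbers on why the late braking feels like a panic stop, here's the basic constant-deceleration math; this is my own back-of-the-envelope, nothing to do with how TACC is actually tuned:

MPH_TO_MS = 0.44704

def required_decel(speed_mph: float, distance_m: float) -> float:
    """Constant deceleration (m/s^2) needed to stop within the given distance: v^2 / (2d)."""
    v = speed_mph * MPH_TO_MS
    return v ** 2 / (2 * distance_m)

print(round(required_decel(75, 200), 1))  # ~2.8 m/s^2: ease off early, barely noticeable
print(round(required_decel(75, 50), 1))   # ~11.2 m/s^2: harder than most cars can physically brake

A driver who spots the stopped car 200 m out can coast down gently; waiting until 50 m turns it into an emergency stop.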
 
If I'm not using AP I'm never going to have to override it, but if AP might do something I would never do on my own then I take on
an additional burden of not only driving safely but also ensuring that the (other) nut "at the wheel" doesn't do anything dangerous.

If you're alert and ready to take over, the car can't get very far off track or off speed before you intervene. I use AP as often as not in both my S P85D and my X P90D; I use it on divided highways, and when conditions warrant I also use it on city streets and two-lanes. But then, I'm a pilot and totally accustomed to engaging the automation as an assistant, while I remain 'pilot in command'.

It's no different in the Tesla: I know that there are situations that will require my intervention almost every time I drive in AP, that they will arise both predictably and unpredictably, and that so long as I'm right there ready to take over the car won't get far off track when it makes a mistake. I don't believe I've ever so much as crossed over a lane marking before taking over after AP does something untoward. On the other hand I see drivers around me cross lane markings, inadvertently, every day. With AP it really doesn't require much force to take over control of the steering, and no matter what the automation is doing the brake pedal always works normally and instantaneously disables both AP and TACC.

I love the capability Tesla has engineered into the AP and TACC systems, and I would be loath to give up any of it.
 
See previous reply. What, exactly, do you need to "watch out for" about faded markings? Speaking for myself, lane markings are only
one of many clues as to where to drive, and many roads I drive on frequently have little-to-no markings, so they're among the least
significant things in my driving environment.

It's not too uncommon for me to encounter markings so faded that it's not clear where cars are supposed to go. If you're on a little country road then whatever, no problem, but when it happens on a four-lane interstate it gets fun.

Anyway, scanning for faded lines is just part of scanning the road in general, which I'm always doing to look for debris, obstacles, lanes beginning or ending, stopped cars, moving cars, animals, plants, minerals, supernatural phenomena, and hallucinations. Keeping a look out ahead for features I know can cause trouble for Autopilot is fairly minor.

Help me out here. Regarding AP, what Autosteer is doing for me is centering the car (absent the corner cases), and TACC is locked on to the car in front of me and controlling speed (again with the exception of edge cases). In addition to everything else, I have to think about disengaging AP when I see one of these situations coming up, which happens often because I can see further than AP or the camera can. When am I supposed to relax? Maybe this is a regional issue, but in my part of California we don't have perfect roads and conditions.

If you're on roads full of construction and other trouble spots then you're not going to be able to relax very much. Sorry, it's just not made for that. On good roads with proper lanes and no funny business, you can kick back and take on a much more supervisory role. Around here (Northern Virginia) I routinely go 20-30 miles at a time without encountering anything that would cause trouble for Autopilot. I also routinely drive on roads where Autopilot will screw up every 20 seconds, and I don't use it on those. There are plenty of roads in between as well, where Autopilot is useful and takes some load off, but I have to stay much more vigilant because it's likely to encounter trouble.
 
Ok, I am going to go ahead and link to my video. I know people are going to dissect it and tell me what an idiot I am, that I shouldn't be using it in a construction zone (which was just a very brief stretch of roadway, and it was doing fine and I was keeping a watchful eye when the zone started), that the contact cone was moved over more (it wasn't, go back and review the other cones along the entire length), that it was all my fault, etc.... so: in before the mind-numbing BS of the apologists and conspiracy theorists.

That said, when just watching the video, it doesn't seem quite as sudden as it actually was. You can get a sense of this if you count the stripes and see how far it drifts over to the right in just 2.5 stripes at 75 mph. The video makes it look like a sedate little drift over, but consider that it moves about 4 - 6 feet to the right in the span of less than 100 feet. My hands were on the wheel (due to it being a construction zone) and I was ready to take over, and it still caught me by surprise; it was a very sudden dive. If anyone wants to complain that my reaction time is crappy, I'll be happy to post my 1/4 mile tickets from my last track day three weeks ago and you can judge my reaction-time abilities from there.
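For anyone who wants to check how quick that drift really was, here's the back-of-the-envelope math. I'm assuming the standard 10 ft stripe / 30 ft gap lane-line pattern; the rest comes straight from the video:

stripe_cycle_ft = 10 + 30            # one painted stripe plus one gap (typical US highway)
distance_ft = 2.5 * stripe_cycle_ft  # ~100 ft covered during the drift
speed_ftps = 75 * 5280 / 3600        # 75 mph = 110 ft/s

elapsed_s = distance_ft / speed_ftps # about 0.9 seconds
for drift_ft in (4, 6):
    print(f"{drift_ft} ft in {elapsed_s:.2f} s = {drift_ft / elapsed_s:.1f} ft/s sideways")
# Roughly 4.4 - 6.6 ft/s of lateral movement: a third to a half of a 12 ft lane every second.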

Part 1: This is the lead in to the cone contact - the Blackvue will separate the videos when there is a jarring impact, so it separated the actual contact into a different video, which is why it stops right before it hits the cone.

Part 2: This is the part where it actually hits the cone and several seconds before and after. Blackvue separated these.

I would have combined the videos, but I don't want anyone to think that I was trying to pull a fast one by splicing together two unrelated videos.

Again, just to be clear, I am not blaming Tesla or complaining about AP being unsafe or shitty, etc... It is what it is, and in the grand scheme of things this was a very minor incident. It could have been more serious, sure - but if it was something other than plastic traffic cones, I probably wouldn't have allowed AP to have control at all. But the fact remains, AP failed fairly spectacularly in this instance. There should have been ZERO reason to drift so far to the right so quickly. It's fortunate that the damage was minor and there were no injuries (except to poor Mr. T's right wing)... but what if there had been a car there or something? It would, at the very least, have scared the bejeebers out of that driver and possibly caused an accident, even without contact. This behavior is very similar to what the OP posted and is really the only reason I bring it up - to point out it can, does and has happened. Or at least part of what OP describes does.
 

Thanks for posting the video - I guess this just shows that AP does need a bit of a shoulder or something just in case it dives for something. I could see the OP catching a tire on the dirt in that scenario and immediately being pulled to the side and plowing into the guard posts. Even if he had his hands on the wheel, it would have been a difficult scenario to avoid.

I would hope that if there had been a car in the next lane and not cones on the lane-markers, that you would have been able to correct before going too far into the adjacent lane and hitting another vehicle.

I like the suggestion someone else made that the programming needs to be adjusted to "do nothing" if it gets confused about lines, rather than "diving" in either direction.
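That "do nothing when confused" idea could be as simple as the sketch below. The confidence threshold and the names are hypothetical; I have no idea how Autopilot actually arbitrates between conflicting lines:

LOW_CONFIDENCE = 0.5  # assumed cutoff, purely illustrative

def steering_command(lane_confidence: float, proposed_angle: float,
                     current_angle: float) -> tuple[float, bool]:
    """Return (commanded steering angle, alert_driver)."""
    if lane_confidence < LOW_CONFIDENCE:
        # Lines are ambiguous: hold the current angle and warn the driver,
        # rather than diving toward whichever marking the vision system latched onto.
        return current_angle, True
    return proposed_angle, False

print(steering_command(0.9, 2.0, 0.0))  # (2.0, False) confident, follow the lane
print(steering_command(0.3, 8.0, 0.0))  # (0.0, True)  confused, hold course and warn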
 
It's fortunate that the damage was minor and there were no injuries (except to poor Mr. T's right wing)... but what if there had been a car there or something? It would, at the very least, have scared the bejeebers out of that driver and possibly caused an accident, even without contact.

Thanks for posting the video. Just a thought though, wouldn't the system have more easily seen a vehicle in the adjacent lane? The traffic cones aren't small, but they aren't vehicle sized either.

Be curious to know what AP decided the cones were, especially as they were centered on the road.
 
I have had many too-close-to-call moments with the X's Autopilot. Sometimes putting your hands on the wheel will not resolve the problem, and human correction is needed. We are talking about human life here. Mr. Pang's life was on the line because of an incomplete product, a beta. I support Mr. Pang and his view. I respect human life. So dislike me all you want. After all, a life is a life. Don't play with life.
 
I like your Bjorn impersonation. But to get it right it's more like sheeeeeeeiiiiiit.

That was actually my son. I cut out the bits a little later where I was bitching about it, haha :)

Thanks for posting the video. Just a thought though, wouldn't the system have more easily seen a vehicle in the adjacent lane? The traffic cones aren't small, but they aren't vehicle sized either.

Be curious to know what AP decided the cones were, especially as they were centered on the road.

Yeah, me too. I would be happy with that if Tesla told me in detail what happened and why. I'd really just like to know what happened, or what the car thinks happened, and then I'd consider the matter closed on my part.